Apr 16 16:01:11.903244 ip-10-0-131-24 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 16:01:12.430912 ip-10-0-131-24 kubenswrapper[2584]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:01:12.430912 ip-10-0-131-24 kubenswrapper[2584]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 16:01:12.430912 ip-10-0-131-24 kubenswrapper[2584]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:01:12.430912 ip-10-0-131-24 kubenswrapper[2584]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 16:01:12.430912 ip-10-0-131-24 kubenswrapper[2584]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 16:01:12.433144 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.433065    2584 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 16:01:12.436182 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436169    2584 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:01:12.436182 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436183    2584 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:01:12.436240 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436187    2584 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:01:12.436240 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436190    2584 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:01:12.436240 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436193    2584 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:01:12.436240 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436196    2584 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:01:12.436240 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436198    2584 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:01:12.436240 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436201    2584 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:01:12.436240 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436203    2584 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:01:12.436240 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436207    2584 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:01:12.436240 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436210    2584 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:01:12.436240 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436213    2584 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:01:12.436240 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436215    2584 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:01:12.436240 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436218    2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:01:12.436240 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436220    2584 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:01:12.436240 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436227    2584 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:01:12.436240 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436229    2584 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:01:12.436240 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436232    2584 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:01:12.436240 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436234    2584 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:01:12.436240 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436237    2584 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:01:12.436240 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436240    2584 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:01:12.436240 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436243    2584 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:01:12.436723 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436246    2584 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:01:12.436723 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436249    2584 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:01:12.436723 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436251    2584 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:01:12.436723 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436254    2584 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:01:12.436723 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436257    2584 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:01:12.436723 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436259    2584 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:01:12.436723 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436262    2584 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:01:12.436723 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436264    2584 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:01:12.436723 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436267    2584 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:01:12.436723 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436270    2584 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:01:12.436723 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436272    2584 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:01:12.436723 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436274    2584 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:01:12.436723 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436277    2584 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:01:12.436723 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436279    2584 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:01:12.436723 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436282    2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:01:12.436723 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436284    2584 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:01:12.436723 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436287    2584 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:01:12.436723 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436289    2584 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:01:12.436723 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436291    2584 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:01:12.436723 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436294    2584 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:01:12.437266 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436296    2584 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:01:12.437266 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436300    2584 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:01:12.437266 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436304    2584 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:01:12.437266 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436307    2584 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:01:12.437266 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436310    2584 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:01:12.437266 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436312    2584 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:01:12.437266 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436315    2584 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:01:12.437266 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436317    2584 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:01:12.437266 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436320    2584 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:01:12.437266 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436322    2584 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:01:12.437266 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436325    2584 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:01:12.437266 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436327    2584 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:01:12.437266 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436330    2584 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:01:12.437266 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436334    2584 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:01:12.437266 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436336    2584 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:01:12.437266 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436339    2584 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:01:12.437266 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436341    2584 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:01:12.437266 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436344    2584 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:01:12.437266 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.436347    2584 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:01:12.437727 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437708    2584 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:01:12.437727 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437714    2584 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:01:12.437727 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437719    2584 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:01:12.437727 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437722    2584 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:01:12.437727 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437725    2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:01:12.437842 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437730    2584 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:01:12.437842 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437734    2584 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:01:12.437842 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437737    2584 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:01:12.437842 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437739    2584 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:01:12.437842 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437742    2584 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:01:12.437842 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437745    2584 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:01:12.437842 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437747    2584 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:01:12.437842 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437750    2584 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:01:12.437842 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437752    2584 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:01:12.437842 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437755    2584 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:01:12.437842 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437757    2584 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:01:12.437842 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437760    2584 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:01:12.437842 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437763    2584 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:01:12.437842 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437765    2584 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:01:12.437842 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437767    2584 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:01:12.437842 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437770    2584 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:01:12.437842 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437772    2584 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:01:12.437842 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437775    2584 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:01:12.437842 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437777    2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:01:12.438305 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.437780    2584 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:01:12.438305 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438154    2584 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:01:12.438305 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438159    2584 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:01:12.438305 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438162    2584 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:01:12.438305 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438165    2584 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:01:12.438305 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438167    2584 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:01:12.438305 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438170    2584 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:01:12.438305 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438172    2584 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:01:12.438305 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438174    2584 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:01:12.438305 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438179    2584 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:01:12.438305 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438182    2584 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:01:12.438305 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438185    2584 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:01:12.438305 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438188    2584 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:01:12.438305 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438192    2584 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:01:12.438305 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438194    2584 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:01:12.438305 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438197    2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:01:12.438305 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438200    2584 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:01:12.438305 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438202    2584 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:01:12.438305 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438205    2584 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:01:12.438854 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438207    2584 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:01:12.438854 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438210    2584 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:01:12.438854 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438212    2584 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:01:12.438854 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438214    2584 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:01:12.438854 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438216    2584 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:01:12.438854 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438219    2584 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:01:12.438854 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438221    2584 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:01:12.438854 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438224    2584 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:01:12.438854 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438226    2584 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:01:12.438854 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438229    2584 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:01:12.438854 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438231    2584 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:01:12.438854 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438234    2584 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:01:12.438854 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438236    2584 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:01:12.438854 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438239    2584 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:01:12.438854 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438243    2584 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:01:12.438854 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438246    2584 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:01:12.438854 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438248    2584 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:01:12.438854 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438251    2584 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:01:12.438854 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438254    2584 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:01:12.438854 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438256    2584 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:01:12.439407 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438259    2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:01:12.439407 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438261    2584 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:01:12.439407 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438264    2584 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:01:12.439407 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438266    2584 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:01:12.439407 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438269    2584 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:01:12.439407 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438272    2584 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:01:12.439407 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438274    2584 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:01:12.439407 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438277    2584 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:01:12.439407 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438280    2584 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:01:12.439407 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438282    2584 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:01:12.439407 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438284    2584 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:01:12.439407 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438287    2584 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:01:12.439407 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438289    2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:01:12.439407 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438292    2584 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:01:12.439407 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438294    2584 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:01:12.439407 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438297    2584 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:01:12.439407 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438299    2584 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:01:12.439407 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438302    2584 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:01:12.439407 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438304    2584 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:01:12.439407 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438306    2584 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:01:12.439883 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438309    2584 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:01:12.439883 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438312    2584 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:01:12.439883 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438314    2584 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:01:12.439883 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438318    2584 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:01:12.439883 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438322    2584 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:01:12.439883 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438325    2584 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:01:12.439883 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438327    2584 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:01:12.439883 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438330    2584 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:01:12.439883 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438333    2584 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:01:12.439883 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438335    2584 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:01:12.439883 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438337    2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:01:12.439883 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438340    2584 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:01:12.439883 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438342    2584 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:01:12.439883 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438344    2584 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:01:12.439883 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438347    2584 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:01:12.439883 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438349    2584 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:01:12.439883 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438352    2584 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:01:12.439883 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438354    2584 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:01:12.439883 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438356    2584 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:01:12.439883 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438359    2584 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:01:12.440385 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438362    2584 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:01:12.440385 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438364    2584 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:01:12.440385 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438366    2584 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:01:12.440385 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438369    2584 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:01:12.440385 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438371    2584 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:01:12.440385 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438374    2584 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:01:12.440385 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438376    2584 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:01:12.440385 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.438379    2584 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:01:12.440385 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439770    2584 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 16:01:12.440385 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439781    2584 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 16:01:12.440385 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439787    2584 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 16:01:12.440385 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439791    2584 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 16:01:12.440385 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439796    2584 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 16:01:12.440385 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439799    2584 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 16:01:12.440385 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439803    2584 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 16:01:12.440385 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439807    2584 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 16:01:12.440385 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439812    2584 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 16:01:12.440385 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439815    2584 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 16:01:12.440385 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439819    2584 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 16:01:12.440385 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439822    2584 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 16:01:12.440385 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439825    2584 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 16:01:12.440385 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439828    2584 flags.go:64] FLAG: --cgroup-root=""
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439831    2584 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439834    2584 flags.go:64] FLAG: --client-ca-file=""
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439837    2584 flags.go:64] FLAG: --cloud-config=""
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439839    2584 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439842    2584 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439847    2584 flags.go:64] FLAG: --cluster-domain=""
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439850    2584 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439853    2584 flags.go:64] FLAG: --config-dir=""
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439856    2584 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439859    2584 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439863    2584 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439866    2584 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439869    2584 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439873    2584 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439876    2584 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439879    2584 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439882    2584 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439886    2584 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439889    2584 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439894    2584 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439897    2584 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439900    2584 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439903    2584 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439906    2584 flags.go:64] FLAG: --enable-server="true"
Apr 16 16:01:12.440921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439909    2584 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439913    2584 flags.go:64] FLAG: --event-burst="100"
Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439916    2584 flags.go:64] FLAG: --event-qps="50"
Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439919    2584 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439922    2584 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439925    2584 flags.go:64] FLAG: --eviction-hard=""
Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439928    2584 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439931    2584 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439934    2584 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439937    2584 flags.go:64] FLAG: --eviction-soft=""
Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439939    2584 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439942    2584 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439945    2584 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439948    2584 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439963    2584 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439966    2584 flags.go:64] FLAG: --fail-swap-on="true"
Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439969    2584 flags.go:64] FLAG: --feature-gates=""
Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439972    2584 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439975    2584 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439979    2584 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439982    2584 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439985    2584
flags.go:64] FLAG: --healthz-port="10248" Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439988 2584 flags.go:64] FLAG: --help="false" Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439991 2584 flags.go:64] FLAG: --hostname-override="ip-10-0-131-24.ec2.internal" Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439994 2584 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 16:01:12.441536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.439997 2584 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 16:01:12.442136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440000 2584 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 16:01:12.442136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440004 2584 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 16:01:12.442136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440007 2584 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 16:01:12.442136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440010 2584 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 16:01:12.442136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440013 2584 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 16:01:12.442136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440016 2584 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 16:01:12.442136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440018 2584 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 16:01:12.442136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440021 2584 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 16:01:12.442136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440024 2584 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 16:01:12.442136 ip-10-0-131-24 
kubenswrapper[2584]: I0416 16:01:12.440027 2584 flags.go:64] FLAG: --kube-reserved="" Apr 16 16:01:12.442136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440030 2584 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 16:01:12.442136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440032 2584 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 16:01:12.442136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440036 2584 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 16:01:12.442136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440038 2584 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 16:01:12.442136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440041 2584 flags.go:64] FLAG: --lock-file="" Apr 16 16:01:12.442136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440044 2584 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 16:01:12.442136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440047 2584 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 16:01:12.442136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440049 2584 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 16:01:12.442136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440054 2584 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 16:01:12.442136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440058 2584 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 16:01:12.442136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440061 2584 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 16:01:12.442136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440064 2584 flags.go:64] FLAG: --logging-format="text" Apr 16 16:01:12.442136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440066 2584 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440069 2584 flags.go:64] FLAG: 
--make-iptables-util-chains="true" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440072 2584 flags.go:64] FLAG: --manifest-url="" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440075 2584 flags.go:64] FLAG: --manifest-url-header="" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440079 2584 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440083 2584 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440087 2584 flags.go:64] FLAG: --max-pods="110" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440090 2584 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440093 2584 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440096 2584 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440099 2584 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440102 2584 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440105 2584 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440108 2584 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440115 2584 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440118 2584 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 16:01:12.442685 
ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440121 2584 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440124 2584 flags.go:64] FLAG: --pod-cidr="" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440127 2584 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440132 2584 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440135 2584 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440138 2584 flags.go:64] FLAG: --pods-per-core="0" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440141 2584 flags.go:64] FLAG: --port="10250" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440144 2584 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 16:01:12.442685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440147 2584 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0b37d429f0ce93112" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440150 2584 flags.go:64] FLAG: --qos-reserved="" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440153 2584 flags.go:64] FLAG: --read-only-port="10255" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440156 2584 flags.go:64] FLAG: --register-node="true" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440158 2584 flags.go:64] FLAG: --register-schedulable="true" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440161 2584 flags.go:64] FLAG: --register-with-taints="" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 
16:01:12.440165 2584 flags.go:64] FLAG: --registry-burst="10" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440167 2584 flags.go:64] FLAG: --registry-qps="5" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440170 2584 flags.go:64] FLAG: --reserved-cpus="" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440173 2584 flags.go:64] FLAG: --reserved-memory="" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440177 2584 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440180 2584 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440183 2584 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440185 2584 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440189 2584 flags.go:64] FLAG: --runonce="false" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440192 2584 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440195 2584 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440197 2584 flags.go:64] FLAG: --seccomp-default="false" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440200 2584 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440203 2584 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440205 2584 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 16:01:12.443275 ip-10-0-131-24 
kubenswrapper[2584]: I0416 16:01:12.440208 2584 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440212 2584 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440214 2584 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440217 2584 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440220 2584 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 16:01:12.443275 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440222 2584 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 16:01:12.443880 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440225 2584 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 16:01:12.443880 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440228 2584 flags.go:64] FLAG: --system-cgroups="" Apr 16 16:01:12.443880 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440231 2584 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 16:01:12.443880 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440236 2584 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 16:01:12.443880 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440239 2584 flags.go:64] FLAG: --tls-cert-file="" Apr 16 16:01:12.443880 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440242 2584 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 16:01:12.443880 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440246 2584 flags.go:64] FLAG: --tls-min-version="" Apr 16 16:01:12.443880 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440249 2584 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 16:01:12.443880 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440252 2584 flags.go:64] FLAG: 
--topology-manager-policy="none" Apr 16 16:01:12.443880 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440255 2584 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 16:01:12.443880 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440258 2584 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 16:01:12.443880 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440261 2584 flags.go:64] FLAG: --v="2" Apr 16 16:01:12.443880 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440265 2584 flags.go:64] FLAG: --version="false" Apr 16 16:01:12.443880 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440268 2584 flags.go:64] FLAG: --vmodule="" Apr 16 16:01:12.443880 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440272 2584 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 16:01:12.443880 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440275 2584 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 16:01:12.443880 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440383 2584 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 16:01:12.443880 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440387 2584 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 16:01:12.443880 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440390 2584 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 16:01:12.443880 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440393 2584 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 16:01:12.443880 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440400 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 16:01:12.443880 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440403 2584 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 16:01:12.443880 ip-10-0-131-24 
kubenswrapper[2584]: W0416 16:01:12.440407 2584 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 16:01:12.444434 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440411 2584 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 16:01:12.444434 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440416 2584 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 16:01:12.444434 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440418 2584 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 16:01:12.444434 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440421 2584 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 16:01:12.444434 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440424 2584 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 16:01:12.444434 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440426 2584 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 16:01:12.444434 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440430 2584 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 16:01:12.444434 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440434 2584 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 16:01:12.444434 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440437 2584 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 16:01:12.444434 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440439 2584 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 16:01:12.444434 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440442 2584 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 16:01:12.444434 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440444 2584 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 16:01:12.444434 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440447 2584 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 16:01:12.444434 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440449 2584 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 16:01:12.444434 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440459 2584 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 16:01:12.444434 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440463 2584 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 16:01:12.444434 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440466 2584 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 16:01:12.444434 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440468 2584 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 16:01:12.444434 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440470 2584 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 16:01:12.444940 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440473 2584 feature_gate.go:328] 
unrecognized feature gate: GatewayAPIController Apr 16 16:01:12.444940 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440476 2584 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 16:01:12.444940 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440479 2584 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 16:01:12.444940 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440481 2584 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 16:01:12.444940 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440483 2584 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 16:01:12.444940 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440486 2584 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 16:01:12.444940 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440489 2584 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 16:01:12.444940 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440491 2584 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 16:01:12.444940 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440494 2584 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 16:01:12.444940 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440496 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 16:01:12.444940 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440498 2584 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 16:01:12.444940 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440502 2584 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 16:01:12.444940 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440505 2584 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 16:01:12.444940 ip-10-0-131-24 kubenswrapper[2584]: W0416 
16:01:12.440507 2584 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 16:01:12.444940 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440511 2584 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 16:01:12.444940 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440514 2584 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 16:01:12.444940 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440516 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 16:01:12.444940 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440519 2584 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 16:01:12.444940 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440521 2584 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 16:01:12.444940 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440524 2584 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 16:01:12.445445 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440526 2584 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 16:01:12.445445 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440528 2584 feature_gate.go:328] unrecognized feature gate: Example Apr 16 16:01:12.445445 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440531 2584 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 16:01:12.445445 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440533 2584 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 16:01:12.445445 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440536 2584 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 16:01:12.445445 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440539 2584 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 16:01:12.445445 ip-10-0-131-24 kubenswrapper[2584]: W0416 
16:01:12.440541 2584 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 16:01:12.445445 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440544 2584 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 16:01:12.445445 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440547 2584 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 16:01:12.445445 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440549 2584 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 16:01:12.445445 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440552 2584 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 16:01:12.445445 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440555 2584 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 16:01:12.445445 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440557 2584 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 16:01:12.445445 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440560 2584 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 16:01:12.445445 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440562 2584 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 16:01:12.445445 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440565 2584 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 16:01:12.445445 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440567 2584 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 16:01:12.445445 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440570 2584 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 16:01:12.445445 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440573 2584 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 
16 16:01:12.445445 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440575 2584 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 16:01:12.445925 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440578 2584 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 16:01:12.445925 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440580 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 16:01:12.445925 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440582 2584 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 16:01:12.445925 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440585 2584 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 16:01:12.445925 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440587 2584 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 16:01:12.445925 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440590 2584 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 16:01:12.445925 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440593 2584 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 16:01:12.445925 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440596 2584 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 16:01:12.445925 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440598 2584 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 16:01:12.445925 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440601 2584 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 16:01:12.445925 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440603 2584 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 16:01:12.445925 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440606 2584 feature_gate.go:328] 
unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:01:12.445925 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440608 2584 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:01:12.445925 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440611 2584 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:01:12.445925 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440613 2584 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:01:12.445925 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440620 2584 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:01:12.445925 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440622 2584 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:01:12.445925 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440625 2584 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:01:12.445925 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440627 2584 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:01:12.445925 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.440630 2584 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:01:12.446456 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.440635 2584 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:01:12.447712 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.447689 2584 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 16:01:12.447712 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.447711 2584 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 16:01:12.447862 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447778 2584 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:01:12.447862 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447786 2584 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:01:12.447862 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447791 2584 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:01:12.447862 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447795 2584 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:01:12.447862 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447800 2584 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:01:12.447862 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447804 2584 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:01:12.447862 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447808 2584 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:01:12.447862 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447812 2584 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:01:12.447862 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447816 2584 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:01:12.447862 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447820 2584 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:01:12.447862 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447824 2584 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:01:12.447862 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447829 2584 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:01:12.447862 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447834 2584 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:01:12.447862 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447839 2584 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:01:12.447862 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447844 2584 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:01:12.447862 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447848 2584 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:01:12.447862 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447852 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:01:12.447862 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447856 2584 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:01:12.447862 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447861 2584 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:01:12.447862 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447865 2584 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:01:12.448762 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447870 2584 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:01:12.448762 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447874 2584 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:01:12.448762 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447878 2584 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:01:12.448762 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447882 2584 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:01:12.448762 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447886 2584 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:01:12.448762 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447890 2584 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:01:12.448762 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447916 2584 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:01:12.448762 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447922 2584 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:01:12.448762 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447927 2584 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:01:12.448762 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447932 2584 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:01:12.448762 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447937 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:01:12.448762 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447942 2584 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:01:12.448762 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447947 2584 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:01:12.448762 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447969 2584 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:01:12.448762 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447974 2584 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:01:12.448762 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447979 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:01:12.448762 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447983 2584 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:01:12.448762 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447987 2584 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:01:12.448762 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447991 2584 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:01:12.448762 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.447996 2584 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:01:12.449341 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448000 2584 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:01:12.449341 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448004 2584 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:01:12.449341 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448008 2584 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:01:12.449341 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448012 2584 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:01:12.449341 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448017 2584 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:01:12.449341 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448021 2584 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:01:12.449341 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448025 2584 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:01:12.449341 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448029 2584 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:01:12.449341 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448033 2584 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:01:12.449341 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448037 2584 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:01:12.449341 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448042 2584 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:01:12.449341 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448048 2584 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:01:12.449341 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448054 2584 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:01:12.449341 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448059 2584 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:01:12.449341 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448064 2584 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:01:12.449341 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448068 2584 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:01:12.449341 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448073 2584 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:01:12.449341 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448077 2584 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:01:12.449341 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448081 2584 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:01:12.449870 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448088 2584 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:01:12.449870 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448095 2584 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:01:12.449870 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448100 2584 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:01:12.449870 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448105 2584 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:01:12.449870 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448109 2584 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:01:12.449870 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448114 2584 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:01:12.449870 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448119 2584 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:01:12.449870 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448123 2584 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:01:12.449870 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448127 2584 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:01:12.449870 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448131 2584 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:01:12.449870 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448135 2584 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:01:12.449870 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448139 2584 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:01:12.449870 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448143 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:01:12.449870 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448147 2584 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:01:12.449870 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448151 2584 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:01:12.449870 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448155 2584 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:01:12.449870 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448159 2584 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:01:12.449870 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448163 2584 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:01:12.449870 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448167 2584 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:01:12.449870 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448171 2584 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:01:12.450574 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448175 2584 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:01:12.450574 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448179 2584 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:01:12.450574 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448183 2584 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:01:12.450574 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448187 2584 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:01:12.450574 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448192 2584 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:01:12.450574 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448196 2584 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:01:12.450574 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448201 2584 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:01:12.450574 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.448209 2584 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:01:12.450574 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448364 2584 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 16:01:12.450574 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448372 2584 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 16:01:12.450574 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448377 2584 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 16:01:12.450574 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448382 2584 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 16:01:12.450574 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448387 2584 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 16:01:12.450574 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448391 2584 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 16:01:12.450574 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448396 2584 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 16:01:12.451241 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448400 2584 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 16:01:12.451241 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448404 2584 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 16:01:12.451241 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448408 2584 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 16:01:12.451241 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448413 2584 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 16:01:12.451241 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448418 2584 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 16:01:12.451241 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448422 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 16:01:12.451241 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448426 2584 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 16:01:12.451241 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448430 2584 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 16:01:12.451241 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448435 2584 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 16:01:12.451241 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448439 2584 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 16:01:12.451241 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448444 2584 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 16:01:12.451241 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448448 2584 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 16:01:12.451241 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448460 2584 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 16:01:12.451241 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448464 2584 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 16:01:12.451241 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448469 2584 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 16:01:12.451241 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448473 2584 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 16:01:12.451241 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448477 2584 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 16:01:12.451241 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448481 2584 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 16:01:12.451241 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448485 2584 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 16:01:12.451868 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448489 2584 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 16:01:12.451868 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448493 2584 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 16:01:12.451868 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448497 2584 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 16:01:12.451868 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448502 2584 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 16:01:12.451868 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448507 2584 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 16:01:12.451868 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448511 2584 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 16:01:12.451868 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448516 2584 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 16:01:12.451868 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448520 2584 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 16:01:12.451868 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448523 2584 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 16:01:12.451868 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448527 2584 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 16:01:12.451868 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448531 2584 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 16:01:12.451868 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448535 2584 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 16:01:12.451868 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448542 2584 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 16:01:12.451868 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448548 2584 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 16:01:12.451868 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448554 2584 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 16:01:12.451868 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448561 2584 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 16:01:12.451868 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448566 2584 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 16:01:12.451868 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448571 2584 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 16:01:12.451868 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448576 2584 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 16:01:12.452345 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448580 2584 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 16:01:12.452345 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448585 2584 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 16:01:12.452345 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448589 2584 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 16:01:12.452345 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448593 2584 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 16:01:12.452345 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448598 2584 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 16:01:12.452345 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448601 2584 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 16:01:12.452345 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448605 2584 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 16:01:12.452345 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448609 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 16:01:12.452345 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448614 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 16:01:12.452345 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448618 2584 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 16:01:12.452345 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448622 2584 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 16:01:12.452345 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448625 2584 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 16:01:12.452345 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448629 2584 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 16:01:12.452345 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448634 2584 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 16:01:12.452345 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448638 2584 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 16:01:12.452345 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448642 2584 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 16:01:12.452345 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448647 2584 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 16:01:12.452345 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448651 2584 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 16:01:12.452345 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448655 2584 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 16:01:12.452345 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448659 2584 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 16:01:12.452967 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448663 2584 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 16:01:12.452967 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448667 2584 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 16:01:12.452967 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448671 2584 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 16:01:12.452967 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448675 2584 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 16:01:12.452967 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448679 2584 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 16:01:12.452967 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448683 2584 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 16:01:12.452967 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448687 2584 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 16:01:12.452967 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448691 2584 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 16:01:12.452967 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448695 2584 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 16:01:12.452967 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448699 2584 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 16:01:12.452967 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448703 2584 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 16:01:12.452967 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448708 2584 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 16:01:12.452967 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448712 2584 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 16:01:12.452967 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448716 2584 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 16:01:12.452967 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448720 2584 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 16:01:12.452967 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448724 2584 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 16:01:12.452967 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448728 2584 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 16:01:12.452967 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448732 2584 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 16:01:12.452967 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448736 2584 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 16:01:12.452967 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448740 2584 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 16:01:12.453455 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:12.448744 2584 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 16:01:12.453455 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.448752 2584 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 16:01:12.453455 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.449605 2584 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 16:01:12.453455 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.452434 2584 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 16:01:12.453595 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.453532 2584 server.go:1019] "Starting client certificate rotation"
Apr 16 16:01:12.453643 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.453627 2584 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:01:12.454367 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.454355 2584 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 16:01:12.486399 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.486374 2584 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:01:12.488688 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.488667 2584 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 16:01:12.507509 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.507487 2584 log.go:25] "Validated CRI v1 runtime API"
Apr 16 16:01:12.512778 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.512764 2584 log.go:25] "Validated CRI v1 image API"
Apr 16 16:01:12.514641 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.514625 2584 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 16:01:12.518433 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.518418 2584 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 16:01:12.520052 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.520035 2584 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 e004c3e6-0f14-4caa-b2be-fbd547eb828e:/dev/nvme0n1p3 e066bf18-02dc-4ac1-86a3-53f7ae49b71c:/dev/nvme0n1p4]
Apr 16 16:01:12.520097 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.520052 2584 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 16:01:12.525946 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.525828 2584 manager.go:217] Machine: {Timestamp:2026-04-16 16:01:12.523683784 +0000 UTC m=+0.483875501 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3152922 MemoryCapacity:32812171264 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2d865d9ab002a065527a4770c221b9 SystemUUID:ec2d865d-9ab0-02a0-6552-7a4770c221b9 BootID:b7f9eda6-3ee1-499d-9b4a-e35cd662d781 Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406085632 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:c0:0d:bb:e8:3b Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:c0:0d:bb:e8:3b Speed:0 Mtu:9001} {Name:ovs-system MacAddress:66:3c:d9:72:44:84 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812171264 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 16:01:12.525946 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.525940 2584 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 16:01:12.526074 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.526025 2584 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 16:01:12.527012 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.526993 2584 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 16:01:12.527146 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.527015 2584 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-131-24.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 16:01:12.527191 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.527159 2584 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 16:01:12.527191 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.527168 2584 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 16:01:12.527191 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.527181 2584 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 16:01:12.527989 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.527978 2584 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 16:01:12.528761 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.528751 2584 state_mem.go:36] "Initialized new in-memory state store" Apr 16 16:01:12.528864 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.528855 2584 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 16:01:12.531861 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.531851 2584 kubelet.go:491] "Attempting to sync node with API server" Apr 16 16:01:12.531899 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.531871 2584 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 16:01:12.531899 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.531883 2584 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 16:01:12.531899 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.531891 2584 kubelet.go:397] "Adding apiserver pod source" Apr 16 16:01:12.532012 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.531901 2584 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 16 16:01:12.533184 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.533170 2584 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 16:01:12.533232 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.533198 2584 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 16:01:12.536631 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.536616 2584 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 16:01:12.538551 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.538534 2584 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 16:01:12.541205 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.541192 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 16:01:12.541277 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.541212 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 16:01:12.541277 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.541221 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 16:01:12.541277 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.541236 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 16:01:12.541277 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.541246 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 16:01:12.541277 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.541255 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 16:01:12.541277 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.541263 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 
16:01:12.541277 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.541272 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 16:01:12.541520 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.541282 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 16:01:12.541520 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.541292 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 16:01:12.541520 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.541332 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 16:01:12.541520 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.541348 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 16:01:12.541520 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.541351 2584 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gbz7j" Apr 16 16:01:12.542567 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.542555 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 16:01:12.542624 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.542570 2584 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 16:01:12.543094 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:12.543049 2584 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 16:01:12.545692 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:12.545666 2584 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-131-24.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group 
\"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 16:01:12.546149 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.546126 2584 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 16:01:12.546231 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.546192 2584 server.go:1295] "Started kubelet" Apr 16 16:01:12.546545 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.546515 2584 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 16:01:12.546863 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.546508 2584 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 16:01:12.546905 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.546892 2584 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 16:01:12.547312 ip-10-0-131-24 systemd[1]: Started Kubernetes Kubelet. Apr 16 16:01:12.547594 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.547566 2584 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-131-24.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 16:01:12.548096 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.548078 2584 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 16:01:12.548868 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.548851 2584 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-gbz7j" Apr 16 16:01:12.550197 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.550181 2584 server.go:317] "Adding debug handlers to kubelet server" Apr 16 16:01:12.555820 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.555801 2584 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 
16:01:12.555820 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.555817 2584 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 16:01:12.556427 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.556411 2584 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 16:01:12.556427 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.556428 2584 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 16:01:12.556548 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.556433 2584 factory.go:55] Registering systemd factory Apr 16 16:01:12.556548 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.556446 2584 factory.go:223] Registration of the systemd container factory successfully Apr 16 16:01:12.556548 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.556411 2584 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 16:01:12.556548 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.556541 2584 reconstruct.go:97] "Volume reconstruction finished" Apr 16 16:01:12.556548 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.556548 2584 reconciler.go:26] "Reconciler: start to sync state" Apr 16 16:01:12.557072 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:12.557051 2584 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-24.ec2.internal\" not found" Apr 16 16:01:12.557890 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.557859 2584 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:01:12.559912 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.559574 2584 factory.go:153] Registering CRI-O factory Apr 16 16:01:12.559912 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.559590 2584 factory.go:223] Registration of the crio container factory successfully Apr 16 16:01:12.559912 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.559646 2584 
factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 16:01:12.559912 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.559668 2584 factory.go:103] Registering Raw factory Apr 16 16:01:12.559912 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.559682 2584 manager.go:1196] Started watching for new ooms in manager Apr 16 16:01:12.560231 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.560091 2584 manager.go:319] Starting recovery of all containers Apr 16 16:01:12.560991 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:12.560947 2584 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 16:01:12.562168 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:12.562150 2584 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-131-24.ec2.internal\" not found" node="ip-10-0-131-24.ec2.internal" Apr 16 16:01:12.570794 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.570690 2584 manager.go:324] Recovery completed Apr 16 16:01:12.574671 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.574660 2584 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:01:12.580403 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.580385 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-24.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:01:12.580501 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.580419 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-24.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:01:12.580501 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.580434 2584 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-24.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:01:12.580885 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.580871 2584 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 16:01:12.580930 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.580885 2584 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 16:01:12.580930 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.580903 2584 state_mem.go:36] "Initialized new in-memory state store" Apr 16 16:01:12.583033 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.583021 2584 policy_none.go:49] "None policy: Start" Apr 16 16:01:12.583071 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.583039 2584 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 16:01:12.583071 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.583048 2584 state_mem.go:35] "Initializing new in-memory state store" Apr 16 16:01:12.622519 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.622501 2584 manager.go:341] "Starting Device Plugin manager" Apr 16 16:01:12.630115 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:12.622534 2584 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 16:01:12.630115 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.622548 2584 server.go:85] "Starting device plugin registration server" Apr 16 16:01:12.630115 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.622819 2584 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 16:01:12.630115 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.622829 2584 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 16:01:12.630115 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.622939 2584 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 
16:01:12.630115 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.623041 2584 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 16:01:12.630115 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.623051 2584 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 16:01:12.630115 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:12.623470 2584 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 16:01:12.630115 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:12.623509 2584 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-131-24.ec2.internal\" not found" Apr 16 16:01:12.692204 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.692139 2584 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 16:01:12.693499 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.693485 2584 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 16:01:12.693569 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.693510 2584 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 16:01:12.693569 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.693527 2584 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 16:01:12.693569 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.693533 2584 kubelet.go:2451] "Starting kubelet main sync loop" Apr 16 16:01:12.693714 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:12.693597 2584 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 16 16:01:12.697468 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.697451 2584 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:01:12.723312 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.723294 2584 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:01:12.724037 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.724023 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-24.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:01:12.724137 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.724055 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-24.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:01:12.724137 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.724067 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-24.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:01:12.724137 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.724088 2584 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-131-24.ec2.internal" Apr 16 16:01:12.736304 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.736286 2584 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-131-24.ec2.internal" Apr 16 16:01:12.736354 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:12.736306 2584 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-131-24.ec2.internal\": node \"ip-10-0-131-24.ec2.internal\" not found" Apr 16 16:01:12.765759 
ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:12.765737 2584 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-24.ec2.internal\" not found" Apr 16 16:01:12.793941 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.793907 2584 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-24.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-131-24.ec2.internal"] Apr 16 16:01:12.794014 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.793990 2584 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:01:12.794744 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.794728 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-24.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:01:12.794797 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.794758 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-24.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:01:12.794797 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.794768 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-24.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:01:12.796173 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.796161 2584 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:01:12.796330 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.796317 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-24.ec2.internal" Apr 16 16:01:12.796366 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.796346 2584 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:01:12.796906 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.796892 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-24.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:01:12.796973 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.796913 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-24.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:01:12.796973 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.796932 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-24.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:01:12.796973 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.796942 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-24.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:01:12.796973 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.796916 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-24.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:01:12.797112 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.796997 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-24.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:01:12.798632 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.798618 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-24.ec2.internal" Apr 16 16:01:12.798689 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.798643 2584 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 16:01:12.799415 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.799397 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-24.ec2.internal" event="NodeHasSufficientMemory" Apr 16 16:01:12.799508 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.799424 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-24.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 16:01:12.799508 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.799437 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-24.ec2.internal" event="NodeHasSufficientPID" Apr 16 16:01:12.816480 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:12.816456 2584 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-24.ec2.internal\" not found" node="ip-10-0-131-24.ec2.internal" Apr 16 16:01:12.820646 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:12.820628 2584 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-131-24.ec2.internal\" not found" node="ip-10-0-131-24.ec2.internal" Apr 16 16:01:12.866520 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:12.866504 2584 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-24.ec2.internal\" not found" Apr 16 16:01:12.958386 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.958344 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0fb0a319680077d647b66b0c3297064a-etc-kube\") pod 
\"kube-rbac-proxy-crio-ip-10-0-131-24.ec2.internal\" (UID: \"0fb0a319680077d647b66b0c3297064a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-24.ec2.internal" Apr 16 16:01:12.958386 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.958374 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fb0a319680077d647b66b0c3297064a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-24.ec2.internal\" (UID: \"0fb0a319680077d647b66b0c3297064a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-24.ec2.internal" Apr 16 16:01:12.958472 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:12.958398 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bb825418a7522e6122bfaed620c21321-config\") pod \"kube-apiserver-proxy-ip-10-0-131-24.ec2.internal\" (UID: \"bb825418a7522e6122bfaed620c21321\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-24.ec2.internal" Apr 16 16:01:12.967406 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:12.967383 2584 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-24.ec2.internal\" not found" Apr 16 16:01:13.059034 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.059014 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0fb0a319680077d647b66b0c3297064a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-24.ec2.internal\" (UID: \"0fb0a319680077d647b66b0c3297064a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-24.ec2.internal" Apr 16 16:01:13.059099 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.059038 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/0fb0a319680077d647b66b0c3297064a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-24.ec2.internal\" (UID: \"0fb0a319680077d647b66b0c3297064a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-24.ec2.internal" Apr 16 16:01:13.059099 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.059055 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bb825418a7522e6122bfaed620c21321-config\") pod \"kube-apiserver-proxy-ip-10-0-131-24.ec2.internal\" (UID: \"bb825418a7522e6122bfaed620c21321\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-131-24.ec2.internal" Apr 16 16:01:13.059162 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.059108 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0fb0a319680077d647b66b0c3297064a-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-131-24.ec2.internal\" (UID: \"0fb0a319680077d647b66b0c3297064a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-24.ec2.internal" Apr 16 16:01:13.059162 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.059133 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0fb0a319680077d647b66b0c3297064a-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-131-24.ec2.internal\" (UID: \"0fb0a319680077d647b66b0c3297064a\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-24.ec2.internal" Apr 16 16:01:13.059162 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.059108 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/bb825418a7522e6122bfaed620c21321-config\") pod \"kube-apiserver-proxy-ip-10-0-131-24.ec2.internal\" (UID: \"bb825418a7522e6122bfaed620c21321\") " 
pod="kube-system/kube-apiserver-proxy-ip-10-0-131-24.ec2.internal" Apr 16 16:01:13.068130 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:13.068110 2584 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-24.ec2.internal\" not found" Apr 16 16:01:13.122282 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.122265 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-24.ec2.internal" Apr 16 16:01:13.123422 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.123405 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-24.ec2.internal" Apr 16 16:01:13.168529 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:13.168509 2584 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-24.ec2.internal\" not found" Apr 16 16:01:13.269024 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:13.268978 2584 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-24.ec2.internal\" not found" Apr 16 16:01:13.369448 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:13.369427 2584 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-131-24.ec2.internal\" not found" Apr 16 16:01:13.382201 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.382169 2584 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:01:13.429472 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.429453 2584 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:01:13.453343 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.453324 2584 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 16:01:13.453720 
ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.453440 2584 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 16:01:13.453720 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.453446 2584 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 16:01:13.453720 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.453458 2584 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 16:01:13.453720 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.453457 2584 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 16:01:13.456475 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.456460 2584 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-24.ec2.internal" Apr 16 16:01:13.477880 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.477854 2584 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 16:01:13.478810 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.478796 2584 
kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-24.ec2.internal" Apr 16 16:01:13.485719 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.485705 2584 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 16:01:13.532986 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.532923 2584 apiserver.go:52] "Watching apiserver" Apr 16 16:01:13.540476 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.540448 2584 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 16:01:13.540837 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.540817 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-65x5s","kube-system/kube-apiserver-proxy-ip-10-0-131-24.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m","openshift-image-registry/node-ca-nffwb","openshift-multus/multus-additional-cni-plugins-zhztn","openshift-multus/multus-trctq","kube-system/konnectivity-agent-mx7rd","openshift-cluster-node-tuning-operator/tuned-kzd5d","openshift-dns/node-resolver-lrjg9","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-24.ec2.internal","openshift-multus/network-metrics-daemon-snrt9","openshift-network-diagnostics/network-check-target-xg7zh","openshift-network-operator/iptables-alerter-p96jv"] Apr 16 16:01:13.543645 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.543626 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-mx7rd" Apr 16 16:01:13.543747 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.543732 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m" Apr 16 16:01:13.546172 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.546131 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-66kbc\"" Apr 16 16:01:13.546262 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.546192 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 16:01:13.546262 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.546229 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 16:01:13.546370 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.546327 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 16:01:13.546547 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.546437 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-nffwb" Apr 16 16:01:13.546547 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.546459 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-9lfkb\"" Apr 16 16:01:13.546547 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.546476 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 16:01:13.546547 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.546525 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 16:01:13.548559 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.548539 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-5hqq4\"" Apr 16 16:01:13.548559 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.548541 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 16:01:13.548785 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.548767 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 16:01:13.548891 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.548874 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 16:01:13.549463 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.549448 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zhztn" Apr 16 16:01:13.550649 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.550608 2584 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 15:56:12 +0000 UTC" deadline="2027-12-27 11:47:53.890412246 +0000 UTC" Apr 16 16:01:13.550649 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.550646 2584 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14875h46m40.339768705s" Apr 16 16:01:13.550649 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.550638 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-trctq" Apr 16 16:01:13.550851 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.550787 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.551226 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.551210 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 16:01:13.551508 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.551494 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 16:01:13.551613 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.551501 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 16:01:13.551613 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.551577 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 16:01:13.551723 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.551648 2584 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-9tznm\"" Apr 16 16:01:13.551723 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.551651 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 16:01:13.552048 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.552033 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.552809 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.552794 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 16:01:13.552993 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.552978 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 16:01:13.553052 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.553003 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-q22dv\"" Apr 16 16:01:13.553142 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.553129 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 16:01:13.553202 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.553195 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 16:01:13.553252 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.553207 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 16:01:13.553252 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.553219 2584 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 16:01:13.553252 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.553198 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 16:01:13.553497 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.553480 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-9d7ct\"" Apr 16 16:01:13.553572 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.553560 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lrjg9" Apr 16 16:01:13.553913 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.553900 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:01:13.554241 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.554228 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-5wrm7\"" Apr 16 16:01:13.554828 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.554813 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 16:01:13.555039 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.555024 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9" Apr 16 16:01:13.555143 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:13.555114 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f" Apr 16 16:01:13.555711 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.555691 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 16:01:13.555711 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.555689 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-sxb74\"" Apr 16 16:01:13.555811 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.555735 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 16:01:13.556098 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.556082 2584 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 16:01:13.556098 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.556095 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh" Apr 16 16:01:13.556213 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:13.556145 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a" Apr 16 16:01:13.557377 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.557361 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-p96jv" Apr 16 16:01:13.559367 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.559351 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 16:01:13.559461 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.559435 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-7xt72\"" Apr 16 16:01:13.559514 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.559473 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:01:13.559566 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.559546 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 16:01:13.561983 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.561943 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-host\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.562058 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562011 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6ffr\" (UniqueName: \"kubernetes.io/projected/82145526-4c6f-43c3-8850-84adf5e445e9-kube-api-access-k6ffr\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.562058 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562038 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"device-dir\" (UniqueName: \"kubernetes.io/host-path/8312a792-e613-4889-91d5-3f89d0f32c1d-device-dir\") pod \"aws-ebs-csi-driver-node-5xz8m\" (UID: \"8312a792-e613-4889-91d5-3f89d0f32c1d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m" Apr 16 16:01:13.562145 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562064 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-system-cni-dir\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.562145 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562107 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e6b59ea-f783-49ae-902d-b33f1ca6c234-cni-binary-copy\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.562145 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562124 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-host-slash\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.562234 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562157 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-host-run-ovn-kubernetes\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.562234 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562181 
2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-host-cni-bin\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.562234 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562195 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldk9f\" (UniqueName: \"kubernetes.io/projected/343ffde8-d18b-4f37-b099-b7e664a816d3-kube-api-access-ldk9f\") pod \"iptables-alerter-p96jv\" (UID: \"343ffde8-d18b-4f37-b099-b7e664a816d3\") " pod="openshift-network-operator/iptables-alerter-p96jv" Apr 16 16:01:13.562234 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562215 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8312a792-e613-4889-91d5-3f89d0f32c1d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5xz8m\" (UID: \"8312a792-e613-4889-91d5-3f89d0f32c1d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m" Apr 16 16:01:13.562357 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562238 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-multus-socket-dir-parent\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.562357 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562261 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xf64\" (UniqueName: \"kubernetes.io/projected/4fec1617-e681-417d-aaff-5272a1fd3065-kube-api-access-8xf64\") pod \"tuned-kzd5d\" (UID: 
\"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.562357 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562315 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-etc-openvswitch\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.562357 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562343 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-cnibin\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.562529 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562360 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-host-var-lib-kubelet\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.562529 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562381 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8e6b59ea-f783-49ae-902d-b33f1ca6c234-multus-daemon-config\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.562529 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562396 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-etc-modprobe-d\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.562529 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562410 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-os-release\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.562529 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562424 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr7k9\" (UniqueName: \"kubernetes.io/projected/80b825a5-15e5-402b-ae57-d2e283b0e8f8-kube-api-access-rr7k9\") pod \"node-resolver-lrjg9\" (UID: \"80b825a5-15e5-402b-ae57-d2e283b0e8f8\") " pod="openshift-dns/node-resolver-lrjg9" Apr 16 16:01:13.562529 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562482 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/97141e56-55f0-4d10-ba67-fabe3d76d95d-cni-binary-copy\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn" Apr 16 16:01:13.562829 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562557 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/97141e56-55f0-4d10-ba67-fabe3d76d95d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn" Apr 16 16:01:13.562829 ip-10-0-131-24 
kubenswrapper[2584]: I0416 16:01:13.562609 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrdfc\" (UniqueName: \"kubernetes.io/projected/97141e56-55f0-4d10-ba67-fabe3d76d95d-kube-api-access-rrdfc\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn" Apr 16 16:01:13.562829 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562666 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-host-kubelet\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.562829 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562711 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ldqfz\" (UniqueName: \"kubernetes.io/projected/07434d52-3b5d-4eaf-ba37-9f1b957e938a-kube-api-access-ldqfz\") pod \"node-ca-nffwb\" (UID: \"07434d52-3b5d-4eaf-ba37-9f1b957e938a\") " pod="openshift-image-registry/node-ca-nffwb" Apr 16 16:01:13.562829 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562737 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97141e56-55f0-4d10-ba67-fabe3d76d95d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn" Apr 16 16:01:13.562829 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562761 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/82145526-4c6f-43c3-8850-84adf5e445e9-env-overrides\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.563488 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562884 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8312a792-e613-4889-91d5-3f89d0f32c1d-socket-dir\") pod \"aws-ebs-csi-driver-node-5xz8m\" (UID: \"8312a792-e613-4889-91d5-3f89d0f32c1d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m" Apr 16 16:01:13.563488 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562920 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-sys\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.563488 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.562991 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/80b825a5-15e5-402b-ae57-d2e283b0e8f8-hosts-file\") pod \"node-resolver-lrjg9\" (UID: \"80b825a5-15e5-402b-ae57-d2e283b0e8f8\") " pod="openshift-dns/node-resolver-lrjg9" Apr 16 16:01:13.563488 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563009 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/97141e56-55f0-4d10-ba67-fabe3d76d95d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn" Apr 16 16:01:13.563488 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563031 2584 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07434d52-3b5d-4eaf-ba37-9f1b957e938a-host\") pod \"node-ca-nffwb\" (UID: \"07434d52-3b5d-4eaf-ba37-9f1b957e938a\") " pod="openshift-image-registry/node-ca-nffwb" Apr 16 16:01:13.563488 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563072 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-etc-systemd\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.563488 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563087 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97141e56-55f0-4d10-ba67-fabe3d76d95d-system-cni-dir\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn" Apr 16 16:01:13.563488 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563118 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-host-run-netns\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.563488 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563145 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-var-lib-openvswitch\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.563488 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563164 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7h5s\" (UniqueName: \"kubernetes.io/projected/8312a792-e613-4889-91d5-3f89d0f32c1d-kube-api-access-x7h5s\") pod \"aws-ebs-csi-driver-node-5xz8m\" (UID: \"8312a792-e613-4889-91d5-3f89d0f32c1d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m" Apr 16 16:01:13.563488 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563179 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-hostroot\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.563488 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563202 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-multus-conf-dir\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.563488 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563265 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/343ffde8-d18b-4f37-b099-b7e664a816d3-host-slash\") pod \"iptables-alerter-p96jv\" (UID: \"343ffde8-d18b-4f37-b099-b7e664a816d3\") " pod="openshift-network-operator/iptables-alerter-p96jv" Apr 16 16:01:13.563488 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563297 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-etc-sysconfig\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.563488 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563325 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-log-socket\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.563488 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563353 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wgm8h\" (UniqueName: \"kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h\") pod \"network-check-target-xg7zh\" (UID: \"a3db670c-92c1-42b9-94ed-04726fe1d07a\") " pod="openshift-network-diagnostics/network-check-target-xg7zh" Apr 16 16:01:13.564165 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563379 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/97141e56-55f0-4d10-ba67-fabe3d76d95d-cnibin\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn" Apr 16 16:01:13.564165 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563401 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-run-ovn\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.564165 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563434 2584 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs\") pod \"network-metrics-daemon-snrt9\" (UID: \"d280ea9a-de22-4d14-8870-0fbcbb459f8f\") " pod="openshift-multus/network-metrics-daemon-snrt9" Apr 16 16:01:13.564165 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563466 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nmxd\" (UniqueName: \"kubernetes.io/projected/8e6b59ea-f783-49ae-902d-b33f1ca6c234-kube-api-access-4nmxd\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.564165 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563492 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4fec1617-e681-417d-aaff-5272a1fd3065-etc-tuned\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.564165 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563506 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-host-cni-netd\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.564165 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563521 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8312a792-e613-4889-91d5-3f89d0f32c1d-etc-selinux\") pod \"aws-ebs-csi-driver-node-5xz8m\" (UID: \"8312a792-e613-4889-91d5-3f89d0f32c1d\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m" Apr 16 16:01:13.564165 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563544 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-host-run-multus-certs\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.564165 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563567 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-systemd-units\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.564165 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563590 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-node-log\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.564165 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563607 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6cmx6\" (UniqueName: \"kubernetes.io/projected/d280ea9a-de22-4d14-8870-0fbcbb459f8f-kube-api-access-6cmx6\") pod \"network-metrics-daemon-snrt9\" (UID: \"d280ea9a-de22-4d14-8870-0fbcbb459f8f\") " pod="openshift-multus/network-metrics-daemon-snrt9" Apr 16 16:01:13.564165 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563622 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/343ffde8-d18b-4f37-b099-b7e664a816d3-iptables-alerter-script\") pod \"iptables-alerter-p96jv\" (UID: \"343ffde8-d18b-4f37-b099-b7e664a816d3\") " pod="openshift-network-operator/iptables-alerter-p96jv" Apr 16 16:01:13.564165 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563636 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a7e8c4ea-3157-441e-873c-cf283ecb2c2a-agent-certs\") pod \"konnectivity-agent-mx7rd\" (UID: \"a7e8c4ea-3157-441e-873c-cf283ecb2c2a\") " pod="kube-system/konnectivity-agent-mx7rd" Apr 16 16:01:13.564165 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563649 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-host-run-netns\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.564165 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563662 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-etc-kubernetes\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.564165 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563676 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-etc-kubernetes\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.564694 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563694 2584 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-multus-cni-dir\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.564694 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563712 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/82145526-4c6f-43c3-8850-84adf5e445e9-ovnkube-config\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.564694 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563751 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a7e8c4ea-3157-441e-873c-cf283ecb2c2a-konnectivity-ca\") pod \"konnectivity-agent-mx7rd\" (UID: \"a7e8c4ea-3157-441e-873c-cf283ecb2c2a\") " pod="kube-system/konnectivity-agent-mx7rd" Apr 16 16:01:13.564694 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563776 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8312a792-e613-4889-91d5-3f89d0f32c1d-registration-dir\") pod \"aws-ebs-csi-driver-node-5xz8m\" (UID: \"8312a792-e613-4889-91d5-3f89d0f32c1d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m" Apr 16 16:01:13.564694 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563821 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-host-run-k8s-cni-cncf-io\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " 
pod="openshift-multus/multus-trctq" Apr 16 16:01:13.564694 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563854 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-etc-sysctl-d\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.564694 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563904 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/80b825a5-15e5-402b-ae57-d2e283b0e8f8-tmp-dir\") pod \"node-resolver-lrjg9\" (UID: \"80b825a5-15e5-402b-ae57-d2e283b0e8f8\") " pod="openshift-dns/node-resolver-lrjg9" Apr 16 16:01:13.564694 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563920 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-run-systemd\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.564694 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563933 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-run-openvswitch\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.564694 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.563974 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.564694 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.564007 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-run\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.564694 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.564019 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4fec1617-e681-417d-aaff-5272a1fd3065-tmp\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.564694 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.564032 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/82145526-4c6f-43c3-8850-84adf5e445e9-ovn-node-metrics-cert\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.564694 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.564048 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8312a792-e613-4889-91d5-3f89d0f32c1d-sys-fs\") pod \"aws-ebs-csi-driver-node-5xz8m\" (UID: \"8312a792-e613-4889-91d5-3f89d0f32c1d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m" Apr 16 16:01:13.564694 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.564106 2584 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/07434d52-3b5d-4eaf-ba37-9f1b957e938a-serviceca\") pod \"node-ca-nffwb\" (UID: \"07434d52-3b5d-4eaf-ba37-9f1b957e938a\") " pod="openshift-image-registry/node-ca-nffwb" Apr 16 16:01:13.564694 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.564137 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-host-var-lib-cni-multus\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.565527 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.564159 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-lib-modules\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.565527 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.564175 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-var-lib-kubelet\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.565527 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.564207 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/97141e56-55f0-4d10-ba67-fabe3d76d95d-os-release\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " 
pod="openshift-multus/multus-additional-cni-plugins-zhztn" Apr 16 16:01:13.565527 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.564226 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/82145526-4c6f-43c3-8850-84adf5e445e9-ovnkube-script-lib\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.565527 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.564261 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-host-var-lib-cni-bin\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.565527 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.564285 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-etc-sysctl-conf\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.570002 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.569984 2584 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 16:01:13.596473 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.596454 2584 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-xgj5l" Apr 16 16:01:13.604992 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.604978 2584 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-xgj5l" 
Apr 16 16:01:13.657202 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.657188 2584 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 16:01:13.665355 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665338 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-log-socket\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.665520 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665363 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgm8h\" (UniqueName: \"kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h\") pod \"network-check-target-xg7zh\" (UID: \"a3db670c-92c1-42b9-94ed-04726fe1d07a\") " pod="openshift-network-diagnostics/network-check-target-xg7zh" Apr 16 16:01:13.665520 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665378 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/97141e56-55f0-4d10-ba67-fabe3d76d95d-cnibin\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn" Apr 16 16:01:13.665520 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665392 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-run-ovn\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.665520 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665406 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs\") pod \"network-metrics-daemon-snrt9\" (UID: \"d280ea9a-de22-4d14-8870-0fbcbb459f8f\") " pod="openshift-multus/network-metrics-daemon-snrt9" Apr 16 16:01:13.665520 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665420 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4nmxd\" (UniqueName: \"kubernetes.io/projected/8e6b59ea-f783-49ae-902d-b33f1ca6c234-kube-api-access-4nmxd\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.665520 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665481 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-log-socket\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.665520 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665506 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/97141e56-55f0-4d10-ba67-fabe3d76d95d-cnibin\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn" Apr 16 16:01:13.665520 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665509 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-run-ovn\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.665825 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665525 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/4fec1617-e681-417d-aaff-5272a1fd3065-etc-tuned\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.665825 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665549 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-host-cni-netd\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.665825 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:13.665549 2584 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:01:13.665825 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665587 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8312a792-e613-4889-91d5-3f89d0f32c1d-etc-selinux\") pod \"aws-ebs-csi-driver-node-5xz8m\" (UID: \"8312a792-e613-4889-91d5-3f89d0f32c1d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m" Apr 16 16:01:13.665825 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665599 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-host-cni-netd\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.665825 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:13.665617 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs podName:d280ea9a-de22-4d14-8870-0fbcbb459f8f nodeName:}" failed. 
No retries permitted until 2026-04-16 16:01:14.165585721 +0000 UTC m=+2.125777425 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs") pod "network-metrics-daemon-snrt9" (UID: "d280ea9a-de22-4d14-8870-0fbcbb459f8f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:01:13.665825 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665704 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-host-run-multus-certs\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.665825 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665734 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-systemd-units\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.665825 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665737 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/8312a792-e613-4889-91d5-3f89d0f32c1d-etc-selinux\") pod \"aws-ebs-csi-driver-node-5xz8m\" (UID: \"8312a792-e613-4889-91d5-3f89d0f32c1d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m" Apr 16 16:01:13.665825 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665759 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-node-log\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.665825 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665775 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-host-run-multus-certs\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.665825 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665784 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-systemd-units\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.665825 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665796 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-node-log\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.665825 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665810 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6cmx6\" (UniqueName: \"kubernetes.io/projected/d280ea9a-de22-4d14-8870-0fbcbb459f8f-kube-api-access-6cmx6\") pod \"network-metrics-daemon-snrt9\" (UID: \"d280ea9a-de22-4d14-8870-0fbcbb459f8f\") " pod="openshift-multus/network-metrics-daemon-snrt9" Apr 16 16:01:13.666469 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665838 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/343ffde8-d18b-4f37-b099-b7e664a816d3-iptables-alerter-script\") pod \"iptables-alerter-p96jv\" (UID: 
\"343ffde8-d18b-4f37-b099-b7e664a816d3\") " pod="openshift-network-operator/iptables-alerter-p96jv" Apr 16 16:01:13.666469 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665863 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a7e8c4ea-3157-441e-873c-cf283ecb2c2a-agent-certs\") pod \"konnectivity-agent-mx7rd\" (UID: \"a7e8c4ea-3157-441e-873c-cf283ecb2c2a\") " pod="kube-system/konnectivity-agent-mx7rd" Apr 16 16:01:13.666469 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665891 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-host-run-netns\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.666469 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.665970 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-etc-kubernetes\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.666469 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666003 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-etc-kubernetes\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.666469 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666011 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-host-run-netns\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " 
pod="openshift-multus/multus-trctq" Apr 16 16:01:13.666469 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666027 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-multus-cni-dir\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.666469 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666067 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/82145526-4c6f-43c3-8850-84adf5e445e9-ovnkube-config\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.666469 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666071 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-etc-kubernetes\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.666469 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666083 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-multus-cni-dir\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.666469 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666092 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a7e8c4ea-3157-441e-873c-cf283ecb2c2a-konnectivity-ca\") pod \"konnectivity-agent-mx7rd\" (UID: \"a7e8c4ea-3157-441e-873c-cf283ecb2c2a\") " pod="kube-system/konnectivity-agent-mx7rd" Apr 16 16:01:13.666469 
ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666117 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8312a792-e613-4889-91d5-3f89d0f32c1d-registration-dir\") pod \"aws-ebs-csi-driver-node-5xz8m\" (UID: \"8312a792-e613-4889-91d5-3f89d0f32c1d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m"
Apr 16 16:01:13.666469 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666141 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-host-run-k8s-cni-cncf-io\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq"
Apr 16 16:01:13.666469 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666164 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-etc-sysctl-d\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d"
Apr 16 16:01:13.666469 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666190 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/80b825a5-15e5-402b-ae57-d2e283b0e8f8-tmp-dir\") pod \"node-resolver-lrjg9\" (UID: \"80b825a5-15e5-402b-ae57-d2e283b0e8f8\") " pod="openshift-dns/node-resolver-lrjg9"
Apr 16 16:01:13.666469 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666192 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-etc-kubernetes\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d"
Apr 16 16:01:13.666469 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666227 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-run-systemd\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.666469 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666257 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8312a792-e613-4889-91d5-3f89d0f32c1d-registration-dir\") pod \"aws-ebs-csi-driver-node-5xz8m\" (UID: \"8312a792-e613-4889-91d5-3f89d0f32c1d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m"
Apr 16 16:01:13.667270 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666257 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-run-openvswitch\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.667270 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666284 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-etc-sysctl-d\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d"
Apr 16 16:01:13.667270 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666290 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.667270 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666289 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-run-openvswitch\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.667270 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666231 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-host-run-k8s-cni-cncf-io\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq"
Apr 16 16:01:13.667270 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666322 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-run\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d"
Apr 16 16:01:13.667270 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666348 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-run-systemd\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.667270 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666377 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4fec1617-e681-417d-aaff-5272a1fd3065-tmp\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d"
Apr 16 16:01:13.667270 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666389 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-run\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d"
Apr 16 16:01:13.667270 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666429 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/82145526-4c6f-43c3-8850-84adf5e445e9-ovn-node-metrics-cert\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.667270 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666429 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.667270 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666481 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8312a792-e613-4889-91d5-3f89d0f32c1d-sys-fs\") pod \"aws-ebs-csi-driver-node-5xz8m\" (UID: \"8312a792-e613-4889-91d5-3f89d0f32c1d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m"
Apr 16 16:01:13.667270 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666535 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/07434d52-3b5d-4eaf-ba37-9f1b957e938a-serviceca\") pod \"node-ca-nffwb\" (UID: \"07434d52-3b5d-4eaf-ba37-9f1b957e938a\") " pod="openshift-image-registry/node-ca-nffwb"
Apr 16 16:01:13.667270 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666560 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/8312a792-e613-4889-91d5-3f89d0f32c1d-sys-fs\") pod \"aws-ebs-csi-driver-node-5xz8m\" (UID: \"8312a792-e613-4889-91d5-3f89d0f32c1d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m"
Apr 16 16:01:13.667270 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666562 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-host-var-lib-cni-multus\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq"
Apr 16 16:01:13.667270 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666592 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-host-var-lib-cni-multus\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq"
Apr 16 16:01:13.667270 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666601 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-lib-modules\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d"
Apr 16 16:01:13.668070 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666609 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/343ffde8-d18b-4f37-b099-b7e664a816d3-iptables-alerter-script\") pod \"iptables-alerter-p96jv\" (UID: \"343ffde8-d18b-4f37-b099-b7e664a816d3\") " pod="openshift-network-operator/iptables-alerter-p96jv"
Apr 16 16:01:13.668070 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666675 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-var-lib-kubelet\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d"
Apr 16 16:01:13.668070 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666636 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-var-lib-kubelet\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d"
Apr 16 16:01:13.668070 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666695 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-lib-modules\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d"
Apr 16 16:01:13.668070 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666702 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/80b825a5-15e5-402b-ae57-d2e283b0e8f8-tmp-dir\") pod \"node-resolver-lrjg9\" (UID: \"80b825a5-15e5-402b-ae57-d2e283b0e8f8\") " pod="openshift-dns/node-resolver-lrjg9"
Apr 16 16:01:13.668070 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666719 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/82145526-4c6f-43c3-8850-84adf5e445e9-ovnkube-config\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.668070 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666731 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/a7e8c4ea-3157-441e-873c-cf283ecb2c2a-konnectivity-ca\") pod \"konnectivity-agent-mx7rd\" (UID: \"a7e8c4ea-3157-441e-873c-cf283ecb2c2a\") " pod="kube-system/konnectivity-agent-mx7rd"
Apr 16 16:01:13.668070 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666720 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/97141e56-55f0-4d10-ba67-fabe3d76d95d-os-release\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn"
Apr 16 16:01:13.668070 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666758 2584 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 16 16:01:13.668070 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666772 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/97141e56-55f0-4d10-ba67-fabe3d76d95d-os-release\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn"
Apr 16 16:01:13.668070 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666775 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/82145526-4c6f-43c3-8850-84adf5e445e9-ovnkube-script-lib\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.668070 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666801 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-host-var-lib-cni-bin\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq"
Apr 16 16:01:13.668070 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666828 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-etc-sysctl-conf\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d"
Apr 16 16:01:13.668070 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666852 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-host-var-lib-cni-bin\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq"
Apr 16 16:01:13.668070 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666855 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-host\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d"
Apr 16 16:01:13.668070 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666905 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k6ffr\" (UniqueName: \"kubernetes.io/projected/82145526-4c6f-43c3-8850-84adf5e445e9-kube-api-access-k6ffr\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.668070 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666931 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8312a792-e613-4889-91d5-3f89d0f32c1d-device-dir\") pod \"aws-ebs-csi-driver-node-5xz8m\" (UID: \"8312a792-e613-4889-91d5-3f89d0f32c1d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m"
Apr 16 16:01:13.668070 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666976 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-system-cni-dir\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq"
Apr 16 16:01:13.668892 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.666984 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/07434d52-3b5d-4eaf-ba37-9f1b957e938a-serviceca\") pod \"node-ca-nffwb\" (UID: \"07434d52-3b5d-4eaf-ba37-9f1b957e938a\") " pod="openshift-image-registry/node-ca-nffwb"
Apr 16 16:01:13.668892 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667000 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e6b59ea-f783-49ae-902d-b33f1ca6c234-cni-binary-copy\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq"
Apr 16 16:01:13.668892 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667024 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-host-slash\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.668892 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667040 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-host\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d"
Apr 16 16:01:13.668892 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667047 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-host-run-ovn-kubernetes\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.668892 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667098 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-system-cni-dir\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq"
Apr 16 16:01:13.668892 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667098 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-host-run-ovn-kubernetes\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.668892 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667149 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/8312a792-e613-4889-91d5-3f89d0f32c1d-device-dir\") pod \"aws-ebs-csi-driver-node-5xz8m\" (UID: \"8312a792-e613-4889-91d5-3f89d0f32c1d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m"
Apr 16 16:01:13.668892 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667176 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-host-cni-bin\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.668892 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667201 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldk9f\" (UniqueName: \"kubernetes.io/projected/343ffde8-d18b-4f37-b099-b7e664a816d3-kube-api-access-ldk9f\") pod \"iptables-alerter-p96jv\" (UID: \"343ffde8-d18b-4f37-b099-b7e664a816d3\") " pod="openshift-network-operator/iptables-alerter-p96jv"
Apr 16 16:01:13.668892 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667225 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8312a792-e613-4889-91d5-3f89d0f32c1d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5xz8m\" (UID: \"8312a792-e613-4889-91d5-3f89d0f32c1d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m"
Apr 16 16:01:13.668892 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667250 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-multus-socket-dir-parent\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq"
Apr 16 16:01:13.668892 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667277 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xf64\" (UniqueName: \"kubernetes.io/projected/4fec1617-e681-417d-aaff-5272a1fd3065-kube-api-access-8xf64\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d"
Apr 16 16:01:13.668892 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667272 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-host-slash\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.668892 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667324 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-etc-openvswitch\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.668892 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667349 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-cnibin\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq"
Apr 16 16:01:13.668892 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667356 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/82145526-4c6f-43c3-8850-84adf5e445e9-ovnkube-script-lib\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.668892 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667374 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-host-var-lib-kubelet\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq"
Apr 16 16:01:13.669609 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667402 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8e6b59ea-f783-49ae-902d-b33f1ca6c234-multus-daemon-config\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq"
Apr 16 16:01:13.669609 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667426 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-etc-modprobe-d\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d"
Apr 16 16:01:13.669609 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667431 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-host-cni-bin\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.669609 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667448 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-os-release\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq"
Apr 16 16:01:13.669609 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667475 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-etc-openvswitch\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.669609 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667475 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rr7k9\" (UniqueName: \"kubernetes.io/projected/80b825a5-15e5-402b-ae57-d2e283b0e8f8-kube-api-access-rr7k9\") pod \"node-resolver-lrjg9\" (UID: \"80b825a5-15e5-402b-ae57-d2e283b0e8f8\") " pod="openshift-dns/node-resolver-lrjg9"
Apr 16 16:01:13.669609 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667518 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/97141e56-55f0-4d10-ba67-fabe3d76d95d-cni-binary-copy\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn"
Apr 16 16:01:13.669609 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667525 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8312a792-e613-4889-91d5-3f89d0f32c1d-kubelet-dir\") pod \"aws-ebs-csi-driver-node-5xz8m\" (UID: \"8312a792-e613-4889-91d5-3f89d0f32c1d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m"
Apr 16 16:01:13.669609 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667578 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-multus-socket-dir-parent\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq"
Apr 16 16:01:13.669609 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667559 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/97141e56-55f0-4d10-ba67-fabe3d76d95d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn"
Apr 16 16:01:13.669609 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667618 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e6b59ea-f783-49ae-902d-b33f1ca6c234-cni-binary-copy\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq"
Apr 16 16:01:13.669609 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667621 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrdfc\" (UniqueName: \"kubernetes.io/projected/97141e56-55f0-4d10-ba67-fabe3d76d95d-kube-api-access-rrdfc\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn"
Apr 16 16:01:13.669609 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667638 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-etc-modprobe-d\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d"
Apr 16 16:01:13.669609 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667669 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-host-kubelet\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.669609 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667671 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-host-var-lib-kubelet\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq"
Apr 16 16:01:13.669609 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667722 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ldqfz\" (UniqueName: \"kubernetes.io/projected/07434d52-3b5d-4eaf-ba37-9f1b957e938a-kube-api-access-ldqfz\") pod \"node-ca-nffwb\" (UID: \"07434d52-3b5d-4eaf-ba37-9f1b957e938a\") " pod="openshift-image-registry/node-ca-nffwb"
Apr 16 16:01:13.669609 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667759 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97141e56-55f0-4d10-ba67-fabe3d76d95d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn"
Apr 16 16:01:13.670361 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667784 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/82145526-4c6f-43c3-8850-84adf5e445e9-env-overrides\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.670361 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667826 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8312a792-e613-4889-91d5-3f89d0f32c1d-socket-dir\") pod \"aws-ebs-csi-driver-node-5xz8m\" (UID: \"8312a792-e613-4889-91d5-3f89d0f32c1d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m"
Apr 16 16:01:13.670361 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667850 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-sys\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d"
Apr 16 16:01:13.670361 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667875 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/80b825a5-15e5-402b-ae57-d2e283b0e8f8-hosts-file\") pod \"node-resolver-lrjg9\" (UID: \"80b825a5-15e5-402b-ae57-d2e283b0e8f8\") " pod="openshift-dns/node-resolver-lrjg9"
Apr 16 16:01:13.670361 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667901 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/97141e56-55f0-4d10-ba67-fabe3d76d95d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn"
Apr 16 16:01:13.670361 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667926 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07434d52-3b5d-4eaf-ba37-9f1b957e938a-host\") pod \"node-ca-nffwb\" (UID: \"07434d52-3b5d-4eaf-ba37-9f1b957e938a\") " pod="openshift-image-registry/node-ca-nffwb"
Apr 16 16:01:13.670361 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667968 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-etc-systemd\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d"
Apr 16 16:01:13.670361 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667995 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97141e56-55f0-4d10-ba67-fabe3d76d95d-system-cni-dir\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn"
Apr 16 16:01:13.670361 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.667999 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-etc-sysctl-conf\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d"
Apr 16 16:01:13.670361 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668021 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-host-run-netns\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.670361 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668071 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-host-run-netns\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.670361 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668077 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-var-lib-openvswitch\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.670361 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668082 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-cnibin\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq"
Apr 16 16:01:13.670361 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668114 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-host-kubelet\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:13.670361 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668159 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-sys\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d"
Apr 16 16:01:13.670361 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668160 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/8e6b59ea-f783-49ae-902d-b33f1ca6c234-multus-daemon-config\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq"
Apr 16 16:01:13.670361 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668240 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-os-release\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq"
Apr 16 16:01:13.670361 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668308 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8312a792-e613-4889-91d5-3f89d0f32c1d-socket-dir\") pod \"aws-ebs-csi-driver-node-5xz8m\" (UID: \"8312a792-e613-4889-91d5-3f89d0f32c1d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m"
Apr 16 16:01:13.671159 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668512 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-etc-systemd\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d"
Apr 16 16:01:13.671159 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668580 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/80b825a5-15e5-402b-ae57-d2e283b0e8f8-hosts-file\") pod \"node-resolver-lrjg9\" (UID: \"80b825a5-15e5-402b-ae57-d2e283b0e8f8\") " pod="openshift-dns/node-resolver-lrjg9"
Apr 16 16:01:13.671159 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668594 2584 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kube-api-access-x7h5s\" (UniqueName: \"kubernetes.io/projected/8312a792-e613-4889-91d5-3f89d0f32c1d-kube-api-access-x7h5s\") pod \"aws-ebs-csi-driver-node-5xz8m\" (UID: \"8312a792-e613-4889-91d5-3f89d0f32c1d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m" Apr 16 16:01:13.671159 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668617 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/97141e56-55f0-4d10-ba67-fabe3d76d95d-system-cni-dir\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn" Apr 16 16:01:13.671159 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668628 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-hostroot\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.671159 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668655 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/82145526-4c6f-43c3-8850-84adf5e445e9-var-lib-openvswitch\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.671159 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668657 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-multus-conf-dir\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.671159 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668695 2584 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/82145526-4c6f-43c3-8850-84adf5e445e9-env-overrides\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.671159 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668711 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/07434d52-3b5d-4eaf-ba37-9f1b957e938a-host\") pod \"node-ca-nffwb\" (UID: \"07434d52-3b5d-4eaf-ba37-9f1b957e938a\") " pod="openshift-image-registry/node-ca-nffwb" Apr 16 16:01:13.671159 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668713 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/343ffde8-d18b-4f37-b099-b7e664a816d3-host-slash\") pod \"iptables-alerter-p96jv\" (UID: \"343ffde8-d18b-4f37-b099-b7e664a816d3\") " pod="openshift-network-operator/iptables-alerter-p96jv" Apr 16 16:01:13.671159 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668745 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/343ffde8-d18b-4f37-b099-b7e664a816d3-host-slash\") pod \"iptables-alerter-p96jv\" (UID: \"343ffde8-d18b-4f37-b099-b7e664a816d3\") " pod="openshift-network-operator/iptables-alerter-p96jv" Apr 16 16:01:13.671159 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668788 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-etc-sysconfig\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.671159 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668824 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/97141e56-55f0-4d10-ba67-fabe3d76d95d-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn" Apr 16 16:01:13.671159 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668886 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-hostroot\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.671159 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668885 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/97141e56-55f0-4d10-ba67-fabe3d76d95d-cni-binary-copy\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn" Apr 16 16:01:13.671159 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668932 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e6b59ea-f783-49ae-902d-b33f1ca6c234-multus-conf-dir\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.671159 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.668941 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4fec1617-e681-417d-aaff-5272a1fd3065-etc-sysconfig\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.671159 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.669043 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/97141e56-55f0-4d10-ba67-fabe3d76d95d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn" Apr 16 16:01:13.671772 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.669207 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/97141e56-55f0-4d10-ba67-fabe3d76d95d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn" Apr 16 16:01:13.671772 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.670727 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4fec1617-e681-417d-aaff-5272a1fd3065-etc-tuned\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.671772 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.670752 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4fec1617-e681-417d-aaff-5272a1fd3065-tmp\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.671772 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.671092 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/a7e8c4ea-3157-441e-873c-cf283ecb2c2a-agent-certs\") pod \"konnectivity-agent-mx7rd\" (UID: \"a7e8c4ea-3157-441e-873c-cf283ecb2c2a\") " pod="kube-system/konnectivity-agent-mx7rd" Apr 16 16:01:13.671772 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.671183 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/82145526-4c6f-43c3-8850-84adf5e445e9-ovn-node-metrics-cert\") pod \"ovnkube-node-65x5s\" (UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.681211 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:13.681187 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:01:13.681302 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:13.681224 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:01:13.681302 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:13.681238 2584 projected.go:194] Error preparing data for projected volume kube-api-access-wgm8h for pod openshift-network-diagnostics/network-check-target-xg7zh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:01:13.681410 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:13.681304 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h podName:a3db670c-92c1-42b9-94ed-04726fe1d07a nodeName:}" failed. No retries permitted until 2026-04-16 16:01:14.181285775 +0000 UTC m=+2.141477491 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wgm8h" (UniqueName: "kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h") pod "network-check-target-xg7zh" (UID: "a3db670c-92c1-42b9-94ed-04726fe1d07a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:01:13.683645 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.683620 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldk9f\" (UniqueName: \"kubernetes.io/projected/343ffde8-d18b-4f37-b099-b7e664a816d3-kube-api-access-ldk9f\") pod \"iptables-alerter-p96jv\" (UID: \"343ffde8-d18b-4f37-b099-b7e664a816d3\") " pod="openshift-network-operator/iptables-alerter-p96jv" Apr 16 16:01:13.685528 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.685508 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrdfc\" (UniqueName: \"kubernetes.io/projected/97141e56-55f0-4d10-ba67-fabe3d76d95d-kube-api-access-rrdfc\") pod \"multus-additional-cni-plugins-zhztn\" (UID: \"97141e56-55f0-4d10-ba67-fabe3d76d95d\") " pod="openshift-multus/multus-additional-cni-plugins-zhztn" Apr 16 16:01:13.691106 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.691080 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7h5s\" (UniqueName: \"kubernetes.io/projected/8312a792-e613-4889-91d5-3f89d0f32c1d-kube-api-access-x7h5s\") pod \"aws-ebs-csi-driver-node-5xz8m\" (UID: \"8312a792-e613-4889-91d5-3f89d0f32c1d\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m" Apr 16 16:01:13.691106 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.691092 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6ffr\" (UniqueName: \"kubernetes.io/projected/82145526-4c6f-43c3-8850-84adf5e445e9-kube-api-access-k6ffr\") pod \"ovnkube-node-65x5s\" 
(UID: \"82145526-4c6f-43c3-8850-84adf5e445e9\") " pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.691263 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.691080 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xf64\" (UniqueName: \"kubernetes.io/projected/4fec1617-e681-417d-aaff-5272a1fd3065-kube-api-access-8xf64\") pod \"tuned-kzd5d\" (UID: \"4fec1617-e681-417d-aaff-5272a1fd3065\") " pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.691463 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.691444 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nmxd\" (UniqueName: \"kubernetes.io/projected/8e6b59ea-f783-49ae-902d-b33f1ca6c234-kube-api-access-4nmxd\") pod \"multus-trctq\" (UID: \"8e6b59ea-f783-49ae-902d-b33f1ca6c234\") " pod="openshift-multus/multus-trctq" Apr 16 16:01:13.691564 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.691532 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6cmx6\" (UniqueName: \"kubernetes.io/projected/d280ea9a-de22-4d14-8870-0fbcbb459f8f-kube-api-access-6cmx6\") pod \"network-metrics-daemon-snrt9\" (UID: \"d280ea9a-de22-4d14-8870-0fbcbb459f8f\") " pod="openshift-multus/network-metrics-daemon-snrt9" Apr 16 16:01:13.692489 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.692469 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr7k9\" (UniqueName: \"kubernetes.io/projected/80b825a5-15e5-402b-ae57-d2e283b0e8f8-kube-api-access-rr7k9\") pod \"node-resolver-lrjg9\" (UID: \"80b825a5-15e5-402b-ae57-d2e283b0e8f8\") " pod="openshift-dns/node-resolver-lrjg9" Apr 16 16:01:13.693568 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.693550 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ldqfz\" (UniqueName: \"kubernetes.io/projected/07434d52-3b5d-4eaf-ba37-9f1b957e938a-kube-api-access-ldqfz\") pod 
\"node-ca-nffwb\" (UID: \"07434d52-3b5d-4eaf-ba37-9f1b957e938a\") " pod="openshift-image-registry/node-ca-nffwb" Apr 16 16:01:13.694842 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.694824 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-p96jv" Apr 16 16:01:13.761569 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:13.761536 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0fb0a319680077d647b66b0c3297064a.slice/crio-021b16bf7458487aaf552eeec43f9c7a0623cad96609dfc3908c7c211aef918a WatchSource:0}: Error finding container 021b16bf7458487aaf552eeec43f9c7a0623cad96609dfc3908c7c211aef918a: Status 404 returned error can't find the container with id 021b16bf7458487aaf552eeec43f9c7a0623cad96609dfc3908c7c211aef918a Apr 16 16:01:13.761912 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:13.761891 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb825418a7522e6122bfaed620c21321.slice/crio-bac1855a192b6a2f4797a4bb0456ebc26815913f35e8a5903be4cadb56307009 WatchSource:0}: Error finding container bac1855a192b6a2f4797a4bb0456ebc26815913f35e8a5903be4cadb56307009: Status 404 returned error can't find the container with id bac1855a192b6a2f4797a4bb0456ebc26815913f35e8a5903be4cadb56307009 Apr 16 16:01:13.766482 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.766463 2584 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:01:13.869544 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.869482 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-mx7rd" Apr 16 16:01:13.874726 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:13.874705 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7e8c4ea_3157_441e_873c_cf283ecb2c2a.slice/crio-cb7ac688e64050384889286e4c8464dcec722656a12512be4a7f5909d3af37c7 WatchSource:0}: Error finding container cb7ac688e64050384889286e4c8464dcec722656a12512be4a7f5909d3af37c7: Status 404 returned error can't find the container with id cb7ac688e64050384889286e4c8464dcec722656a12512be4a7f5909d3af37c7 Apr 16 16:01:13.883104 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.883089 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m" Apr 16 16:01:13.889069 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:13.889049 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8312a792_e613_4889_91d5_3f89d0f32c1d.slice/crio-e71039e93a0049236565bf90310f27069d647f4a2a254a4c4353837e9c0e54eb WatchSource:0}: Error finding container e71039e93a0049236565bf90310f27069d647f4a2a254a4c4353837e9c0e54eb: Status 404 returned error can't find the container with id e71039e93a0049236565bf90310f27069d647f4a2a254a4c4353837e9c0e54eb Apr 16 16:01:13.899238 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.899221 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-nffwb" Apr 16 16:01:13.902738 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.902713 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-zhztn" Apr 16 16:01:13.905148 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:13.905128 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07434d52_3b5d_4eaf_ba37_9f1b957e938a.slice/crio-351bf1581b02cf0d4dbf0dbc6b96569d1688fc5847709353ea605b3882f88247 WatchSource:0}: Error finding container 351bf1581b02cf0d4dbf0dbc6b96569d1688fc5847709353ea605b3882f88247: Status 404 returned error can't find the container with id 351bf1581b02cf0d4dbf0dbc6b96569d1688fc5847709353ea605b3882f88247 Apr 16 16:01:13.910361 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:13.910341 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97141e56_55f0_4d10_ba67_fabe3d76d95d.slice/crio-800760d06def1bc20d725a6fcd9a734c67af72fee95b8b241eef5d0a34a09ed2 WatchSource:0}: Error finding container 800760d06def1bc20d725a6fcd9a734c67af72fee95b8b241eef5d0a34a09ed2: Status 404 returned error can't find the container with id 800760d06def1bc20d725a6fcd9a734c67af72fee95b8b241eef5d0a34a09ed2 Apr 16 16:01:13.916627 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.916608 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-trctq" Apr 16 16:01:13.922143 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:13.922125 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e6b59ea_f783_49ae_902d_b33f1ca6c234.slice/crio-ef467c491f49c95e039ae9b9da39824fe14d3135991a6a7920634502ad7f1a27 WatchSource:0}: Error finding container ef467c491f49c95e039ae9b9da39824fe14d3135991a6a7920634502ad7f1a27: Status 404 returned error can't find the container with id ef467c491f49c95e039ae9b9da39824fe14d3135991a6a7920634502ad7f1a27 Apr 16 16:01:13.932094 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.932078 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:01:13.938190 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:13.938172 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod82145526_4c6f_43c3_8850_84adf5e445e9.slice/crio-c70c6d6e3662051182515092c3a4647a9e66e9915a4b8a0c8125afeff01d37ac WatchSource:0}: Error finding container c70c6d6e3662051182515092c3a4647a9e66e9915a4b8a0c8125afeff01d37ac: Status 404 returned error can't find the container with id c70c6d6e3662051182515092c3a4647a9e66e9915a4b8a0c8125afeff01d37ac Apr 16 16:01:13.948340 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.948325 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" Apr 16 16:01:13.953369 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:13.953352 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fec1617_e681_417d_aaff_5272a1fd3065.slice/crio-1ba01a0d8b7938ce0bf4af3924f19369da51d9229123dc8ede88166857b63f27 WatchSource:0}: Error finding container 1ba01a0d8b7938ce0bf4af3924f19369da51d9229123dc8ede88166857b63f27: Status 404 returned error can't find the container with id 1ba01a0d8b7938ce0bf4af3924f19369da51d9229123dc8ede88166857b63f27 Apr 16 16:01:13.979001 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:13.978982 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-lrjg9" Apr 16 16:01:13.984462 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:13.984436 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80b825a5_15e5_402b_ae57_d2e283b0e8f8.slice/crio-61910db7fe7a483e6680522b65f3e21fcb97f16529cf143b2c8eca5cb3021141 WatchSource:0}: Error finding container 61910db7fe7a483e6680522b65f3e21fcb97f16529cf143b2c8eca5cb3021141: Status 404 returned error can't find the container with id 61910db7fe7a483e6680522b65f3e21fcb97f16529cf143b2c8eca5cb3021141 Apr 16 16:01:14.062870 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:01:14.062843 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod343ffde8_d18b_4f37_b099_b7e664a816d3.slice/crio-79954951998a231d293a645ff204c28b171b70a5858cd3d9f54cc52aef91e98f WatchSource:0}: Error finding container 79954951998a231d293a645ff204c28b171b70a5858cd3d9f54cc52aef91e98f: Status 404 returned error can't find the container with id 79954951998a231d293a645ff204c28b171b70a5858cd3d9f54cc52aef91e98f Apr 16 16:01:14.172269 ip-10-0-131-24 
kubenswrapper[2584]: I0416 16:01:14.172208 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs\") pod \"network-metrics-daemon-snrt9\" (UID: \"d280ea9a-de22-4d14-8870-0fbcbb459f8f\") " pod="openshift-multus/network-metrics-daemon-snrt9" Apr 16 16:01:14.172359 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:14.172341 2584 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:01:14.172406 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:14.172398 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs podName:d280ea9a-de22-4d14-8870-0fbcbb459f8f nodeName:}" failed. No retries permitted until 2026-04-16 16:01:15.172382699 +0000 UTC m=+3.132574401 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs") pod "network-metrics-daemon-snrt9" (UID: "d280ea9a-de22-4d14-8870-0fbcbb459f8f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:01:14.272781 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:14.272755 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgm8h\" (UniqueName: \"kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h\") pod \"network-check-target-xg7zh\" (UID: \"a3db670c-92c1-42b9-94ed-04726fe1d07a\") " pod="openshift-network-diagnostics/network-check-target-xg7zh" Apr 16 16:01:14.272910 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:14.272895 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:01:14.272981 
ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:14.272912 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:01:14.272981 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:14.272922 2584 projected.go:194] Error preparing data for projected volume kube-api-access-wgm8h for pod openshift-network-diagnostics/network-check-target-xg7zh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:01:14.273090 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:14.272991 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h podName:a3db670c-92c1-42b9-94ed-04726fe1d07a nodeName:}" failed. No retries permitted until 2026-04-16 16:01:15.272972435 +0000 UTC m=+3.233164152 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-wgm8h" (UniqueName: "kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h") pod "network-check-target-xg7zh" (UID: "a3db670c-92c1-42b9-94ed-04726fe1d07a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:01:14.601109 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:14.600996 2584 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:01:14.607887 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:14.607855 2584 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 15:56:13 +0000 UTC" deadline="2027-12-26 19:58:15.200868526 +0000 UTC" Apr 16 16:01:14.607887 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:14.607887 2584 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14859h57m0.592985815s" Apr 16 16:01:14.683233 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:14.683210 2584 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 16:01:14.703191 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:14.703137 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lrjg9" event={"ID":"80b825a5-15e5-402b-ae57-d2e283b0e8f8","Type":"ContainerStarted","Data":"61910db7fe7a483e6680522b65f3e21fcb97f16529cf143b2c8eca5cb3021141"} Apr 16 16:01:14.706229 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:14.706186 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" event={"ID":"4fec1617-e681-417d-aaff-5272a1fd3065","Type":"ContainerStarted","Data":"1ba01a0d8b7938ce0bf4af3924f19369da51d9229123dc8ede88166857b63f27"} Apr 16 
16:01:14.708299 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:14.708275 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" event={"ID":"82145526-4c6f-43c3-8850-84adf5e445e9","Type":"ContainerStarted","Data":"c70c6d6e3662051182515092c3a4647a9e66e9915a4b8a0c8125afeff01d37ac"}
Apr 16 16:01:14.711638 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:14.711614 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m" event={"ID":"8312a792-e613-4889-91d5-3f89d0f32c1d","Type":"ContainerStarted","Data":"e71039e93a0049236565bf90310f27069d647f4a2a254a4c4353837e9c0e54eb"}
Apr 16 16:01:14.714331 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:14.714307 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mx7rd" event={"ID":"a7e8c4ea-3157-441e-873c-cf283ecb2c2a","Type":"ContainerStarted","Data":"cb7ac688e64050384889286e4c8464dcec722656a12512be4a7f5909d3af37c7"}
Apr 16 16:01:14.718062 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:14.718037 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-24.ec2.internal" event={"ID":"bb825418a7522e6122bfaed620c21321","Type":"ContainerStarted","Data":"bac1855a192b6a2f4797a4bb0456ebc26815913f35e8a5903be4cadb56307009"}
Apr 16 16:01:14.721878 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:14.721848 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-24.ec2.internal" event={"ID":"0fb0a319680077d647b66b0c3297064a","Type":"ContainerStarted","Data":"021b16bf7458487aaf552eeec43f9c7a0623cad96609dfc3908c7c211aef918a"}
Apr 16 16:01:14.732046 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:14.726780 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-p96jv" event={"ID":"343ffde8-d18b-4f37-b099-b7e664a816d3","Type":"ContainerStarted","Data":"79954951998a231d293a645ff204c28b171b70a5858cd3d9f54cc52aef91e98f"}
Apr 16 16:01:14.732046 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:14.728612 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-trctq" event={"ID":"8e6b59ea-f783-49ae-902d-b33f1ca6c234","Type":"ContainerStarted","Data":"ef467c491f49c95e039ae9b9da39824fe14d3135991a6a7920634502ad7f1a27"}
Apr 16 16:01:14.732046 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:14.731215 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhztn" event={"ID":"97141e56-55f0-4d10-ba67-fabe3d76d95d","Type":"ContainerStarted","Data":"800760d06def1bc20d725a6fcd9a734c67af72fee95b8b241eef5d0a34a09ed2"}
Apr 16 16:01:14.732846 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:14.732824 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nffwb" event={"ID":"07434d52-3b5d-4eaf-ba37-9f1b957e938a","Type":"ContainerStarted","Data":"351bf1581b02cf0d4dbf0dbc6b96569d1688fc5847709353ea605b3882f88247"}
Apr 16 16:01:15.180786 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:15.180747 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs\") pod \"network-metrics-daemon-snrt9\" (UID: \"d280ea9a-de22-4d14-8870-0fbcbb459f8f\") " pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:15.180995 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:15.180905 2584 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:01:15.180995 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:15.180988 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs podName:d280ea9a-de22-4d14-8870-0fbcbb459f8f nodeName:}" failed. No retries permitted until 2026-04-16 16:01:17.180967811 +0000 UTC m=+5.141159530 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs") pod "network-metrics-daemon-snrt9" (UID: "d280ea9a-de22-4d14-8870-0fbcbb459f8f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:01:15.282603 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:15.281936 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgm8h\" (UniqueName: \"kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h\") pod \"network-check-target-xg7zh\" (UID: \"a3db670c-92c1-42b9-94ed-04726fe1d07a\") " pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:15.282603 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:15.282141 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:01:15.282603 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:15.282162 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:01:15.282603 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:15.282176 2584 projected.go:194] Error preparing data for projected volume kube-api-access-wgm8h for pod openshift-network-diagnostics/network-check-target-xg7zh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:01:15.282603 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:15.282238 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h podName:a3db670c-92c1-42b9-94ed-04726fe1d07a nodeName:}" failed. No retries permitted until 2026-04-16 16:01:17.282216805 +0000 UTC m=+5.242408522 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-wgm8h" (UniqueName: "kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h") pod "network-check-target-xg7zh" (UID: "a3db670c-92c1-42b9-94ed-04726fe1d07a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:01:15.608611 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:15.608516 2584 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 15:56:13 +0000 UTC" deadline="2027-10-17 22:31:36.854313468 +0000 UTC"
Apr 16 16:01:15.608611 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:15.608554 2584 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13182h30m21.245763819s"
Apr 16 16:01:15.664165 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:15.664137 2584 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:01:15.693891 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:15.693867 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:15.694037 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:15.694012 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f"
Apr 16 16:01:15.694417 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:15.694393 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:15.694511 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:15.694489 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a"
Apr 16 16:01:16.542711 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:16.542304 2584 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 16:01:17.197920 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:17.197301 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs\") pod \"network-metrics-daemon-snrt9\" (UID: \"d280ea9a-de22-4d14-8870-0fbcbb459f8f\") " pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:17.197920 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:17.197472 2584 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:01:17.197920 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:17.197539 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs podName:d280ea9a-de22-4d14-8870-0fbcbb459f8f nodeName:}" failed. No retries permitted until 2026-04-16 16:01:21.197517415 +0000 UTC m=+9.157709150 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs") pod "network-metrics-daemon-snrt9" (UID: "d280ea9a-de22-4d14-8870-0fbcbb459f8f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:01:17.299111 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:17.298462 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgm8h\" (UniqueName: \"kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h\") pod \"network-check-target-xg7zh\" (UID: \"a3db670c-92c1-42b9-94ed-04726fe1d07a\") " pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:17.299111 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:17.298649 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:01:17.299111 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:17.298668 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:01:17.299111 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:17.298715 2584 projected.go:194] Error preparing data for projected volume kube-api-access-wgm8h for pod openshift-network-diagnostics/network-check-target-xg7zh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:01:17.299111 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:17.298771 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h podName:a3db670c-92c1-42b9-94ed-04726fe1d07a nodeName:}" failed. No retries permitted until 2026-04-16 16:01:21.298753187 +0000 UTC m=+9.258944898 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-wgm8h" (UniqueName: "kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h") pod "network-check-target-xg7zh" (UID: "a3db670c-92c1-42b9-94ed-04726fe1d07a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:01:17.694645 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:17.694527 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:17.694790 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:17.694647 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a"
Apr 16 16:01:17.695059 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:17.695017 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:17.695139 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:17.695106 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f"
Apr 16 16:01:19.551146 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:19.551035 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-d48fn"]
Apr 16 16:01:19.553615 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:19.553590 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:19.553717 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:19.553669 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d48fn" podUID="01732c3c-7221-4102-8699-6e097f947672"
Apr 16 16:01:19.614080 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:19.614049 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/01732c3c-7221-4102-8699-6e097f947672-kubelet-config\") pod \"global-pull-secret-syncer-d48fn\" (UID: \"01732c3c-7221-4102-8699-6e097f947672\") " pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:19.614199 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:19.614107 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/01732c3c-7221-4102-8699-6e097f947672-dbus\") pod \"global-pull-secret-syncer-d48fn\" (UID: \"01732c3c-7221-4102-8699-6e097f947672\") " pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:19.614199 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:19.614134 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret\") pod \"global-pull-secret-syncer-d48fn\" (UID: \"01732c3c-7221-4102-8699-6e097f947672\") " pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:19.694360 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:19.694314 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:19.694506 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:19.694459 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f"
Apr 16 16:01:19.694899 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:19.694876 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:19.695016 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:19.694997 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a"
Apr 16 16:01:19.715579 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:19.715041 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/01732c3c-7221-4102-8699-6e097f947672-kubelet-config\") pod \"global-pull-secret-syncer-d48fn\" (UID: \"01732c3c-7221-4102-8699-6e097f947672\") " pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:19.715579 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:19.715094 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/01732c3c-7221-4102-8699-6e097f947672-dbus\") pod \"global-pull-secret-syncer-d48fn\" (UID: \"01732c3c-7221-4102-8699-6e097f947672\") " pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:19.715579 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:19.715121 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret\") pod \"global-pull-secret-syncer-d48fn\" (UID: \"01732c3c-7221-4102-8699-6e097f947672\") " pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:19.715579 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:19.715230 2584 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:01:19.715579 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:19.715292 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret podName:01732c3c-7221-4102-8699-6e097f947672 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:20.215273375 +0000 UTC m=+8.175465082 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret") pod "global-pull-secret-syncer-d48fn" (UID: "01732c3c-7221-4102-8699-6e097f947672") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:01:19.715579 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:19.715488 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/01732c3c-7221-4102-8699-6e097f947672-kubelet-config\") pod \"global-pull-secret-syncer-d48fn\" (UID: \"01732c3c-7221-4102-8699-6e097f947672\") " pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:19.715579 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:19.715550 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/01732c3c-7221-4102-8699-6e097f947672-dbus\") pod \"global-pull-secret-syncer-d48fn\" (UID: \"01732c3c-7221-4102-8699-6e097f947672\") " pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:20.219461 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:20.219424 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret\") pod \"global-pull-secret-syncer-d48fn\" (UID: \"01732c3c-7221-4102-8699-6e097f947672\") " pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:20.219650 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:20.219592 2584 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:01:20.219650 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:20.219649 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret podName:01732c3c-7221-4102-8699-6e097f947672 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:21.219630642 +0000 UTC m=+9.179822349 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret") pod "global-pull-secret-syncer-d48fn" (UID: "01732c3c-7221-4102-8699-6e097f947672") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:01:20.694308 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:20.693759 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:20.694308 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:20.693895 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d48fn" podUID="01732c3c-7221-4102-8699-6e097f947672"
Apr 16 16:01:21.226639 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:21.226595 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret\") pod \"global-pull-secret-syncer-d48fn\" (UID: \"01732c3c-7221-4102-8699-6e097f947672\") " pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:21.226812 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:21.226665 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs\") pod \"network-metrics-daemon-snrt9\" (UID: \"d280ea9a-de22-4d14-8870-0fbcbb459f8f\") " pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:21.226812 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:21.226768 2584 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:01:21.226923 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:21.226832 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs podName:d280ea9a-de22-4d14-8870-0fbcbb459f8f nodeName:}" failed. No retries permitted until 2026-04-16 16:01:29.226811758 +0000 UTC m=+17.187003462 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs") pod "network-metrics-daemon-snrt9" (UID: "d280ea9a-de22-4d14-8870-0fbcbb459f8f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:01:21.227042 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:21.226980 2584 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:01:21.227042 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:21.227022 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret podName:01732c3c-7221-4102-8699-6e097f947672 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:23.227009441 +0000 UTC m=+11.187201218 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret") pod "global-pull-secret-syncer-d48fn" (UID: "01732c3c-7221-4102-8699-6e097f947672") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:01:21.327610 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:21.327579 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgm8h\" (UniqueName: \"kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h\") pod \"network-check-target-xg7zh\" (UID: \"a3db670c-92c1-42b9-94ed-04726fe1d07a\") " pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:21.327791 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:21.327767 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:01:21.327791 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:21.327787 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:01:21.327906 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:21.327801 2584 projected.go:194] Error preparing data for projected volume kube-api-access-wgm8h for pod openshift-network-diagnostics/network-check-target-xg7zh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:01:21.327906 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:21.327855 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h podName:a3db670c-92c1-42b9-94ed-04726fe1d07a nodeName:}" failed. No retries permitted until 2026-04-16 16:01:29.327837256 +0000 UTC m=+17.288028965 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-wgm8h" (UniqueName: "kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h") pod "network-check-target-xg7zh" (UID: "a3db670c-92c1-42b9-94ed-04726fe1d07a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:01:21.694144 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:21.694070 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:21.694302 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:21.694200 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f"
Apr 16 16:01:21.694542 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:21.694526 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:21.694897 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:21.694600 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a"
Apr 16 16:01:22.697984 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:22.697723 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:22.698401 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:22.698089 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d48fn" podUID="01732c3c-7221-4102-8699-6e097f947672"
Apr 16 16:01:23.242512 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:23.242469 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret\") pod \"global-pull-secret-syncer-d48fn\" (UID: \"01732c3c-7221-4102-8699-6e097f947672\") " pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:23.242704 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:23.242660 2584 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:01:23.242766 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:23.242717 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret podName:01732c3c-7221-4102-8699-6e097f947672 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:27.242700347 +0000 UTC m=+15.202892063 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret") pod "global-pull-secret-syncer-d48fn" (UID: "01732c3c-7221-4102-8699-6e097f947672") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:01:23.694100 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:23.694017 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:23.694100 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:23.694038 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:23.694286 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:23.694162 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f"
Apr 16 16:01:23.694286 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:23.694258 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a"
Apr 16 16:01:24.694085 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:24.694052 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:24.694486 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:24.694180 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d48fn" podUID="01732c3c-7221-4102-8699-6e097f947672"
Apr 16 16:01:25.693755 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:25.693719 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:25.693755 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:25.693755 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:25.693979 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:25.693834 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f"
Apr 16 16:01:25.693979 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:25.693964 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a"
Apr 16 16:01:26.693931 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:26.693897 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:26.694268 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:26.694041 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d48fn" podUID="01732c3c-7221-4102-8699-6e097f947672"
Apr 16 16:01:27.273846 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:27.273807 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret\") pod \"global-pull-secret-syncer-d48fn\" (UID: \"01732c3c-7221-4102-8699-6e097f947672\") " pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:27.274019 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:27.273929 2584 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:01:27.274072 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:27.274021 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret podName:01732c3c-7221-4102-8699-6e097f947672 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:35.274006073 +0000 UTC m=+23.234197779 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret") pod "global-pull-secret-syncer-d48fn" (UID: "01732c3c-7221-4102-8699-6e097f947672") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:01:27.693889 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:27.693805 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:27.694051 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:27.693819 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:27.694051 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:27.693913 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a"
Apr 16 16:01:27.694051 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:27.694037 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f"
Apr 16 16:01:28.694452 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:28.694398 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:28.694888 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:28.694557 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d48fn" podUID="01732c3c-7221-4102-8699-6e097f947672"
Apr 16 16:01:29.289283 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:29.289242 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs\") pod \"network-metrics-daemon-snrt9\" (UID: \"d280ea9a-de22-4d14-8870-0fbcbb459f8f\") " pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:29.289445 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:29.289378 2584 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:01:29.289508 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:29.289445 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs podName:d280ea9a-de22-4d14-8870-0fbcbb459f8f nodeName:}" failed. No retries permitted until 2026-04-16 16:01:45.289425715 +0000 UTC m=+33.249617440 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs") pod "network-metrics-daemon-snrt9" (UID: "d280ea9a-de22-4d14-8870-0fbcbb459f8f") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 16:01:29.390256 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:29.390226 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgm8h\" (UniqueName: \"kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h\") pod \"network-check-target-xg7zh\" (UID: \"a3db670c-92c1-42b9-94ed-04726fe1d07a\") " pod="openshift-network-diagnostics/network-check-target-xg7zh" Apr 16 16:01:29.390464 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:29.390432 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 16:01:29.390464 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:29.390462 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 16:01:29.390631 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:29.390477 2584 projected.go:194] Error preparing data for projected volume kube-api-access-wgm8h for pod openshift-network-diagnostics/network-check-target-xg7zh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:01:29.390631 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:29.390545 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h podName:a3db670c-92c1-42b9-94ed-04726fe1d07a nodeName:}" failed. 
No retries permitted until 2026-04-16 16:01:45.390523372 +0000 UTC m=+33.350715079 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-wgm8h" (UniqueName: "kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h") pod "network-check-target-xg7zh" (UID: "a3db670c-92c1-42b9-94ed-04726fe1d07a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 16:01:29.694609 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:29.694537 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9" Apr 16 16:01:29.695053 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:29.694544 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh" Apr 16 16:01:29.695053 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:29.694678 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f" Apr 16 16:01:29.695053 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:29.694753 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a" Apr 16 16:01:30.694678 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:30.694641 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn" Apr 16 16:01:30.695212 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:30.694775 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d48fn" podUID="01732c3c-7221-4102-8699-6e097f947672" Apr 16 16:01:31.694136 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:31.694111 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9" Apr 16 16:01:31.694219 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:31.694112 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh" Apr 16 16:01:31.694289 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:31.694203 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f" Apr 16 16:01:31.694334 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:31.694292 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a" Apr 16 16:01:32.701030 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:32.700771 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn" Apr 16 16:01:32.701458 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:32.701434 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-d48fn" podUID="01732c3c-7221-4102-8699-6e097f947672" Apr 16 16:01:32.761803 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:32.761765 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" event={"ID":"4fec1617-e681-417d-aaff-5272a1fd3065","Type":"ContainerStarted","Data":"66e436ab705559fac3bfaa30bcba0c5d84fd99144e1c66b53ab04fff3a86a181"} Apr 16 16:01:32.764740 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:32.764717 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/ovn-acl-logging/0.log" Apr 16 16:01:32.765114 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:32.765095 2584 generic.go:358] "Generic (PLEG): container finished" podID="82145526-4c6f-43c3-8850-84adf5e445e9" containerID="d0988d43ba5488f585d82226db9d1eeea1a4a76aeebd8c5efa0bec6dd72d9f56" exitCode=1 Apr 16 16:01:32.765189 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:32.765166 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" event={"ID":"82145526-4c6f-43c3-8850-84adf5e445e9","Type":"ContainerStarted","Data":"c3c184642fe52d2d9068ff8e03b86dd8b38ada1b375c9b36c35dcd49ac3156b0"} Apr 16 16:01:32.765237 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:32.765203 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" event={"ID":"82145526-4c6f-43c3-8850-84adf5e445e9","Type":"ContainerStarted","Data":"6a459b83b27e44cfa29606043cd79bce95bf8f634709ced630dc6368554b7556"} Apr 16 16:01:32.765237 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:32.765217 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" event={"ID":"82145526-4c6f-43c3-8850-84adf5e445e9","Type":"ContainerStarted","Data":"63ea9d30b1d6ff0dfacd42f33cefa2b03c480e1048a43362b26e10b1f9e486fd"} Apr 16 
16:01:32.765237 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:32.765231 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" event={"ID":"82145526-4c6f-43c3-8850-84adf5e445e9","Type":"ContainerStarted","Data":"9e3f49741d339944fbd577d76818d22016d40f6cc15cfd60678aecf3510d602a"} Apr 16 16:01:32.765363 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:32.765242 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" event={"ID":"82145526-4c6f-43c3-8850-84adf5e445e9","Type":"ContainerDied","Data":"d0988d43ba5488f585d82226db9d1eeea1a4a76aeebd8c5efa0bec6dd72d9f56"} Apr 16 16:01:32.765363 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:32.765256 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" event={"ID":"82145526-4c6f-43c3-8850-84adf5e445e9","Type":"ContainerStarted","Data":"3d0ef1c7f327092ea10cabf0696ac325de868e3a4d352caf3570b9a7abc7efdf"} Apr 16 16:01:32.766646 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:32.766626 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-24.ec2.internal" event={"ID":"bb825418a7522e6122bfaed620c21321","Type":"ContainerStarted","Data":"86a109536266bde05ce8df08463056d69ca3ba76d69801859fe2bbab5f68bd94"} Apr 16 16:01:32.767999 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:32.767978 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-trctq" event={"ID":"8e6b59ea-f783-49ae-902d-b33f1ca6c234","Type":"ContainerStarted","Data":"8db446ca83177b68c102cf16bed38148bcbdb9d2674db0d6278694ad920c46b4"} Apr 16 16:01:32.779606 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:32.779551 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-kzd5d" podStartSLOduration=3.059148786 podStartE2EDuration="20.77954058s" podCreationTimestamp="2026-04-16 16:01:12 
+0000 UTC" firstStartedPulling="2026-04-16 16:01:13.954611714 +0000 UTC m=+1.914803416" lastFinishedPulling="2026-04-16 16:01:31.675003507 +0000 UTC m=+19.635195210" observedRunningTime="2026-04-16 16:01:32.779324033 +0000 UTC m=+20.739515756" watchObservedRunningTime="2026-04-16 16:01:32.77954058 +0000 UTC m=+20.739732305" Apr 16 16:01:32.795663 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:32.795621 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-trctq" podStartSLOduration=3.042740693 podStartE2EDuration="20.795610859s" podCreationTimestamp="2026-04-16 16:01:12 +0000 UTC" firstStartedPulling="2026-04-16 16:01:13.923677292 +0000 UTC m=+1.883868998" lastFinishedPulling="2026-04-16 16:01:31.676547461 +0000 UTC m=+19.636739164" observedRunningTime="2026-04-16 16:01:32.795106121 +0000 UTC m=+20.755297846" watchObservedRunningTime="2026-04-16 16:01:32.795610859 +0000 UTC m=+20.755802583" Apr 16 16:01:32.807587 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:32.807554 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-131-24.ec2.internal" podStartSLOduration=19.807543307 podStartE2EDuration="19.807543307s" podCreationTimestamp="2026-04-16 16:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:01:32.807348012 +0000 UTC m=+20.767539735" watchObservedRunningTime="2026-04-16 16:01:32.807543307 +0000 UTC m=+20.767735031" Apr 16 16:01:33.694490 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:33.694429 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh" Apr 16 16:01:33.694618 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:33.694429 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9" Apr 16 16:01:33.694618 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:33.694517 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a" Apr 16 16:01:33.694618 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:33.694598 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f" Apr 16 16:01:33.773792 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:33.773758 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-p96jv" event={"ID":"343ffde8-d18b-4f37-b099-b7e664a816d3","Type":"ContainerStarted","Data":"c5e3bf6878657bfb831edf5e76d0f71699b18f8e6bc8525189f1df04cf817724"} Apr 16 16:01:34.694362 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:34.694335 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn" Apr 16 16:01:34.694533 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:34.694435 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-d48fn" podUID="01732c3c-7221-4102-8699-6e097f947672" Apr 16 16:01:35.334563 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:35.334508 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret\") pod \"global-pull-secret-syncer-d48fn\" (UID: \"01732c3c-7221-4102-8699-6e097f947672\") " pod="kube-system/global-pull-secret-syncer-d48fn" Apr 16 16:01:35.335057 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:35.334620 2584 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 16:01:35.335057 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:35.334665 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret podName:01732c3c-7221-4102-8699-6e097f947672 nodeName:}" failed. No retries permitted until 2026-04-16 16:01:51.334652965 +0000 UTC m=+39.294844667 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret") pod "global-pull-secret-syncer-d48fn" (UID: "01732c3c-7221-4102-8699-6e097f947672") : object "kube-system"/"original-pull-secret" not registered Apr 16 16:01:35.694404 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:35.694341 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh" Apr 16 16:01:35.694404 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:35.694341 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9" Apr 16 16:01:35.694644 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:35.694449 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a" Apr 16 16:01:35.694644 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:35.694571 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f" Apr 16 16:01:35.778536 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:35.778511 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-lrjg9" event={"ID":"80b825a5-15e5-402b-ae57-d2e283b0e8f8","Type":"ContainerStarted","Data":"c0ea5ac2b5318331b1ce858912e09eba1616695b32c83e6160eab71d65718f3f"} Apr 16 16:01:35.781063 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:35.781044 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/ovn-acl-logging/0.log" Apr 16 16:01:35.781454 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:35.781423 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" event={"ID":"82145526-4c6f-43c3-8850-84adf5e445e9","Type":"ContainerStarted","Data":"ede7a03ef9dc69a208ede8e907b2a6281ae3514091e6796ddba85a55d79611ee"} Apr 16 16:01:35.782775 ip-10-0-131-24 
kubenswrapper[2584]: I0416 16:01:35.782750 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m" event={"ID":"8312a792-e613-4889-91d5-3f89d0f32c1d","Type":"ContainerStarted","Data":"addb9e8b5aafa68a5ed9e7e3f19b1d771a4205bb1e85902a01919336c42d8a98"} Apr 16 16:01:35.784010 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:35.783991 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-mx7rd" event={"ID":"a7e8c4ea-3157-441e-873c-cf283ecb2c2a","Type":"ContainerStarted","Data":"ca7c9eab91e0f38a9159e12a0b25f9b18c063c42c8b372b3bc7b253fd0747ac0"} Apr 16 16:01:35.785218 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:35.785197 2584 generic.go:358] "Generic (PLEG): container finished" podID="0fb0a319680077d647b66b0c3297064a" containerID="beaf16aaf84a9f094be1ced120eafc0a5027c73bce73c1be2c1efa6899459ed5" exitCode=0 Apr 16 16:01:35.785311 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:35.785270 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-24.ec2.internal" event={"ID":"0fb0a319680077d647b66b0c3297064a","Type":"ContainerDied","Data":"beaf16aaf84a9f094be1ced120eafc0a5027c73bce73c1be2c1efa6899459ed5"} Apr 16 16:01:35.786455 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:35.786421 2584 generic.go:358] "Generic (PLEG): container finished" podID="97141e56-55f0-4d10-ba67-fabe3d76d95d" containerID="f49547808ac4f1387c8f4cb2bf830e6c196bd6eb1d84fd79b125a3de60d7494e" exitCode=0 Apr 16 16:01:35.786538 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:35.786499 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhztn" event={"ID":"97141e56-55f0-4d10-ba67-fabe3d76d95d","Type":"ContainerDied","Data":"f49547808ac4f1387c8f4cb2bf830e6c196bd6eb1d84fd79b125a3de60d7494e"} Apr 16 16:01:35.789544 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:35.789523 2584 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-nffwb" event={"ID":"07434d52-3b5d-4eaf-ba37-9f1b957e938a","Type":"ContainerStarted","Data":"81d265b59e77f2236e33193d918563da7b4a77f05176518f62516032672607d8"} Apr 16 16:01:35.793584 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:35.793549 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-lrjg9" podStartSLOduration=11.523454483 podStartE2EDuration="23.793538046s" podCreationTimestamp="2026-04-16 16:01:12 +0000 UTC" firstStartedPulling="2026-04-16 16:01:13.985726947 +0000 UTC m=+1.945918649" lastFinishedPulling="2026-04-16 16:01:26.25581051 +0000 UTC m=+14.216002212" observedRunningTime="2026-04-16 16:01:35.793084612 +0000 UTC m=+23.753276335" watchObservedRunningTime="2026-04-16 16:01:35.793538046 +0000 UTC m=+23.753729770" Apr 16 16:01:35.793710 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:35.793692 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-p96jv" podStartSLOduration=6.182934182 podStartE2EDuration="23.79368833s" podCreationTimestamp="2026-04-16 16:01:12 +0000 UTC" firstStartedPulling="2026-04-16 16:01:14.064291825 +0000 UTC m=+2.024483527" lastFinishedPulling="2026-04-16 16:01:31.675045969 +0000 UTC m=+19.635237675" observedRunningTime="2026-04-16 16:01:33.788025325 +0000 UTC m=+21.748217050" watchObservedRunningTime="2026-04-16 16:01:35.79368833 +0000 UTC m=+23.753880054" Apr 16 16:01:35.824157 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:35.824124 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-mx7rd" podStartSLOduration=6.026188793 podStartE2EDuration="23.824113628s" podCreationTimestamp="2026-04-16 16:01:12 +0000 UTC" firstStartedPulling="2026-04-16 16:01:13.876183512 +0000 UTC m=+1.836375215" lastFinishedPulling="2026-04-16 16:01:31.674108349 +0000 UTC m=+19.634300050" 
observedRunningTime="2026-04-16 16:01:35.823364437 +0000 UTC m=+23.783556160" watchObservedRunningTime="2026-04-16 16:01:35.824113628 +0000 UTC m=+23.784305352" Apr 16 16:01:35.837402 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:35.837370 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-nffwb" podStartSLOduration=11.252375485 podStartE2EDuration="23.837361503s" podCreationTimestamp="2026-04-16 16:01:12 +0000 UTC" firstStartedPulling="2026-04-16 16:01:13.906701837 +0000 UTC m=+1.866893539" lastFinishedPulling="2026-04-16 16:01:26.491687853 +0000 UTC m=+14.451879557" observedRunningTime="2026-04-16 16:01:35.837072937 +0000 UTC m=+23.797264661" watchObservedRunningTime="2026-04-16 16:01:35.837361503 +0000 UTC m=+23.797553226" Apr 16 16:01:36.291968 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:36.291770 2584 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 16:01:36.638130 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:36.637997 2584 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T16:01:36.291970395Z","UUID":"658cc5c6-34d0-4274-a48d-b5c818af42b4","Handler":null,"Name":"","Endpoint":""} Apr 16 16:01:36.639784 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:36.639764 2584 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 16:01:36.639784 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:36.639794 2584 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 16:01:36.693999 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:36.693971 2584 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn" Apr 16 16:01:36.694148 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:36.694117 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d48fn" podUID="01732c3c-7221-4102-8699-6e097f947672" Apr 16 16:01:36.792991 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:36.792943 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m" event={"ID":"8312a792-e613-4889-91d5-3f89d0f32c1d","Type":"ContainerStarted","Data":"c74eec3c2892cf4f3e21d54b2eb423030167eae0ac2e16243091b8d865092981"} Apr 16 16:01:36.794697 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:36.794616 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-24.ec2.internal" event={"ID":"0fb0a319680077d647b66b0c3297064a","Type":"ContainerStarted","Data":"1f5c199f675b2c2b310003bc8d6f6e742cf79dc2840bfaa8b0a0c0c3ce921fe1"} Apr 16 16:01:36.815769 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:36.815726 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-131-24.ec2.internal" podStartSLOduration=23.815716105 podStartE2EDuration="23.815716105s" podCreationTimestamp="2026-04-16 16:01:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:01:36.815360277 +0000 UTC m=+24.775552000" watchObservedRunningTime="2026-04-16 16:01:36.815716105 +0000 UTC m=+24.775907828" Apr 16 16:01:37.694132 ip-10-0-131-24 kubenswrapper[2584]: 
I0416 16:01:37.694101 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:37.694600 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:37.694101 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:37.694600 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:37.694230 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f"
Apr 16 16:01:37.694600 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:37.694271 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a"
Apr 16 16:01:37.800470 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:37.800444 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/ovn-acl-logging/0.log"
Apr 16 16:01:37.800858 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:37.800826 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" event={"ID":"82145526-4c6f-43c3-8850-84adf5e445e9","Type":"ContainerStarted","Data":"226b8713236bf194a42a9843303da4d5578a0bfc4966b52e0321c1d896f96d8a"}
Apr 16 16:01:37.801470 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:37.801438 2584 scope.go:117] "RemoveContainer" containerID="d0988d43ba5488f585d82226db9d1eeea1a4a76aeebd8c5efa0bec6dd72d9f56"
Apr 16 16:01:38.303445 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:38.303223 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-mx7rd"
Apr 16 16:01:38.303790 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:38.303772 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-mx7rd"
Apr 16 16:01:38.697258 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:38.697234 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:38.697699 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:38.697351 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d48fn" podUID="01732c3c-7221-4102-8699-6e097f947672"
Apr 16 16:01:38.805660 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:38.805636 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/ovn-acl-logging/0.log"
Apr 16 16:01:38.806040 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:38.806006 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" event={"ID":"82145526-4c6f-43c3-8850-84adf5e445e9","Type":"ContainerStarted","Data":"44124397cce35ed9751447a85a69ee70441f520331c6a0a8c5db06cf7fd9267f"}
Apr 16 16:01:38.806297 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:38.806273 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:38.806418 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:38.806303 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:38.806418 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:38.806318 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:38.808427 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:38.808379 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m" event={"ID":"8312a792-e613-4889-91d5-3f89d0f32c1d","Type":"ContainerStarted","Data":"94398f1d1d46d6e53caa834a7f62247741d0c8c0dd39600f7156ad91f013492c"}
Apr 16 16:01:38.808593 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:38.808577 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-mx7rd"
Apr 16 16:01:38.809179 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:38.809161 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-mx7rd"
Apr 16 16:01:38.822848 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:38.822826 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:38.823001 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:38.822985 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-65x5s"
Apr 16 16:01:38.834925 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:38.834889 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" podStartSLOduration=8.755465619 podStartE2EDuration="26.834878438s" podCreationTimestamp="2026-04-16 16:01:12 +0000 UTC" firstStartedPulling="2026-04-16 16:01:13.939519863 +0000 UTC m=+1.899711566" lastFinishedPulling="2026-04-16 16:01:32.018932683 +0000 UTC m=+19.979124385" observedRunningTime="2026-04-16 16:01:38.834033384 +0000 UTC m=+26.794225109" watchObservedRunningTime="2026-04-16 16:01:38.834878438 +0000 UTC m=+26.795070164"
Apr 16 16:01:38.866373 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:38.866329 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-5xz8m" podStartSLOduration=2.688602302 podStartE2EDuration="26.866319071s" podCreationTimestamp="2026-04-16 16:01:12 +0000 UTC" firstStartedPulling="2026-04-16 16:01:13.890222827 +0000 UTC m=+1.850414530" lastFinishedPulling="2026-04-16 16:01:38.067939598 +0000 UTC m=+26.028131299" observedRunningTime="2026-04-16 16:01:38.865771072 +0000 UTC m=+26.825962798" watchObservedRunningTime="2026-04-16 16:01:38.866319071 +0000 UTC m=+26.826510865"
Apr 16 16:01:39.694417 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:39.694382 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:39.694604 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:39.694381 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:39.694604 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:39.694523 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f"
Apr 16 16:01:39.694604 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:39.694577 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a"
Apr 16 16:01:40.696624 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:40.696597 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:40.697157 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:40.696690 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d48fn" podUID="01732c3c-7221-4102-8699-6e097f947672"
Apr 16 16:01:40.813582 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:40.813555 2584 generic.go:358] "Generic (PLEG): container finished" podID="97141e56-55f0-4d10-ba67-fabe3d76d95d" containerID="fb448258d81118667942c789de715690f44b1ce8915b071ed0e708dadb74c1b0" exitCode=0
Apr 16 16:01:40.813724 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:40.813637 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhztn" event={"ID":"97141e56-55f0-4d10-ba67-fabe3d76d95d","Type":"ContainerDied","Data":"fb448258d81118667942c789de715690f44b1ce8915b071ed0e708dadb74c1b0"}
Apr 16 16:01:41.694594 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:41.694570 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:41.694594 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:41.694586 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:41.694706 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:41.694668 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a"
Apr 16 16:01:41.694838 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:41.694811 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f"
Apr 16 16:01:41.817273 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:41.817247 2584 generic.go:358] "Generic (PLEG): container finished" podID="97141e56-55f0-4d10-ba67-fabe3d76d95d" containerID="de6553e8da920bdab75c9dd123199a46d344dfd865dbb575d47e5b01687a36cb" exitCode=0
Apr 16 16:01:41.817541 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:41.817287 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhztn" event={"ID":"97141e56-55f0-4d10-ba67-fabe3d76d95d","Type":"ContainerDied","Data":"de6553e8da920bdab75c9dd123199a46d344dfd865dbb575d47e5b01687a36cb"}
Apr 16 16:01:42.696061 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:42.696038 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:42.696144 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:42.696121 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d48fn" podUID="01732c3c-7221-4102-8699-6e097f947672"
Apr 16 16:01:42.820911 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:42.820858 2584 generic.go:358] "Generic (PLEG): container finished" podID="97141e56-55f0-4d10-ba67-fabe3d76d95d" containerID="ae9be57fd1fb194bc1f136d45e05bbf2639d0a02eb6bd2a1d7f46aac8c0b7efb" exitCode=0
Apr 16 16:01:42.820911 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:42.820892 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhztn" event={"ID":"97141e56-55f0-4d10-ba67-fabe3d76d95d","Type":"ContainerDied","Data":"ae9be57fd1fb194bc1f136d45e05bbf2639d0a02eb6bd2a1d7f46aac8c0b7efb"}
Apr 16 16:01:43.694507 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:43.694477 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:43.694507 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:43.694498 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:43.694717 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:43.694576 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a"
Apr 16 16:01:43.694760 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:43.694719 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f"
Apr 16 16:01:44.694526 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:44.694494 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:44.694983 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:44.694607 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d48fn" podUID="01732c3c-7221-4102-8699-6e097f947672"
Apr 16 16:01:45.310103 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:45.310068 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs\") pod \"network-metrics-daemon-snrt9\" (UID: \"d280ea9a-de22-4d14-8870-0fbcbb459f8f\") " pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:45.310289 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:45.310221 2584 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:01:45.310350 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:45.310291 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs podName:d280ea9a-de22-4d14-8870-0fbcbb459f8f nodeName:}" failed. No retries permitted until 2026-04-16 16:02:17.310272594 +0000 UTC m=+65.270464309 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs") pod "network-metrics-daemon-snrt9" (UID: "d280ea9a-de22-4d14-8870-0fbcbb459f8f") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 16 16:01:45.410981 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:45.410939 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgm8h\" (UniqueName: \"kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h\") pod \"network-check-target-xg7zh\" (UID: \"a3db670c-92c1-42b9-94ed-04726fe1d07a\") " pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:45.411155 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:45.411087 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 16 16:01:45.411155 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:45.411106 2584 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 16 16:01:45.411155 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:45.411119 2584 projected.go:194] Error preparing data for projected volume kube-api-access-wgm8h for pod openshift-network-diagnostics/network-check-target-xg7zh: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:01:45.411275 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:45.411184 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h podName:a3db670c-92c1-42b9-94ed-04726fe1d07a nodeName:}" failed. No retries permitted until 2026-04-16 16:02:17.411169342 +0000 UTC m=+65.371361057 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-wgm8h" (UniqueName: "kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h") pod "network-check-target-xg7zh" (UID: "a3db670c-92c1-42b9-94ed-04726fe1d07a") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 16 16:01:45.694468 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:45.694435 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:45.694623 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:45.694440 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:45.694623 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:45.694532 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a"
Apr 16 16:01:45.694981 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:45.694622 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f"
Apr 16 16:01:46.694161 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:46.694122 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:46.694443 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:46.694254 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d48fn" podUID="01732c3c-7221-4102-8699-6e097f947672"
Apr 16 16:01:47.694000 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:47.693962 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:47.694462 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:47.693967 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:47.694462 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:47.694076 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a"
Apr 16 16:01:47.694462 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:47.694177 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f"
Apr 16 16:01:48.694735 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:48.694711 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:48.695064 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:48.694838 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d48fn" podUID="01732c3c-7221-4102-8699-6e097f947672"
Apr 16 16:01:48.834218 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:48.834188 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhztn" event={"ID":"97141e56-55f0-4d10-ba67-fabe3d76d95d","Type":"ContainerStarted","Data":"acbef4048a52da446423141651a6b1cab076621ca5bbc5fd0259acebc2c39b29"}
Apr 16 16:01:49.693965 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:49.693927 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:49.694138 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:49.693935 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:49.694138 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:49.694031 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a"
Apr 16 16:01:49.694138 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:49.694112 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f"
Apr 16 16:01:49.838209 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:49.838177 2584 generic.go:358] "Generic (PLEG): container finished" podID="97141e56-55f0-4d10-ba67-fabe3d76d95d" containerID="acbef4048a52da446423141651a6b1cab076621ca5bbc5fd0259acebc2c39b29" exitCode=0
Apr 16 16:01:49.838574 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:49.838219 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhztn" event={"ID":"97141e56-55f0-4d10-ba67-fabe3d76d95d","Type":"ContainerDied","Data":"acbef4048a52da446423141651a6b1cab076621ca5bbc5fd0259acebc2c39b29"}
Apr 16 16:01:50.693816 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:50.693786 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:50.694017 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:50.693879 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d48fn" podUID="01732c3c-7221-4102-8699-6e097f947672"
Apr 16 16:01:50.842709 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:50.842672 2584 generic.go:358] "Generic (PLEG): container finished" podID="97141e56-55f0-4d10-ba67-fabe3d76d95d" containerID="ef0dfb9e333de2f90b83ff1aec8ec5cdf66756a1ea0bfbfb610ee43d26163966" exitCode=0
Apr 16 16:01:50.843088 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:50.842718 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhztn" event={"ID":"97141e56-55f0-4d10-ba67-fabe3d76d95d","Type":"ContainerDied","Data":"ef0dfb9e333de2f90b83ff1aec8ec5cdf66756a1ea0bfbfb610ee43d26163966"}
Apr 16 16:01:51.357327 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:51.357089 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret\") pod \"global-pull-secret-syncer-d48fn\" (UID: \"01732c3c-7221-4102-8699-6e097f947672\") " pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:51.357327 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:51.357234 2584 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 16 16:01:51.357496 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:51.357344 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret podName:01732c3c-7221-4102-8699-6e097f947672 nodeName:}" failed. No retries permitted until 2026-04-16 16:02:23.357329984 +0000 UTC m=+71.317521685 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret") pod "global-pull-secret-syncer-d48fn" (UID: "01732c3c-7221-4102-8699-6e097f947672") : object "kube-system"/"original-pull-secret" not registered
Apr 16 16:01:51.694439 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:51.694413 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:51.694573 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:51.694418 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:51.694573 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:51.694516 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a"
Apr 16 16:01:51.694649 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:51.694596 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f"
Apr 16 16:01:51.849378 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:51.849346 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-zhztn" event={"ID":"97141e56-55f0-4d10-ba67-fabe3d76d95d","Type":"ContainerStarted","Data":"8427e7c52f64d062574db6a7c89ff4cb6e19a2b2bbf3b3f6e316bcb3fd45cbaa"}
Apr 16 16:01:51.871336 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:51.871294 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-zhztn" podStartSLOduration=5.165274608 podStartE2EDuration="39.871282374s" podCreationTimestamp="2026-04-16 16:01:12 +0000 UTC" firstStartedPulling="2026-04-16 16:01:13.912237057 +0000 UTC m=+1.872428758" lastFinishedPulling="2026-04-16 16:01:48.61824482 +0000 UTC m=+36.578436524" observedRunningTime="2026-04-16 16:01:51.870897861 +0000 UTC m=+39.831089586" watchObservedRunningTime="2026-04-16 16:01:51.871282374 +0000 UTC m=+39.831474097"
Apr 16 16:01:52.694867 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:52.694842 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:52.695029 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:52.694919 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d48fn" podUID="01732c3c-7221-4102-8699-6e097f947672"
Apr 16 16:01:53.694593 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:53.694568 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:53.694973 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:53.694605 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:53.694973 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:53.694662 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a"
Apr 16 16:01:53.694973 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:53.694733 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f"
Apr 16 16:01:54.694028 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:54.693996 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:54.694229 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:54.694103 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d48fn" podUID="01732c3c-7221-4102-8699-6e097f947672"
Apr 16 16:01:55.694233 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:55.694204 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:55.694233 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:55.694238 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:55.694680 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:55.694315 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f"
Apr 16 16:01:55.694680 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:55.694367 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a"
Apr 16 16:01:56.694280 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:56.694245 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:56.694654 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:56.694372 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d48fn" podUID="01732c3c-7221-4102-8699-6e097f947672"
Apr 16 16:01:57.693983 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:57.693932 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:57.694182 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:57.694079 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f"
Apr 16 16:01:57.694312 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:57.694283 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:57.694687 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:57.694395 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a"
Apr 16 16:01:57.965405 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:57.965332 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-d48fn"]
Apr 16 16:01:57.965514 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:57.965451 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:57.965586 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:57.965565 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d48fn" podUID="01732c3c-7221-4102-8699-6e097f947672"
Apr 16 16:01:57.968237 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:57.968208 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xg7zh"]
Apr 16 16:01:57.968357 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:57.968319 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:57.968423 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:57.968395 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a"
Apr 16 16:01:57.969038 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:57.969015 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-snrt9"]
Apr 16 16:01:57.969140 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:57.969112 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:57.969245 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:57.969222 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f"
Apr 16 16:01:59.694640 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:59.694607 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:01:59.694640 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:59.694627 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:01:59.695116 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:01:59.694607 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn"
Apr 16 16:01:59.695116 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:59.694716 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xg7zh" podUID="a3db670c-92c1-42b9-94ed-04726fe1d07a" Apr 16 16:01:59.695116 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:59.694777 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-d48fn" podUID="01732c3c-7221-4102-8699-6e097f947672" Apr 16 16:01:59.695116 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:01:59.694857 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-snrt9" podUID="d280ea9a-de22-4d14-8870-0fbcbb459f8f" Apr 16 16:02:01.383663 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.383448 2584 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-131-24.ec2.internal" event="NodeReady" Apr 16 16:02:01.384112 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.383721 2584 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 16:02:01.419346 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.419317 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-9f8c8b94d-kt6mf"] Apr 16 16:02:01.460520 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.460499 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-9f8c8b94d-kt6mf"] Apr 16 16:02:01.460520 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.460523 2584 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-dns/dns-default-hsj24"] Apr 16 16:02:01.460657 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.460632 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.462771 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.462753 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-8px99\"" Apr 16 16:02:01.462892 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.462819 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 16:02:01.462892 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.462863 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 16:02:01.463025 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.462921 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 16:02:01.467379 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.467357 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 16:02:01.475825 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.475803 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-ndvgr"] Apr 16 16:02:01.475981 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.475945 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-hsj24" Apr 16 16:02:01.478321 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.478302 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 16:02:01.478555 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.478540 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-jr2th\"" Apr 16 16:02:01.478648 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.478557 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 16:02:01.497164 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.497144 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hsj24"] Apr 16 16:02:01.497164 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.497164 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ndvgr"] Apr 16 16:02:01.497276 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.497239 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ndvgr" Apr 16 16:02:01.499449 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.499431 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 16:02:01.499449 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.499445 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 16:02:01.499607 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.499505 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-5q2sn\"" Apr 16 16:02:01.499838 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.499818 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 16:02:01.535199 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.535174 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787-metrics-tls\") pod \"dns-default-hsj24\" (UID: \"c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787\") " pod="openshift-dns/dns-default-hsj24" Apr 16 16:02:01.535300 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.535204 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787-tmp-dir\") pod \"dns-default-hsj24\" (UID: \"c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787\") " pod="openshift-dns/dns-default-hsj24" Apr 16 16:02:01.535300 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.535230 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/126c769e-8fc5-445a-a80a-2576bf17ce18-bound-sa-token\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.535390 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.535305 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787-config-volume\") pod \"dns-default-hsj24\" (UID: \"c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787\") " pod="openshift-dns/dns-default-hsj24" Apr 16 16:02:01.535390 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.535332 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fb7kq\" (UniqueName: \"kubernetes.io/projected/126c769e-8fc5-445a-a80a-2576bf17ce18-kube-api-access-fb7kq\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.535390 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.535370 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/126c769e-8fc5-445a-a80a-2576bf17ce18-registry-certificates\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.535533 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.535391 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/126c769e-8fc5-445a-a80a-2576bf17ce18-image-registry-private-configuration\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " 
pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.535533 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.535412 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/126c769e-8fc5-445a-a80a-2576bf17ce18-registry-tls\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.535533 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.535458 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60c25e0c-4dbd-4686-8ab2-4bd45e9f960b-cert\") pod \"ingress-canary-ndvgr\" (UID: \"60c25e0c-4dbd-4686-8ab2-4bd45e9f960b\") " pod="openshift-ingress-canary/ingress-canary-ndvgr" Apr 16 16:02:01.535533 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.535513 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxn4f\" (UniqueName: \"kubernetes.io/projected/c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787-kube-api-access-cxn4f\") pod \"dns-default-hsj24\" (UID: \"c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787\") " pod="openshift-dns/dns-default-hsj24" Apr 16 16:02:01.535712 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.535542 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/126c769e-8fc5-445a-a80a-2576bf17ce18-ca-trust-extracted\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.535712 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.535566 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/126c769e-8fc5-445a-a80a-2576bf17ce18-trusted-ca\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.535712 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.535594 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/126c769e-8fc5-445a-a80a-2576bf17ce18-installation-pull-secrets\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.535712 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.535637 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkgw6\" (UniqueName: \"kubernetes.io/projected/60c25e0c-4dbd-4686-8ab2-4bd45e9f960b-kube-api-access-gkgw6\") pod \"ingress-canary-ndvgr\" (UID: \"60c25e0c-4dbd-4686-8ab2-4bd45e9f960b\") " pod="openshift-ingress-canary/ingress-canary-ndvgr" Apr 16 16:02:01.636596 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.636540 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/126c769e-8fc5-445a-a80a-2576bf17ce18-registry-tls\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.636596 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.636583 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60c25e0c-4dbd-4686-8ab2-4bd45e9f960b-cert\") pod \"ingress-canary-ndvgr\" (UID: \"60c25e0c-4dbd-4686-8ab2-4bd45e9f960b\") " pod="openshift-ingress-canary/ingress-canary-ndvgr" Apr 16 16:02:01.636738 ip-10-0-131-24 kubenswrapper[2584]: 
I0416 16:02:01.636618 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxn4f\" (UniqueName: \"kubernetes.io/projected/c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787-kube-api-access-cxn4f\") pod \"dns-default-hsj24\" (UID: \"c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787\") " pod="openshift-dns/dns-default-hsj24" Apr 16 16:02:01.636738 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.636646 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/126c769e-8fc5-445a-a80a-2576bf17ce18-ca-trust-extracted\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.636738 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.636672 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/126c769e-8fc5-445a-a80a-2576bf17ce18-trusted-ca\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.636738 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.636699 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/126c769e-8fc5-445a-a80a-2576bf17ce18-installation-pull-secrets\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.636928 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.636835 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gkgw6\" (UniqueName: \"kubernetes.io/projected/60c25e0c-4dbd-4686-8ab2-4bd45e9f960b-kube-api-access-gkgw6\") pod \"ingress-canary-ndvgr\" (UID: 
\"60c25e0c-4dbd-4686-8ab2-4bd45e9f960b\") " pod="openshift-ingress-canary/ingress-canary-ndvgr" Apr 16 16:02:01.636928 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.636912 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787-metrics-tls\") pod \"dns-default-hsj24\" (UID: \"c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787\") " pod="openshift-dns/dns-default-hsj24" Apr 16 16:02:01.637064 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.636942 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787-tmp-dir\") pod \"dns-default-hsj24\" (UID: \"c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787\") " pod="openshift-dns/dns-default-hsj24" Apr 16 16:02:01.637064 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.636988 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/126c769e-8fc5-445a-a80a-2576bf17ce18-bound-sa-token\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.637064 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.637025 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787-config-volume\") pod \"dns-default-hsj24\" (UID: \"c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787\") " pod="openshift-dns/dns-default-hsj24" Apr 16 16:02:01.637212 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.637070 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/126c769e-8fc5-445a-a80a-2576bf17ce18-ca-trust-extracted\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: 
\"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.637212 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.637080 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fb7kq\" (UniqueName: \"kubernetes.io/projected/126c769e-8fc5-445a-a80a-2576bf17ce18-kube-api-access-fb7kq\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.637212 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.637142 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/126c769e-8fc5-445a-a80a-2576bf17ce18-registry-certificates\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.637212 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.637175 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/126c769e-8fc5-445a-a80a-2576bf17ce18-image-registry-private-configuration\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.637662 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.637639 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/126c769e-8fc5-445a-a80a-2576bf17ce18-trusted-ca\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.637808 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.637786 2584 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787-tmp-dir\") pod \"dns-default-hsj24\" (UID: \"c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787\") " pod="openshift-dns/dns-default-hsj24" Apr 16 16:02:01.638118 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.638097 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/126c769e-8fc5-445a-a80a-2576bf17ce18-registry-certificates\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.641365 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.641343 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/126c769e-8fc5-445a-a80a-2576bf17ce18-image-registry-private-configuration\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.641458 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.641394 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/126c769e-8fc5-445a-a80a-2576bf17ce18-installation-pull-secrets\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.641458 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.641425 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/60c25e0c-4dbd-4686-8ab2-4bd45e9f960b-cert\") pod \"ingress-canary-ndvgr\" (UID: \"60c25e0c-4dbd-4686-8ab2-4bd45e9f960b\") " pod="openshift-ingress-canary/ingress-canary-ndvgr" Apr 16 16:02:01.641533 ip-10-0-131-24 kubenswrapper[2584]: 
I0416 16:02:01.641475 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787-metrics-tls\") pod \"dns-default-hsj24\" (UID: \"c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787\") " pod="openshift-dns/dns-default-hsj24" Apr 16 16:02:01.641533 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.641485 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/126c769e-8fc5-445a-a80a-2576bf17ce18-registry-tls\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.646454 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.646430 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787-config-volume\") pod \"dns-default-hsj24\" (UID: \"c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787\") " pod="openshift-dns/dns-default-hsj24" Apr 16 16:02:01.649032 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.648987 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxn4f\" (UniqueName: \"kubernetes.io/projected/c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787-kube-api-access-cxn4f\") pod \"dns-default-hsj24\" (UID: \"c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787\") " pod="openshift-dns/dns-default-hsj24" Apr 16 16:02:01.649280 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.649256 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkgw6\" (UniqueName: \"kubernetes.io/projected/60c25e0c-4dbd-4686-8ab2-4bd45e9f960b-kube-api-access-gkgw6\") pod \"ingress-canary-ndvgr\" (UID: \"60c25e0c-4dbd-4686-8ab2-4bd45e9f960b\") " pod="openshift-ingress-canary/ingress-canary-ndvgr" Apr 16 16:02:01.651124 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.651106 2584 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fb7kq\" (UniqueName: \"kubernetes.io/projected/126c769e-8fc5-445a-a80a-2576bf17ce18-kube-api-access-fb7kq\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.651345 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.651327 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/126c769e-8fc5-445a-a80a-2576bf17ce18-bound-sa-token\") pod \"image-registry-9f8c8b94d-kt6mf\" (UID: \"126c769e-8fc5-445a-a80a-2576bf17ce18\") " pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.694432 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.694416 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn" Apr 16 16:02:01.694520 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.694442 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh" Apr 16 16:02:01.694520 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.694463 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9" Apr 16 16:02:01.697657 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.697624 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x4n9w\"" Apr 16 16:02:01.697760 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.697720 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 16:02:01.697824 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.697756 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4vs75\"" Apr 16 16:02:01.697824 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.697756 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 16:02:01.697918 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.697862 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 16:02:01.697918 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.697879 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 16:02:01.770026 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.770006 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:01.784620 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.784596 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hsj24" Apr 16 16:02:01.804644 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.804622 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-ndvgr" Apr 16 16:02:01.951903 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.951873 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-ndvgr"] Apr 16 16:02:01.955116 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.955092 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hsj24"] Apr 16 16:02:01.960289 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:01.960267 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-9f8c8b94d-kt6mf"] Apr 16 16:02:01.962464 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:02:01.962443 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60c25e0c_4dbd_4686_8ab2_4bd45e9f960b.slice/crio-2a94e63bf8252b8252ec5d65effb7fc745ce0c589ae7c077affa29c36a859f3f WatchSource:0}: Error finding container 2a94e63bf8252b8252ec5d65effb7fc745ce0c589ae7c077affa29c36a859f3f: Status 404 returned error can't find the container with id 2a94e63bf8252b8252ec5d65effb7fc745ce0c589ae7c077affa29c36a859f3f Apr 16 16:02:01.963226 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:02:01.963113 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7c2949f_5c5f_49f8_8fa9_bb2e36cd9787.slice/crio-099f18eb9e27b467165b9ce72073d71740694052eb0a58b253505d8752a7118b WatchSource:0}: Error finding container 099f18eb9e27b467165b9ce72073d71740694052eb0a58b253505d8752a7118b: Status 404 returned error can't find the container with id 099f18eb9e27b467165b9ce72073d71740694052eb0a58b253505d8752a7118b Apr 16 16:02:01.963915 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:02:01.963887 2584 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod126c769e_8fc5_445a_a80a_2576bf17ce18.slice/crio-1d2f76bb1b3629c5cb6f0a524e189a88ccc989f4411a3840452b0f381fb2ba89 WatchSource:0}: Error finding container 1d2f76bb1b3629c5cb6f0a524e189a88ccc989f4411a3840452b0f381fb2ba89: Status 404 returned error can't find the container with id 1d2f76bb1b3629c5cb6f0a524e189a88ccc989f4411a3840452b0f381fb2ba89 Apr 16 16:02:02.869739 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:02.869554 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ndvgr" event={"ID":"60c25e0c-4dbd-4686-8ab2-4bd45e9f960b","Type":"ContainerStarted","Data":"2a94e63bf8252b8252ec5d65effb7fc745ce0c589ae7c077affa29c36a859f3f"} Apr 16 16:02:02.870728 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:02.870705 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hsj24" event={"ID":"c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787","Type":"ContainerStarted","Data":"099f18eb9e27b467165b9ce72073d71740694052eb0a58b253505d8752a7118b"} Apr 16 16:02:02.872260 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:02.872234 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" event={"ID":"126c769e-8fc5-445a-a80a-2576bf17ce18","Type":"ContainerStarted","Data":"84cd5572da2c354bd130d66e67c162c321bba9398f11e9c2f5c75b3845c21e75"} Apr 16 16:02:02.872399 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:02.872268 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" event={"ID":"126c769e-8fc5-445a-a80a-2576bf17ce18","Type":"ContainerStarted","Data":"1d2f76bb1b3629c5cb6f0a524e189a88ccc989f4411a3840452b0f381fb2ba89"} Apr 16 16:02:02.872399 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:02.872383 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" 
Apr 16 16:02:02.903872 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:02.903510 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" podStartSLOduration=5.903492967 podStartE2EDuration="5.903492967s" podCreationTimestamp="2026-04-16 16:01:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:02:02.901205883 +0000 UTC m=+50.861397622" watchObservedRunningTime="2026-04-16 16:02:02.903492967 +0000 UTC m=+50.863684693" Apr 16 16:02:02.964941 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:02.964908 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-gk2jm"] Apr 16 16:02:02.986766 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:02.986741 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-gk2jm"] Apr 16 16:02:02.986899 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:02.986879 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-gk2jm" Apr 16 16:02:02.989129 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:02.989107 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 16:02:02.989274 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:02.989182 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 16:02:02.989274 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:02.989191 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-4x5t4\"" Apr 16 16:02:03.049014 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:03.048976 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mzl7\" (UniqueName: \"kubernetes.io/projected/22ab41d5-25ed-43e4-bf00-489eafd172e1-kube-api-access-9mzl7\") pod \"downloads-586b57c7b4-gk2jm\" (UID: \"22ab41d5-25ed-43e4-bf00-489eafd172e1\") " pod="openshift-console/downloads-586b57c7b4-gk2jm" Apr 16 16:02:03.150088 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:03.150013 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mzl7\" (UniqueName: \"kubernetes.io/projected/22ab41d5-25ed-43e4-bf00-489eafd172e1-kube-api-access-9mzl7\") pod \"downloads-586b57c7b4-gk2jm\" (UID: \"22ab41d5-25ed-43e4-bf00-489eafd172e1\") " pod="openshift-console/downloads-586b57c7b4-gk2jm" Apr 16 16:02:03.162268 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:03.162239 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mzl7\" (UniqueName: \"kubernetes.io/projected/22ab41d5-25ed-43e4-bf00-489eafd172e1-kube-api-access-9mzl7\") pod \"downloads-586b57c7b4-gk2jm\" (UID: \"22ab41d5-25ed-43e4-bf00-489eafd172e1\") " pod="openshift-console/downloads-586b57c7b4-gk2jm" Apr 16 
16:02:03.299721 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:03.299685 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-586b57c7b4-gk2jm" Apr 16 16:02:04.124355 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:04.124169 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-gk2jm"] Apr 16 16:02:04.129056 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:02:04.129011 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22ab41d5_25ed_43e4_bf00_489eafd172e1.slice/crio-662affa9bbcd723f37fb21ccc89b87e9c3ce63fe0bf891a807c61dd6750e7dd9 WatchSource:0}: Error finding container 662affa9bbcd723f37fb21ccc89b87e9c3ce63fe0bf891a807c61dd6750e7dd9: Status 404 returned error can't find the container with id 662affa9bbcd723f37fb21ccc89b87e9c3ce63fe0bf891a807c61dd6750e7dd9 Apr 16 16:02:04.878653 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:04.878612 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-ndvgr" event={"ID":"60c25e0c-4dbd-4686-8ab2-4bd45e9f960b","Type":"ContainerStarted","Data":"fcc9f679ddb1464a52e1fbd60fa254a1aaf279eb4655bcaa912e8d52eb983d67"} Apr 16 16:02:04.879705 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:04.879678 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-gk2jm" event={"ID":"22ab41d5-25ed-43e4-bf00-489eafd172e1","Type":"ContainerStarted","Data":"662affa9bbcd723f37fb21ccc89b87e9c3ce63fe0bf891a807c61dd6750e7dd9"} Apr 16 16:02:04.881476 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:04.881449 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hsj24" event={"ID":"c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787","Type":"ContainerStarted","Data":"4387c0c3acdaa1c65a7b152e7f914fa4e87f57cf8c49fe8497832c11eada516e"} Apr 16 16:02:04.881593 ip-10-0-131-24 
kubenswrapper[2584]: I0416 16:02:04.881480 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hsj24" event={"ID":"c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787","Type":"ContainerStarted","Data":"16129174e2c038088a25e83a41b9be5d19fae6778e10accf33f92842dde3f0a9"} Apr 16 16:02:04.881652 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:04.881609 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-hsj24" Apr 16 16:02:04.895399 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:04.895356 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-ndvgr" podStartSLOduration=1.869340748 podStartE2EDuration="3.895343803s" podCreationTimestamp="2026-04-16 16:02:01 +0000 UTC" firstStartedPulling="2026-04-16 16:02:01.964552334 +0000 UTC m=+49.924744039" lastFinishedPulling="2026-04-16 16:02:03.990555391 +0000 UTC m=+51.950747094" observedRunningTime="2026-04-16 16:02:04.894404664 +0000 UTC m=+52.854596389" watchObservedRunningTime="2026-04-16 16:02:04.895343803 +0000 UTC m=+52.855535527" Apr 16 16:02:04.912339 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:04.912272 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hsj24" podStartSLOduration=1.89248675 podStartE2EDuration="3.912256928s" podCreationTimestamp="2026-04-16 16:02:01 +0000 UTC" firstStartedPulling="2026-04-16 16:02:01.96504563 +0000 UTC m=+49.925237332" lastFinishedPulling="2026-04-16 16:02:03.984815803 +0000 UTC m=+51.945007510" observedRunningTime="2026-04-16 16:02:04.911718116 +0000 UTC m=+52.871909831" watchObservedRunningTime="2026-04-16 16:02:04.912256928 +0000 UTC m=+52.872448645" Apr 16 16:02:05.035849 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:05.035823 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hsj24_c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787/dns/0.log" Apr 16 
16:02:05.216443 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:05.216407 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hsj24_c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787/kube-rbac-proxy/0.log" Apr 16 16:02:06.020254 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:06.020226 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lrjg9_80b825a5-15e5-402b-ae57-d2e283b0e8f8/dns-node-resolver/0.log" Apr 16 16:02:06.419281 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:06.419255 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-9f8c8b94d-kt6mf_126c769e-8fc5-445a-a80a-2576bf17ce18/registry/0.log" Apr 16 16:02:07.016370 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:07.016340 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nffwb_07434d52-3b5d-4eaf-ba37-9f1b957e938a/node-ca/0.log" Apr 16 16:02:07.816474 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:07.816357 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-ndvgr_60c25e0c-4dbd-4686-8ab2-4bd45e9f960b/serve-healthcheck-canary/0.log" Apr 16 16:02:10.826565 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:10.826538 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-65x5s" Apr 16 16:02:11.462376 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.462345 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-b88fd7674-fhszj"] Apr 16 16:02:11.464976 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.464929 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:11.467717 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.467372 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 16 16:02:11.467717 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.467419 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 16 16:02:11.467717 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.467445 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-pp558\"" Apr 16 16:02:11.467717 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.467460 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 16 16:02:11.467717 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.467480 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 16 16:02:11.467717 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.467595 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 16 16:02:11.473998 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.473939 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b88fd7674-fhszj"] Apr 16 16:02:11.508147 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.508123 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-service-ca\") pod \"console-b88fd7674-fhszj\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:11.508255 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.508185 2584 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-console-serving-cert\") pod \"console-b88fd7674-fhszj\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:11.508255 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.508243 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-console-config\") pod \"console-b88fd7674-fhszj\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:11.508358 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.508287 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-console-oauth-config\") pod \"console-b88fd7674-fhszj\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:11.508358 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.508305 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-oauth-serving-cert\") pod \"console-b88fd7674-fhszj\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:11.508358 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.508326 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g45ws\" (UniqueName: \"kubernetes.io/projected/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-kube-api-access-g45ws\") pod \"console-b88fd7674-fhszj\" (UID: 
\"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:11.609600 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.609574 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-console-oauth-config\") pod \"console-b88fd7674-fhszj\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:11.609746 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.609610 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-oauth-serving-cert\") pod \"console-b88fd7674-fhszj\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:11.609746 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.609643 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g45ws\" (UniqueName: \"kubernetes.io/projected/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-kube-api-access-g45ws\") pod \"console-b88fd7674-fhszj\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:11.609746 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.609680 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-service-ca\") pod \"console-b88fd7674-fhszj\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:11.609899 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.609759 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-console-serving-cert\") pod \"console-b88fd7674-fhszj\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:11.609899 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.609800 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-console-config\") pod \"console-b88fd7674-fhszj\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:11.610794 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.610666 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-console-config\") pod \"console-b88fd7674-fhszj\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:11.610794 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.610699 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-oauth-serving-cert\") pod \"console-b88fd7674-fhszj\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:11.612619 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.612596 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-service-ca\") pod \"console-b88fd7674-fhszj\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:11.614044 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.614022 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" 
(UniqueName: \"kubernetes.io/secret/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-console-oauth-config\") pod \"console-b88fd7674-fhszj\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:11.614141 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.614085 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-console-serving-cert\") pod \"console-b88fd7674-fhszj\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:11.618708 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.618689 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g45ws\" (UniqueName: \"kubernetes.io/projected/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-kube-api-access-g45ws\") pod \"console-b88fd7674-fhszj\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:11.776231 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.776161 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:11.912724 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:11.912692 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b88fd7674-fhszj"] Apr 16 16:02:12.169238 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.169207 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-qdg8s"] Apr 16 16:02:12.172022 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.172003 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-jnqnv"] Apr 16 16:02:12.172203 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.172175 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s" Apr 16 16:02:12.176197 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.176169 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-j87g5\"" Apr 16 16:02:12.176607 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.176589 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 16:02:12.176657 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.176643 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 16:02:12.177896 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.177132 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 16 16:02:12.177896 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.177235 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 16 16:02:12.177896 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.177501 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-jnqnv" Apr 16 16:02:12.177896 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.177585 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 16:02:12.177896 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.177507 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 16 16:02:12.180731 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.180703 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 16:02:12.180837 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.180708 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 16:02:12.180837 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.180821 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-rqdzt\"" Apr 16 16:02:12.180978 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.180822 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 16:02:12.183713 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.183691 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-qdg8s"] Apr 16 16:02:12.214426 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.214393 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9wcgr\" (UniqueName: \"kubernetes.io/projected/90876a88-2791-4518-a822-b1b69a071e6f-kube-api-access-9wcgr\") pod \"kube-state-metrics-7479c89684-qdg8s\" (UID: \"90876a88-2791-4518-a822-b1b69a071e6f\") " 
pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s" Apr 16 16:02:12.214543 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.214432 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/86c14ee5-7e88-4559-a96d-9147a4e36c13-sys\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv" Apr 16 16:02:12.214543 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.214459 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/86c14ee5-7e88-4559-a96d-9147a4e36c13-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv" Apr 16 16:02:12.214662 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.214538 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/90876a88-2791-4518-a822-b1b69a071e6f-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-qdg8s\" (UID: \"90876a88-2791-4518-a822-b1b69a071e6f\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s" Apr 16 16:02:12.214662 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.214583 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/86c14ee5-7e88-4559-a96d-9147a4e36c13-node-exporter-textfile\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv" Apr 16 16:02:12.214662 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.214620 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/90876a88-2791-4518-a822-b1b69a071e6f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-qdg8s\" (UID: \"90876a88-2791-4518-a822-b1b69a071e6f\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s" Apr 16 16:02:12.214662 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.214652 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/90876a88-2791-4518-a822-b1b69a071e6f-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-qdg8s\" (UID: \"90876a88-2791-4518-a822-b1b69a071e6f\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s" Apr 16 16:02:12.214907 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.214684 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrmjd\" (UniqueName: \"kubernetes.io/projected/86c14ee5-7e88-4559-a96d-9147a4e36c13-kube-api-access-lrmjd\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv" Apr 16 16:02:12.214907 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.214715 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86c14ee5-7e88-4559-a96d-9147a4e36c13-metrics-client-ca\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv" Apr 16 16:02:12.214907 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.214743 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/86c14ee5-7e88-4559-a96d-9147a4e36c13-node-exporter-wtmp\") pod \"node-exporter-jnqnv\" (UID: 
\"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv" Apr 16 16:02:12.214907 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.214806 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/86c14ee5-7e88-4559-a96d-9147a4e36c13-node-exporter-accelerators-collector-config\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv" Apr 16 16:02:12.214907 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.214857 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/86c14ee5-7e88-4559-a96d-9147a4e36c13-root\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv" Apr 16 16:02:12.215168 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.214907 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/86c14ee5-7e88-4559-a96d-9147a4e36c13-node-exporter-tls\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv" Apr 16 16:02:12.215168 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.215085 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/90876a88-2791-4518-a822-b1b69a071e6f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-qdg8s\" (UID: \"90876a88-2791-4518-a822-b1b69a071e6f\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s" Apr 16 16:02:12.215168 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.215120 2584 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/90876a88-2791-4518-a822-b1b69a071e6f-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-qdg8s\" (UID: \"90876a88-2791-4518-a822-b1b69a071e6f\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s" Apr 16 16:02:12.315451 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.315421 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/90876a88-2791-4518-a822-b1b69a071e6f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-qdg8s\" (UID: \"90876a88-2791-4518-a822-b1b69a071e6f\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s" Apr 16 16:02:12.315451 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.315460 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/90876a88-2791-4518-a822-b1b69a071e6f-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-qdg8s\" (UID: \"90876a88-2791-4518-a822-b1b69a071e6f\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s" Apr 16 16:02:12.315702 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.315493 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9wcgr\" (UniqueName: \"kubernetes.io/projected/90876a88-2791-4518-a822-b1b69a071e6f-kube-api-access-9wcgr\") pod \"kube-state-metrics-7479c89684-qdg8s\" (UID: \"90876a88-2791-4518-a822-b1b69a071e6f\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s" Apr 16 16:02:12.315702 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.315516 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/86c14ee5-7e88-4559-a96d-9147a4e36c13-sys\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv"
Apr 16 16:02:12.315702 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.315540 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/86c14ee5-7e88-4559-a96d-9147a4e36c13-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv"
Apr 16 16:02:12.315702 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.315583 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/90876a88-2791-4518-a822-b1b69a071e6f-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-qdg8s\" (UID: \"90876a88-2791-4518-a822-b1b69a071e6f\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s"
Apr 16 16:02:12.315702 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.315613 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/86c14ee5-7e88-4559-a96d-9147a4e36c13-node-exporter-textfile\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv"
Apr 16 16:02:12.315702 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.315645 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/90876a88-2791-4518-a822-b1b69a071e6f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-qdg8s\" (UID: \"90876a88-2791-4518-a822-b1b69a071e6f\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s"
Apr 16 16:02:12.315702 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.315678 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/90876a88-2791-4518-a822-b1b69a071e6f-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-qdg8s\" (UID: \"90876a88-2791-4518-a822-b1b69a071e6f\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s"
Apr 16 16:02:12.316083 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.315708 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrmjd\" (UniqueName: \"kubernetes.io/projected/86c14ee5-7e88-4559-a96d-9147a4e36c13-kube-api-access-lrmjd\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv"
Apr 16 16:02:12.316083 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.315736 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86c14ee5-7e88-4559-a96d-9147a4e36c13-metrics-client-ca\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv"
Apr 16 16:02:12.316083 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.315762 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/86c14ee5-7e88-4559-a96d-9147a4e36c13-node-exporter-wtmp\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv"
Apr 16 16:02:12.316083 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.315807 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/86c14ee5-7e88-4559-a96d-9147a4e36c13-node-exporter-accelerators-collector-config\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv"
Apr 16 16:02:12.316083 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.315843 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/86c14ee5-7e88-4559-a96d-9147a4e36c13-root\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv"
Apr 16 16:02:12.316083 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.315870 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/86c14ee5-7e88-4559-a96d-9147a4e36c13-node-exporter-tls\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv"
Apr 16 16:02:12.316083 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.315874 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/90876a88-2791-4518-a822-b1b69a071e6f-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-qdg8s\" (UID: \"90876a88-2791-4518-a822-b1b69a071e6f\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s"
Apr 16 16:02:12.316083 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:02:12.316026 2584 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found
Apr 16 16:02:12.316494 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:02:12.316105 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90876a88-2791-4518-a822-b1b69a071e6f-kube-state-metrics-tls podName:90876a88-2791-4518-a822-b1b69a071e6f nodeName:}" failed. No retries permitted until 2026-04-16 16:02:12.81608434 +0000 UTC m=+60.776276042 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/90876a88-2791-4518-a822-b1b69a071e6f-kube-state-metrics-tls") pod "kube-state-metrics-7479c89684-qdg8s" (UID: "90876a88-2791-4518-a822-b1b69a071e6f") : secret "kube-state-metrics-tls" not found
Apr 16 16:02:12.316494 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.316162 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/86c14ee5-7e88-4559-a96d-9147a4e36c13-root\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv"
Apr 16 16:02:12.316494 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.316230 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/86c14ee5-7e88-4559-a96d-9147a4e36c13-sys\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv"
Apr 16 16:02:12.316494 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:02:12.316463 2584 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found
Apr 16 16:02:12.316692 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:02:12.316513 2584 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/86c14ee5-7e88-4559-a96d-9147a4e36c13-node-exporter-tls podName:86c14ee5-7e88-4559-a96d-9147a4e36c13 nodeName:}" failed. No retries permitted until 2026-04-16 16:02:12.816498569 +0000 UTC m=+60.776690273 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/86c14ee5-7e88-4559-a96d-9147a4e36c13-node-exporter-tls") pod "node-exporter-jnqnv" (UID: "86c14ee5-7e88-4559-a96d-9147a4e36c13") : secret "node-exporter-tls" not found
Apr 16 16:02:12.316692 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.316524 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/86c14ee5-7e88-4559-a96d-9147a4e36c13-node-exporter-accelerators-collector-config\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv"
Apr 16 16:02:12.316692 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.316636 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/86c14ee5-7e88-4559-a96d-9147a4e36c13-node-exporter-wtmp\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv"
Apr 16 16:02:12.316867 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.316701 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/90876a88-2791-4518-a822-b1b69a071e6f-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-qdg8s\" (UID: \"90876a88-2791-4518-a822-b1b69a071e6f\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s"
Apr 16 16:02:12.316920 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.316884 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/86c14ee5-7e88-4559-a96d-9147a4e36c13-metrics-client-ca\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv"
Apr 16 16:02:12.317005 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.316986 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/86c14ee5-7e88-4559-a96d-9147a4e36c13-node-exporter-textfile\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv"
Apr 16 16:02:12.317371 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.317353 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/90876a88-2791-4518-a822-b1b69a071e6f-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-qdg8s\" (UID: \"90876a88-2791-4518-a822-b1b69a071e6f\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s"
Apr 16 16:02:12.318481 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.318454 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/90876a88-2791-4518-a822-b1b69a071e6f-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-qdg8s\" (UID: \"90876a88-2791-4518-a822-b1b69a071e6f\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s"
Apr 16 16:02:12.318594 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.318486 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/86c14ee5-7e88-4559-a96d-9147a4e36c13-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv"
Apr 16 16:02:12.326091 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.326070 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9wcgr\" (UniqueName: \"kubernetes.io/projected/90876a88-2791-4518-a822-b1b69a071e6f-kube-api-access-9wcgr\") pod \"kube-state-metrics-7479c89684-qdg8s\" (UID: \"90876a88-2791-4518-a822-b1b69a071e6f\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s"
Apr 16 16:02:12.326344 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.326328 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrmjd\" (UniqueName: \"kubernetes.io/projected/86c14ee5-7e88-4559-a96d-9147a4e36c13-kube-api-access-lrmjd\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv"
Apr 16 16:02:12.820468 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.820421 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/90876a88-2791-4518-a822-b1b69a071e6f-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-qdg8s\" (UID: \"90876a88-2791-4518-a822-b1b69a071e6f\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s"
Apr 16 16:02:12.820638 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.820496 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/86c14ee5-7e88-4559-a96d-9147a4e36c13-node-exporter-tls\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv"
Apr 16 16:02:12.825916 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.825859 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/86c14ee5-7e88-4559-a96d-9147a4e36c13-node-exporter-tls\") pod \"node-exporter-jnqnv\" (UID: \"86c14ee5-7e88-4559-a96d-9147a4e36c13\") " pod="openshift-monitoring/node-exporter-jnqnv"
Apr 16 16:02:12.832686 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.832658 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/90876a88-2791-4518-a822-b1b69a071e6f-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-qdg8s\" (UID: \"90876a88-2791-4518-a822-b1b69a071e6f\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s"
Apr 16 16:02:12.901301 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:12.901258 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b88fd7674-fhszj" event={"ID":"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9","Type":"ContainerStarted","Data":"1ce0737ac8a2ed531f407f575baa369f3ec9099e55a2cc35e6bd88c0d15003c6"}
Apr 16 16:02:13.088678 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:13.088585 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-j87g5\""
Apr 16 16:02:13.095771 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:13.095746 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-rqdzt\""
Apr 16 16:02:13.096737 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:13.096711 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s"
Apr 16 16:02:13.105423 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:13.104084 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-jnqnv"
Apr 16 16:02:13.117060 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:02:13.117034 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod86c14ee5_7e88_4559_a96d_9147a4e36c13.slice/crio-546408e9c8a29c87f400db1de18f62712417ce3e392fec22c5bdec2e4af5a012 WatchSource:0}: Error finding container 546408e9c8a29c87f400db1de18f62712417ce3e392fec22c5bdec2e4af5a012: Status 404 returned error can't find the container with id 546408e9c8a29c87f400db1de18f62712417ce3e392fec22c5bdec2e4af5a012
Apr 16 16:02:13.247454 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:13.247425 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-qdg8s"]
Apr 16 16:02:13.251292 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:02:13.251261 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod90876a88_2791_4518_a822_b1b69a071e6f.slice/crio-cc3896f2c77740bc4c79d9435fb5f4850a75f6add13d1b18fb913fd30abd70e0 WatchSource:0}: Error finding container cc3896f2c77740bc4c79d9435fb5f4850a75f6add13d1b18fb913fd30abd70e0: Status 404 returned error can't find the container with id cc3896f2c77740bc4c79d9435fb5f4850a75f6add13d1b18fb913fd30abd70e0
Apr 16 16:02:13.905313 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:13.905250 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jnqnv" event={"ID":"86c14ee5-7e88-4559-a96d-9147a4e36c13","Type":"ContainerStarted","Data":"546408e9c8a29c87f400db1de18f62712417ce3e392fec22c5bdec2e4af5a012"}
Apr 16 16:02:13.906566 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:13.906518 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s" event={"ID":"90876a88-2791-4518-a822-b1b69a071e6f","Type":"ContainerStarted","Data":"cc3896f2c77740bc4c79d9435fb5f4850a75f6add13d1b18fb913fd30abd70e0"}
Apr 16 16:02:14.209464 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.209427 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-679899d9dd-hkjsf"]
Apr 16 16:02:14.212354 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.212331 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.215328 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.215305 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 16 16:02:14.215444 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.215335 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-da3o4okebsl2h\""
Apr 16 16:02:14.215444 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.215358 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 16 16:02:14.215444 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.215394 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-rlcdz\""
Apr 16 16:02:14.215650 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.215632 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 16 16:02:14.215724 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.215685 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 16 16:02:14.215724 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.215631 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 16 16:02:14.226685 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.226222 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-679899d9dd-hkjsf"]
Apr 16 16:02:14.233052 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.233026 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/63f686c4-c679-48b9-bd65-7a861737f564-secret-thanos-querier-tls\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.233160 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.233075 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/63f686c4-c679-48b9-bd65-7a861737f564-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.233160 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.233118 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/63f686c4-c679-48b9-bd65-7a861737f564-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.233160 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.233143 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/63f686c4-c679-48b9-bd65-7a861737f564-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.233326 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.233170 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/63f686c4-c679-48b9-bd65-7a861737f564-secret-grpc-tls\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.233326 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.233191 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63f686c4-c679-48b9-bd65-7a861737f564-metrics-client-ca\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.233326 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.233217 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/63f686c4-c679-48b9-bd65-7a861737f564-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.233326 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.233244 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4qd5\" (UniqueName: \"kubernetes.io/projected/63f686c4-c679-48b9-bd65-7a861737f564-kube-api-access-k4qd5\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.334246 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.334207 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/63f686c4-c679-48b9-bd65-7a861737f564-secret-grpc-tls\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.334462 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.334252 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63f686c4-c679-48b9-bd65-7a861737f564-metrics-client-ca\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.334524 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.334455 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/63f686c4-c679-48b9-bd65-7a861737f564-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.334524 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.334499 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4qd5\" (UniqueName: \"kubernetes.io/projected/63f686c4-c679-48b9-bd65-7a861737f564-kube-api-access-k4qd5\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.334615 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.334571 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/63f686c4-c679-48b9-bd65-7a861737f564-secret-thanos-querier-tls\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.334615 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.334605 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/63f686c4-c679-48b9-bd65-7a861737f564-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.334724 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.334652 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/63f686c4-c679-48b9-bd65-7a861737f564-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.334724 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.334686 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/63f686c4-c679-48b9-bd65-7a861737f564-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.335013 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.334988 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/63f686c4-c679-48b9-bd65-7a861737f564-metrics-client-ca\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.337996 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.337931 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/63f686c4-c679-48b9-bd65-7a861737f564-secret-grpc-tls\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.338114 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.337993 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/63f686c4-c679-48b9-bd65-7a861737f564-secret-thanos-querier-tls\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.338193 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.338171 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/63f686c4-c679-48b9-bd65-7a861737f564-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.338317 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.338293 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/63f686c4-c679-48b9-bd65-7a861737f564-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.338422 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.338330 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/63f686c4-c679-48b9-bd65-7a861737f564-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.338757 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.338738 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/63f686c4-c679-48b9-bd65-7a861737f564-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.343840 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.343819 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4qd5\" (UniqueName: \"kubernetes.io/projected/63f686c4-c679-48b9-bd65-7a861737f564-kube-api-access-k4qd5\") pod \"thanos-querier-679899d9dd-hkjsf\" (UID: \"63f686c4-c679-48b9-bd65-7a861737f564\") " pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.525230 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.525143 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf"
Apr 16 16:02:14.886664 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:14.886575 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hsj24"
Apr 16 16:02:16.924804 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:16.924775 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-wdmfj"]
Apr 16 16:02:16.927598 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:16.927579 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-wdmfj"
Apr 16 16:02:16.930035 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:16.930015 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 16 16:02:16.930156 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:16.930088 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-4qcrx\""
Apr 16 16:02:16.935743 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:16.935711 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-wdmfj"]
Apr 16 16:02:16.956567 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:16.956545 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/09ea5b31-549a-496a-9830-4728c4f2ca13-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-wdmfj\" (UID: \"09ea5b31-549a-496a-9830-4728c4f2ca13\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-wdmfj"
Apr 16 16:02:17.057778 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:17.057738 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/09ea5b31-549a-496a-9830-4728c4f2ca13-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-wdmfj\" (UID: \"09ea5b31-549a-496a-9830-4728c4f2ca13\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-wdmfj"
Apr 16 16:02:17.060533 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:17.060501 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/09ea5b31-549a-496a-9830-4728c4f2ca13-monitoring-plugin-cert\") pod \"monitoring-plugin-5876b4bbc7-wdmfj\" (UID: \"09ea5b31-549a-496a-9830-4728c4f2ca13\") " pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-wdmfj"
Apr 16 16:02:17.238815 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:17.238733 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-wdmfj"
Apr 16 16:02:17.360699 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:17.360656 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs\") pod \"network-metrics-daemon-snrt9\" (UID: \"d280ea9a-de22-4d14-8870-0fbcbb459f8f\") " pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:02:17.363388 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:17.363362 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 16:02:17.373981 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:17.373947 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d280ea9a-de22-4d14-8870-0fbcbb459f8f-metrics-certs\") pod \"network-metrics-daemon-snrt9\" (UID: \"d280ea9a-de22-4d14-8870-0fbcbb459f8f\") " pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:02:17.461545 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:17.461518 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wgm8h\" (UniqueName: \"kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h\") pod \"network-check-target-xg7zh\" (UID: \"a3db670c-92c1-42b9-94ed-04726fe1d07a\") " pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:02:17.463995 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:17.463968 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 16:02:17.474930 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:17.474905 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 16:02:17.485301 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:17.485274 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wgm8h\" (UniqueName: \"kubernetes.io/projected/a3db670c-92c1-42b9-94ed-04726fe1d07a-kube-api-access-wgm8h\") pod \"network-check-target-xg7zh\" (UID: \"a3db670c-92c1-42b9-94ed-04726fe1d07a\") " pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:02:17.612197 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:17.612119 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4vs75\""
Apr 16 16:02:17.613933 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:17.613907 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-x4n9w\""
Apr 16 16:02:17.619811 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:17.619792 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xg7zh"
Apr 16 16:02:17.622452 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:17.622432 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-snrt9"
Apr 16 16:02:18.356247 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.356211 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 16:02:18.366791 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.366760 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 16:02:18.369484 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.369460 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 16:02:18.369766 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.369740 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 16:02:18.370156 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.370025 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 16:02:18.370403 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.370358 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 16:02:18.370787 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.370495 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 16:02:18.370787 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.370607 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 16:02:18.370787 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.370655 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 16:02:18.371011 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.370799 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 16:02:18.371011 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.370886 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-99mkn\""
Apr 16 16:02:18.371011 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.370977 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 16:02:18.371166 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.371054 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-d2t5a52ihtgro\""
Apr 16 16:02:18.371166 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.371067 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 16:02:18.371166 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.371127 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 16:02:18.371307 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.371177 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 16:02:18.372415 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.372391 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 16:02:18.374070 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.374049 2584
reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 16:02:18.468802 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.468775 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.468996 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.468818 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.468996 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.468861 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.468996 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.468890 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kclvc\" (UniqueName: \"kubernetes.io/projected/e4e5a694-dbdd-44a6-8234-4845bff8c58c-kube-api-access-kclvc\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.468996 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.468982 2584 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.469236 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.469026 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.469398 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.469370 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.469527 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.469460 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.469527 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.469505 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-config\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.469621 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.469540 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e4e5a694-dbdd-44a6-8234-4845bff8c58c-config-out\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.469677 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.469604 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e4e5a694-dbdd-44a6-8234-4845bff8c58c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.469720 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.469672 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.469720 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.469711 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.469873 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.469852 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: 
\"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.469976 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.469899 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.469976 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.469938 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e4e5a694-dbdd-44a6-8234-4845bff8c58c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.470083 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.469992 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.470083 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.470025 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-web-config\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.570510 ip-10-0-131-24 kubenswrapper[2584]: I0416 
16:02:18.570482 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.570668 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.570522 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.570668 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.570551 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-config\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.570668 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.570573 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e4e5a694-dbdd-44a6-8234-4845bff8c58c-config-out\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.570668 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.570598 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e4e5a694-dbdd-44a6-8234-4845bff8c58c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.570668 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.570628 2584 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.570668 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.570654 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.571037 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.570677 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.571037 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.570712 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.571037 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.570746 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e4e5a694-dbdd-44a6-8234-4845bff8c58c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: 
\"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.571037 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.570781 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.571037 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.570806 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-web-config\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.571037 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.570836 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.571037 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.570868 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.571037 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.570905 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.571037 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.570934 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kclvc\" (UniqueName: \"kubernetes.io/projected/e4e5a694-dbdd-44a6-8234-4845bff8c58c-kube-api-access-kclvc\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.571037 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.570983 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.571037 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.571023 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.571545 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.571292 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.572363 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.572337 2584 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.572505 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.572383 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.577673 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.573944 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e4e5a694-dbdd-44a6-8234-4845bff8c58c-config-out\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.577673 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.574209 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.577673 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.574224 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e4e5a694-dbdd-44a6-8234-4845bff8c58c-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.577673 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.575002 2584 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.577673 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.575413 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.577673 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.575748 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.577673 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.575860 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-config\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.577673 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.576243 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e4e5a694-dbdd-44a6-8234-4845bff8c58c-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.577673 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.576669 2584 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.577673 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.576983 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-web-config\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.578191 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.578142 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.578732 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.578709 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.578939 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.578920 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.579035 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.578930 2584 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.583708 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.583686 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kclvc\" (UniqueName: \"kubernetes.io/projected/e4e5a694-dbdd-44a6-8234-4845bff8c58c-kube-api-access-kclvc\") pod \"prometheus-k8s-0\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:18.679060 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:18.679012 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:19.259804 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.259769 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-x7lk6"] Apr 16 16:02:19.299633 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.299506 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-x7lk6"] Apr 16 16:02:19.299797 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.299698 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-x7lk6" Apr 16 16:02:19.303293 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.303267 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 16:02:19.303441 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.303329 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-frjrw\"" Apr 16 16:02:19.303441 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.303328 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 16:02:19.304117 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.304094 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 16:02:19.304246 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.304131 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 16:02:19.379075 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.379047 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt88d\" (UniqueName: \"kubernetes.io/projected/efe37373-e8a6-4016-bbaf-58d22d4d4fcf-kube-api-access-lt88d\") pod \"insights-runtime-extractor-x7lk6\" (UID: \"efe37373-e8a6-4016-bbaf-58d22d4d4fcf\") " pod="openshift-insights/insights-runtime-extractor-x7lk6" Apr 16 16:02:19.379439 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.379125 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/efe37373-e8a6-4016-bbaf-58d22d4d4fcf-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-x7lk6\" (UID: 
\"efe37373-e8a6-4016-bbaf-58d22d4d4fcf\") " pod="openshift-insights/insights-runtime-extractor-x7lk6" Apr 16 16:02:19.379439 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.379188 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/efe37373-e8a6-4016-bbaf-58d22d4d4fcf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x7lk6\" (UID: \"efe37373-e8a6-4016-bbaf-58d22d4d4fcf\") " pod="openshift-insights/insights-runtime-extractor-x7lk6" Apr 16 16:02:19.379439 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.379228 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/efe37373-e8a6-4016-bbaf-58d22d4d4fcf-crio-socket\") pod \"insights-runtime-extractor-x7lk6\" (UID: \"efe37373-e8a6-4016-bbaf-58d22d4d4fcf\") " pod="openshift-insights/insights-runtime-extractor-x7lk6" Apr 16 16:02:19.379439 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.379252 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/efe37373-e8a6-4016-bbaf-58d22d4d4fcf-data-volume\") pod \"insights-runtime-extractor-x7lk6\" (UID: \"efe37373-e8a6-4016-bbaf-58d22d4d4fcf\") " pod="openshift-insights/insights-runtime-extractor-x7lk6" Apr 16 16:02:19.479866 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.479828 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lt88d\" (UniqueName: \"kubernetes.io/projected/efe37373-e8a6-4016-bbaf-58d22d4d4fcf-kube-api-access-lt88d\") pod \"insights-runtime-extractor-x7lk6\" (UID: \"efe37373-e8a6-4016-bbaf-58d22d4d4fcf\") " pod="openshift-insights/insights-runtime-extractor-x7lk6" Apr 16 16:02:19.480051 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.479910 2584 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/efe37373-e8a6-4016-bbaf-58d22d4d4fcf-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-x7lk6\" (UID: \"efe37373-e8a6-4016-bbaf-58d22d4d4fcf\") " pod="openshift-insights/insights-runtime-extractor-x7lk6" Apr 16 16:02:19.480051 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.479984 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/efe37373-e8a6-4016-bbaf-58d22d4d4fcf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x7lk6\" (UID: \"efe37373-e8a6-4016-bbaf-58d22d4d4fcf\") " pod="openshift-insights/insights-runtime-extractor-x7lk6" Apr 16 16:02:19.480051 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.480019 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/efe37373-e8a6-4016-bbaf-58d22d4d4fcf-crio-socket\") pod \"insights-runtime-extractor-x7lk6\" (UID: \"efe37373-e8a6-4016-bbaf-58d22d4d4fcf\") " pod="openshift-insights/insights-runtime-extractor-x7lk6" Apr 16 16:02:19.480051 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.480041 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/efe37373-e8a6-4016-bbaf-58d22d4d4fcf-data-volume\") pod \"insights-runtime-extractor-x7lk6\" (UID: \"efe37373-e8a6-4016-bbaf-58d22d4d4fcf\") " pod="openshift-insights/insights-runtime-extractor-x7lk6" Apr 16 16:02:19.480219 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.480158 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/efe37373-e8a6-4016-bbaf-58d22d4d4fcf-crio-socket\") pod \"insights-runtime-extractor-x7lk6\" (UID: \"efe37373-e8a6-4016-bbaf-58d22d4d4fcf\") " 
pod="openshift-insights/insights-runtime-extractor-x7lk6" Apr 16 16:02:19.480400 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.480384 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/efe37373-e8a6-4016-bbaf-58d22d4d4fcf-data-volume\") pod \"insights-runtime-extractor-x7lk6\" (UID: \"efe37373-e8a6-4016-bbaf-58d22d4d4fcf\") " pod="openshift-insights/insights-runtime-extractor-x7lk6" Apr 16 16:02:19.480619 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.480600 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/efe37373-e8a6-4016-bbaf-58d22d4d4fcf-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-x7lk6\" (UID: \"efe37373-e8a6-4016-bbaf-58d22d4d4fcf\") " pod="openshift-insights/insights-runtime-extractor-x7lk6" Apr 16 16:02:19.482583 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.482563 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/efe37373-e8a6-4016-bbaf-58d22d4d4fcf-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-x7lk6\" (UID: \"efe37373-e8a6-4016-bbaf-58d22d4d4fcf\") " pod="openshift-insights/insights-runtime-extractor-x7lk6" Apr 16 16:02:19.487888 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.487864 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt88d\" (UniqueName: \"kubernetes.io/projected/efe37373-e8a6-4016-bbaf-58d22d4d4fcf-kube-api-access-lt88d\") pod \"insights-runtime-extractor-x7lk6\" (UID: \"efe37373-e8a6-4016-bbaf-58d22d4d4fcf\") " pod="openshift-insights/insights-runtime-extractor-x7lk6" Apr 16 16:02:19.622332 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:19.622244 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-x7lk6" Apr 16 16:02:22.494497 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.494183 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-5876b4bbc7-wdmfj"] Apr 16 16:02:22.513730 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.511868 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-x7lk6"] Apr 16 16:02:22.513730 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:02:22.512015 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09ea5b31_549a_496a_9830_4728c4f2ca13.slice/crio-739ed576a8aa44f87d233eedfa432f9ec5133f3e4ca3e43cfeb4644c191176f6 WatchSource:0}: Error finding container 739ed576a8aa44f87d233eedfa432f9ec5133f3e4ca3e43cfeb4644c191176f6: Status 404 returned error can't find the container with id 739ed576a8aa44f87d233eedfa432f9ec5133f3e4ca3e43cfeb4644c191176f6 Apr 16 16:02:22.530145 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:02:22.530033 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podefe37373_e8a6_4016_bbaf_58d22d4d4fcf.slice/crio-2adf35ed5e4bb29fb416f5be9e0b143c6098271e0be5f2450cd23cf3c4b79e18 WatchSource:0}: Error finding container 2adf35ed5e4bb29fb416f5be9e0b143c6098271e0be5f2450cd23cf3c4b79e18: Status 404 returned error can't find the container with id 2adf35ed5e4bb29fb416f5be9e0b143c6098271e0be5f2450cd23cf3c4b79e18 Apr 16 16:02:22.544054 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.543002 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-xg7zh"] Apr 16 16:02:22.546707 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.544460 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-679899d9dd-hkjsf"] Apr 16 
16:02:22.565234 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:02:22.565186 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3db670c_92c1_42b9_94ed_04726fe1d07a.slice/crio-9a13f05d717a4d8146f45fc505cad1537d29491ad2c89b9123baad02401ea0f6 WatchSource:0}: Error finding container 9a13f05d717a4d8146f45fc505cad1537d29491ad2c89b9123baad02401ea0f6: Status 404 returned error can't find the container with id 9a13f05d717a4d8146f45fc505cad1537d29491ad2c89b9123baad02401ea0f6 Apr 16 16:02:22.604782 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:02:22.604752 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4e5a694_dbdd_44a6_8234_4845bff8c58c.slice/crio-9f5c950111317a90654293203c24e1f7f6364194528ece936a801423c2f7538a WatchSource:0}: Error finding container 9f5c950111317a90654293203c24e1f7f6364194528ece936a801423c2f7538a: Status 404 returned error can't find the container with id 9f5c950111317a90654293203c24e1f7f6364194528ece936a801423c2f7538a Apr 16 16:02:22.610731 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.610369 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:02:22.793911 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.793751 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-snrt9"] Apr 16 16:02:22.796562 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:02:22.796532 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd280ea9a_de22_4d14_8870_0fbcbb459f8f.slice/crio-b57b4bd08772a6001021744cf066ca44c83ed889ff274eb8af9a51926161d9b7 WatchSource:0}: Error finding container b57b4bd08772a6001021744cf066ca44c83ed889ff274eb8af9a51926161d9b7: Status 404 returned error can't find the container with id 
b57b4bd08772a6001021744cf066ca44c83ed889ff274eb8af9a51926161d9b7 Apr 16 16:02:22.930886 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.930842 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4e5a694-dbdd-44a6-8234-4845bff8c58c","Type":"ContainerStarted","Data":"9f5c950111317a90654293203c24e1f7f6364194528ece936a801423c2f7538a"} Apr 16 16:02:22.932647 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.932613 2584 generic.go:358] "Generic (PLEG): container finished" podID="86c14ee5-7e88-4559-a96d-9147a4e36c13" containerID="65634b57b25e8b1ab000a5976d863780aabb9fdf8f86f6f3bfbcd532339fc435" exitCode=0 Apr 16 16:02:22.932846 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.932699 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jnqnv" event={"ID":"86c14ee5-7e88-4559-a96d-9147a4e36c13","Type":"ContainerDied","Data":"65634b57b25e8b1ab000a5976d863780aabb9fdf8f86f6f3bfbcd532339fc435"} Apr 16 16:02:22.934720 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.934692 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf" event={"ID":"63f686c4-c679-48b9-bd65-7a861737f564","Type":"ContainerStarted","Data":"af1ca116a16048af659048b73339800e5cbd62f43acb2bdbc082ff094164230f"} Apr 16 16:02:22.937639 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.937611 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-gk2jm" event={"ID":"22ab41d5-25ed-43e4-bf00-489eafd172e1","Type":"ContainerStarted","Data":"b620aa2a5b747a8617aad084d07e296a3223937af0440c0d5b26812e42972f59"} Apr 16 16:02:22.937851 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.937827 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-gk2jm" Apr 16 16:02:22.939403 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.939370 2584 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b88fd7674-fhszj" event={"ID":"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9","Type":"ContainerStarted","Data":"b8cf85a83940399b037ef3e655cbf13fcfb5c746d2a60f1ededb721da1f3a290"} Apr 16 16:02:22.941324 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.941299 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s" event={"ID":"90876a88-2791-4518-a822-b1b69a071e6f","Type":"ContainerStarted","Data":"bb07bbfcfbccc8a4b3f9f3ce65d2a923824db8003acd6d9812a9fe0065601071"} Apr 16 16:02:22.941398 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.941332 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s" event={"ID":"90876a88-2791-4518-a822-b1b69a071e6f","Type":"ContainerStarted","Data":"0dcafcdd6f1291d8ca8bab0729529b863015160de3bd83fd63104b0d6b3379b5"} Apr 16 16:02:22.941398 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.941348 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s" event={"ID":"90876a88-2791-4518-a822-b1b69a071e6f","Type":"ContainerStarted","Data":"e7fb103d0702ee21b30f11a09ec85709021c20827034f43922fb0d452e2e9330"} Apr 16 16:02:22.942561 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.942539 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-snrt9" event={"ID":"d280ea9a-de22-4d14-8870-0fbcbb459f8f","Type":"ContainerStarted","Data":"b57b4bd08772a6001021744cf066ca44c83ed889ff274eb8af9a51926161d9b7"} Apr 16 16:02:22.943650 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.943626 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xg7zh" event={"ID":"a3db670c-92c1-42b9-94ed-04726fe1d07a","Type":"ContainerStarted","Data":"9a13f05d717a4d8146f45fc505cad1537d29491ad2c89b9123baad02401ea0f6"} Apr 
16 16:02:22.944749 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.944730 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-gk2jm" Apr 16 16:02:22.944897 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.944874 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-wdmfj" event={"ID":"09ea5b31-549a-496a-9830-4728c4f2ca13","Type":"ContainerStarted","Data":"739ed576a8aa44f87d233eedfa432f9ec5133f3e4ca3e43cfeb4644c191176f6"} Apr 16 16:02:22.946320 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.946292 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x7lk6" event={"ID":"efe37373-e8a6-4016-bbaf-58d22d4d4fcf","Type":"ContainerStarted","Data":"88286d7f77a2893e73b34eef34aaf24e45ac3058e46b3bdb7fdd9053efd8a269"} Apr 16 16:02:22.946411 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.946328 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x7lk6" event={"ID":"efe37373-e8a6-4016-bbaf-58d22d4d4fcf","Type":"ContainerStarted","Data":"2adf35ed5e4bb29fb416f5be9e0b143c6098271e0be5f2450cd23cf3c4b79e18"} Apr 16 16:02:22.975637 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.975583 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-gk2jm" podStartSLOduration=2.7780877779999997 podStartE2EDuration="20.975565573s" podCreationTimestamp="2026-04-16 16:02:02 +0000 UTC" firstStartedPulling="2026-04-16 16:02:04.134555757 +0000 UTC m=+52.094747473" lastFinishedPulling="2026-04-16 16:02:22.33203355 +0000 UTC m=+70.292225268" observedRunningTime="2026-04-16 16:02:22.974343062 +0000 UTC m=+70.934534783" watchObservedRunningTime="2026-04-16 16:02:22.975565573 +0000 UTC m=+70.935757297" Apr 16 16:02:22.995235 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:22.995167 2584 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-qdg8s" podStartSLOduration=1.968551771 podStartE2EDuration="10.995130037s" podCreationTimestamp="2026-04-16 16:02:12 +0000 UTC" firstStartedPulling="2026-04-16 16:02:13.253455638 +0000 UTC m=+61.213647341" lastFinishedPulling="2026-04-16 16:02:22.280033887 +0000 UTC m=+70.240225607" observedRunningTime="2026-04-16 16:02:22.994594346 +0000 UTC m=+70.954786070" watchObservedRunningTime="2026-04-16 16:02:22.995130037 +0000 UTC m=+70.955321765" Apr 16 16:02:23.422946 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:23.422908 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret\") pod \"global-pull-secret-syncer-d48fn\" (UID: \"01732c3c-7221-4102-8699-6e097f947672\") " pod="kube-system/global-pull-secret-syncer-d48fn" Apr 16 16:02:23.425678 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:23.425485 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 16:02:23.436221 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:23.436165 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/01732c3c-7221-4102-8699-6e097f947672-original-pull-secret\") pod \"global-pull-secret-syncer-d48fn\" (UID: \"01732c3c-7221-4102-8699-6e097f947672\") " pod="kube-system/global-pull-secret-syncer-d48fn" Apr 16 16:02:23.616627 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:23.616589 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-d48fn" Apr 16 16:02:23.816560 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:23.816503 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b88fd7674-fhszj" podStartSLOduration=2.460710076 podStartE2EDuration="12.81648634s" podCreationTimestamp="2026-04-16 16:02:11 +0000 UTC" firstStartedPulling="2026-04-16 16:02:11.919322208 +0000 UTC m=+59.879513909" lastFinishedPulling="2026-04-16 16:02:22.275098456 +0000 UTC m=+70.235290173" observedRunningTime="2026-04-16 16:02:23.021594507 +0000 UTC m=+70.981786233" watchObservedRunningTime="2026-04-16 16:02:23.81648634 +0000 UTC m=+71.776678064" Apr 16 16:02:23.816908 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:23.816858 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-d48fn"] Apr 16 16:02:23.885503 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:23.885268 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-9f8c8b94d-kt6mf" Apr 16 16:02:23.957381 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:23.957337 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x7lk6" event={"ID":"efe37373-e8a6-4016-bbaf-58d22d4d4fcf","Type":"ContainerStarted","Data":"938b31cb296689175d778a68dfac671b01a27447bb73baefa89b76af4b3874e2"} Apr 16 16:02:23.965775 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:23.963995 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jnqnv" event={"ID":"86c14ee5-7e88-4559-a96d-9147a4e36c13","Type":"ContainerStarted","Data":"43fb7b21641ddb4ce512f248cff59c71990e80c7270e61b16bd49e284ab854d5"} Apr 16 16:02:23.965775 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:23.964037 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-jnqnv" 
event={"ID":"86c14ee5-7e88-4559-a96d-9147a4e36c13","Type":"ContainerStarted","Data":"dc3b2cf1ac62bb7564a065394e73ec14b30eed097322a7e5d74cead62d54d494"} Apr 16 16:02:23.985051 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:23.984910 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-jnqnv" podStartSLOduration=2.836519372 podStartE2EDuration="11.984862488s" podCreationTimestamp="2026-04-16 16:02:12 +0000 UTC" firstStartedPulling="2026-04-16 16:02:13.120854179 +0000 UTC m=+61.081045895" lastFinishedPulling="2026-04-16 16:02:22.269197309 +0000 UTC m=+70.229389011" observedRunningTime="2026-04-16 16:02:23.984086049 +0000 UTC m=+71.944277772" watchObservedRunningTime="2026-04-16 16:02:23.984862488 +0000 UTC m=+71.945054213" Apr 16 16:02:24.973979 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:24.973193 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-d48fn" event={"ID":"01732c3c-7221-4102-8699-6e097f947672","Type":"ContainerStarted","Data":"f974decbe74a7fd294443df182edf0f34bf7988727da4893c5056cb41df8d682"} Apr 16 16:02:28.989128 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:28.989095 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-wdmfj" event={"ID":"09ea5b31-549a-496a-9830-4728c4f2ca13","Type":"ContainerStarted","Data":"7f49a446ef7c005fbdf5321259cd9a26e9fb7aff15e327a6144e3544a79b4bf8"} Apr 16 16:02:28.990017 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:28.989995 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-wdmfj" Apr 16 16:02:28.996089 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:28.996066 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-wdmfj" Apr 16 16:02:29.009827 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:29.009727 
2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-5876b4bbc7-wdmfj" podStartSLOduration=6.7500707129999995 podStartE2EDuration="13.009710598s" podCreationTimestamp="2026-04-16 16:02:16 +0000 UTC" firstStartedPulling="2026-04-16 16:02:22.515482761 +0000 UTC m=+70.475674465" lastFinishedPulling="2026-04-16 16:02:28.775122644 +0000 UTC m=+76.735314350" observedRunningTime="2026-04-16 16:02:29.008660558 +0000 UTC m=+76.968852283" watchObservedRunningTime="2026-04-16 16:02:29.009710598 +0000 UTC m=+76.969902324" Apr 16 16:02:29.994384 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:29.994346 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-snrt9" event={"ID":"d280ea9a-de22-4d14-8870-0fbcbb459f8f","Type":"ContainerStarted","Data":"7c392888c83ead3b0ab285c0b7ec139bd26e8c46ec826c0cc3fb4934798373ee"} Apr 16 16:02:29.996564 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:29.996517 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4e5a694-dbdd-44a6-8234-4845bff8c58c","Type":"ContainerStarted","Data":"753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b"} Apr 16 16:02:29.998342 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:29.998318 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf" event={"ID":"63f686c4-c679-48b9-bd65-7a861737f564","Type":"ContainerStarted","Data":"6b30e1d5620d5726aff9ff5b7cb2be858c5cff67518f6453f62b9e4478c6d9dc"} Apr 16 16:02:30.463709 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:30.463673 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b88fd7674-fhszj"] Apr 16 16:02:31.009395 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:31.009303 2584 generic.go:358] "Generic (PLEG): container finished" podID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" 
containerID="753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b" exitCode=0 Apr 16 16:02:31.010080 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:31.009441 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4e5a694-dbdd-44a6-8234-4845bff8c58c","Type":"ContainerDied","Data":"753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b"} Apr 16 16:02:31.777043 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:31.777012 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:32.019874 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:32.019191 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf" event={"ID":"63f686c4-c679-48b9-bd65-7a861737f564","Type":"ContainerStarted","Data":"a07fdb93caca0f4b9df491082e6f3b9739eb94c896e1694fd0c10bfab705ca54"} Apr 16 16:02:32.023297 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:32.023274 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-snrt9" event={"ID":"d280ea9a-de22-4d14-8870-0fbcbb459f8f","Type":"ContainerStarted","Data":"f8f0cb46a9994d78f090c637d74b2de7dc8dedb950b01af23d6bd5e0f889722f"} Apr 16 16:02:32.025792 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:32.025736 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xg7zh" event={"ID":"a3db670c-92c1-42b9-94ed-04726fe1d07a","Type":"ContainerStarted","Data":"6a3d101670249c5715091e0c3de8d296619ff2ed9e618a3d4e9c305adf79b106"} Apr 16 16:02:32.026778 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:32.026464 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-xg7zh" Apr 16 16:02:32.040937 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:32.040902 2584 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-snrt9" podStartSLOduration=74.067125326 podStartE2EDuration="1m20.040890351s" podCreationTimestamp="2026-04-16 16:01:12 +0000 UTC" firstStartedPulling="2026-04-16 16:02:22.801248164 +0000 UTC m=+70.761439868" lastFinishedPulling="2026-04-16 16:02:28.775013189 +0000 UTC m=+76.735204893" observedRunningTime="2026-04-16 16:02:32.039100051 +0000 UTC m=+79.999291777" watchObservedRunningTime="2026-04-16 16:02:32.040890351 +0000 UTC m=+80.001082076" Apr 16 16:02:32.054713 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:32.054572 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-xg7zh" podStartSLOduration=70.863387162 podStartE2EDuration="1m20.054561341s" podCreationTimestamp="2026-04-16 16:01:12 +0000 UTC" firstStartedPulling="2026-04-16 16:02:22.568097284 +0000 UTC m=+70.528288999" lastFinishedPulling="2026-04-16 16:02:31.759271476 +0000 UTC m=+79.719463178" observedRunningTime="2026-04-16 16:02:32.053384279 +0000 UTC m=+80.013576037" watchObservedRunningTime="2026-04-16 16:02:32.054561341 +0000 UTC m=+80.014753065" Apr 16 16:02:33.036097 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:33.036056 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-x7lk6" event={"ID":"efe37373-e8a6-4016-bbaf-58d22d4d4fcf","Type":"ContainerStarted","Data":"684309e542972d05d41ea3fdc1dfdfef7c1ecd0164a9f0ec088fc1edb47442c6"} Apr 16 16:02:33.040158 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:33.040131 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf" event={"ID":"63f686c4-c679-48b9-bd65-7a861737f564","Type":"ContainerStarted","Data":"f76c640c320c0b4e24b0328a5a89ef7bddaca2bca427a97f494a09473a82ba16"} Apr 16 16:02:33.042908 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:33.042871 2584 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-d48fn" event={"ID":"01732c3c-7221-4102-8699-6e097f947672","Type":"ContainerStarted","Data":"71a9acbea1455386c8a19bea068c6c0cbecf30a174fd97a7c92323069bc85463"} Apr 16 16:02:33.057617 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:33.057573 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-x7lk6" podStartSLOduration=4.875671832 podStartE2EDuration="14.057561332s" podCreationTimestamp="2026-04-16 16:02:19 +0000 UTC" firstStartedPulling="2026-04-16 16:02:22.633049113 +0000 UTC m=+70.593240822" lastFinishedPulling="2026-04-16 16:02:31.814938603 +0000 UTC m=+79.775130322" observedRunningTime="2026-04-16 16:02:33.056741584 +0000 UTC m=+81.016933311" watchObservedRunningTime="2026-04-16 16:02:33.057561332 +0000 UTC m=+81.017753102" Apr 16 16:02:33.072031 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:33.071066 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-d48fn" podStartSLOduration=66.77322636 podStartE2EDuration="1m14.071052128s" podCreationTimestamp="2026-04-16 16:01:19 +0000 UTC" firstStartedPulling="2026-04-16 16:02:24.510359199 +0000 UTC m=+72.470550915" lastFinishedPulling="2026-04-16 16:02:31.808184976 +0000 UTC m=+79.768376683" observedRunningTime="2026-04-16 16:02:33.069920379 +0000 UTC m=+81.030112113" watchObservedRunningTime="2026-04-16 16:02:33.071052128 +0000 UTC m=+81.031243853" Apr 16 16:02:35.051394 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:35.051347 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4e5a694-dbdd-44a6-8234-4845bff8c58c","Type":"ContainerStarted","Data":"1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46"} Apr 16 16:02:35.051394 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:35.051393 2584 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4e5a694-dbdd-44a6-8234-4845bff8c58c","Type":"ContainerStarted","Data":"95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951"} Apr 16 16:02:35.054885 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:35.054808 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf" event={"ID":"63f686c4-c679-48b9-bd65-7a861737f564","Type":"ContainerStarted","Data":"3c3b0ea122c1b2cddff541fdde2ae2b1fb2cfde51af3ae14e1c810a8d47cb7ae"} Apr 16 16:02:35.054885 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:35.054845 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf" event={"ID":"63f686c4-c679-48b9-bd65-7a861737f564","Type":"ContainerStarted","Data":"efcb037bf7810496c77ba9474556e905daed093fac1fe50b8e697d85bc37f6b0"} Apr 16 16:02:35.054885 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:35.054861 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf" event={"ID":"63f686c4-c679-48b9-bd65-7a861737f564","Type":"ContainerStarted","Data":"bcd4c709296770f666638c025f465a13d65ef1abd3f7903f93e92354642672c8"} Apr 16 16:02:35.055842 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:35.055815 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf" Apr 16 16:02:35.089068 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:35.089008 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf" podStartSLOduration=9.234059816 podStartE2EDuration="21.088990519s" podCreationTimestamp="2026-04-16 16:02:14 +0000 UTC" firstStartedPulling="2026-04-16 16:02:22.561644836 +0000 UTC m=+70.521836552" lastFinishedPulling="2026-04-16 16:02:34.416575553 +0000 UTC m=+82.376767255" observedRunningTime="2026-04-16 
16:02:35.087405654 +0000 UTC m=+83.047597391" watchObservedRunningTime="2026-04-16 16:02:35.088990519 +0000 UTC m=+83.049182246" Apr 16 16:02:36.061315 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:36.061269 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4e5a694-dbdd-44a6-8234-4845bff8c58c","Type":"ContainerStarted","Data":"cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82"} Apr 16 16:02:36.061315 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:36.061316 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4e5a694-dbdd-44a6-8234-4845bff8c58c","Type":"ContainerStarted","Data":"43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259"} Apr 16 16:02:36.062205 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:36.061327 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4e5a694-dbdd-44a6-8234-4845bff8c58c","Type":"ContainerStarted","Data":"98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1"} Apr 16 16:02:36.062205 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:36.061337 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4e5a694-dbdd-44a6-8234-4845bff8c58c","Type":"ContainerStarted","Data":"c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e"} Apr 16 16:02:36.067350 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:36.067328 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-679899d9dd-hkjsf" Apr 16 16:02:36.091303 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:36.091260 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=5.899783939 podStartE2EDuration="18.09124411s" podCreationTimestamp="2026-04-16 16:02:18 +0000 UTC" 
firstStartedPulling="2026-04-16 16:02:22.607545814 +0000 UTC m=+70.567737518" lastFinishedPulling="2026-04-16 16:02:34.799005967 +0000 UTC m=+82.759197689" observedRunningTime="2026-04-16 16:02:36.089070772 +0000 UTC m=+84.049262521" watchObservedRunningTime="2026-04-16 16:02:36.09124411 +0000 UTC m=+84.051435830" Apr 16 16:02:38.679550 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:38.679517 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:02:55.488768 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:55.488722 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-b88fd7674-fhszj" podUID="0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9" containerName="console" containerID="cri-o://b8cf85a83940399b037ef3e655cbf13fcfb5c746d2a60f1ededb721da1f3a290" gracePeriod=15 Apr 16 16:02:55.723431 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:55.723408 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b88fd7674-fhszj_0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9/console/0.log" Apr 16 16:02:55.723548 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:55.723477 2584 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:55.882838 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:55.882768 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-console-oauth-config\") pod \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " Apr 16 16:02:55.882838 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:55.882819 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-oauth-serving-cert\") pod \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " Apr 16 16:02:55.883021 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:55.883000 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-service-ca\") pod \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " Apr 16 16:02:55.883076 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:55.883060 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-console-config\") pod \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " Apr 16 16:02:55.883122 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:55.883092 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g45ws\" (UniqueName: \"kubernetes.io/projected/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-kube-api-access-g45ws\") pod \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " Apr 16 16:02:55.883122 ip-10-0-131-24 
kubenswrapper[2584]: I0416 16:02:55.883114 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-console-serving-cert\") pod \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\" (UID: \"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9\") " Apr 16 16:02:55.883271 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:55.883242 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9" (UID: "0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:02:55.883407 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:55.883368 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-service-ca" (OuterVolumeSpecName: "service-ca") pod "0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9" (UID: "0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:02:55.883474 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:55.883436 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-console-config" (OuterVolumeSpecName: "console-config") pod "0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9" (UID: "0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:02:55.885311 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:55.885286 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9" (UID: "0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:02:55.885427 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:55.885406 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9" (UID: "0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:02:55.885548 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:55.885530 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-kube-api-access-g45ws" (OuterVolumeSpecName: "kube-api-access-g45ws") pod "0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9" (UID: "0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9"). InnerVolumeSpecName "kube-api-access-g45ws". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:02:55.983963 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:55.983938 2584 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-service-ca\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:02:55.984045 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:55.983977 2584 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-console-config\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:02:55.984045 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:55.983987 2584 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-g45ws\" (UniqueName: \"kubernetes.io/projected/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-kube-api-access-g45ws\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:02:55.984045 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:55.983997 2584 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-console-serving-cert\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:02:55.984045 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:55.984008 2584 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-console-oauth-config\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:02:55.984045 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:55.984016 2584 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9-oauth-serving-cert\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:02:56.117612 ip-10-0-131-24 
kubenswrapper[2584]: I0416 16:02:56.117592 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-b88fd7674-fhszj_0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9/console/0.log" Apr 16 16:02:56.117705 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:56.117627 2584 generic.go:358] "Generic (PLEG): container finished" podID="0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9" containerID="b8cf85a83940399b037ef3e655cbf13fcfb5c746d2a60f1ededb721da1f3a290" exitCode=2 Apr 16 16:02:56.117705 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:56.117686 2584 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b88fd7674-fhszj" Apr 16 16:02:56.117705 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:56.117700 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b88fd7674-fhszj" event={"ID":"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9","Type":"ContainerDied","Data":"b8cf85a83940399b037ef3e655cbf13fcfb5c746d2a60f1ededb721da1f3a290"} Apr 16 16:02:56.117806 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:56.117727 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b88fd7674-fhszj" event={"ID":"0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9","Type":"ContainerDied","Data":"1ce0737ac8a2ed531f407f575baa369f3ec9099e55a2cc35e6bd88c0d15003c6"} Apr 16 16:02:56.117806 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:56.117742 2584 scope.go:117] "RemoveContainer" containerID="b8cf85a83940399b037ef3e655cbf13fcfb5c746d2a60f1ededb721da1f3a290" Apr 16 16:02:56.130475 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:56.130459 2584 scope.go:117] "RemoveContainer" containerID="b8cf85a83940399b037ef3e655cbf13fcfb5c746d2a60f1ededb721da1f3a290" Apr 16 16:02:56.130756 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:02:56.130738 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b8cf85a83940399b037ef3e655cbf13fcfb5c746d2a60f1ededb721da1f3a290\": container with ID starting with b8cf85a83940399b037ef3e655cbf13fcfb5c746d2a60f1ededb721da1f3a290 not found: ID does not exist" containerID="b8cf85a83940399b037ef3e655cbf13fcfb5c746d2a60f1ededb721da1f3a290" Apr 16 16:02:56.130806 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:56.130764 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8cf85a83940399b037ef3e655cbf13fcfb5c746d2a60f1ededb721da1f3a290"} err="failed to get container status \"b8cf85a83940399b037ef3e655cbf13fcfb5c746d2a60f1ededb721da1f3a290\": rpc error: code = NotFound desc = could not find container \"b8cf85a83940399b037ef3e655cbf13fcfb5c746d2a60f1ededb721da1f3a290\": container with ID starting with b8cf85a83940399b037ef3e655cbf13fcfb5c746d2a60f1ededb721da1f3a290 not found: ID does not exist" Apr 16 16:02:56.145529 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:56.145476 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b88fd7674-fhszj"] Apr 16 16:02:56.148276 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:56.148258 2584 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-b88fd7674-fhszj"] Apr 16 16:02:56.698885 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:02:56.698846 2584 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9" path="/var/lib/kubelet/pods/0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9/volumes" Apr 16 16:03:04.047731 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:04.047689 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xg7zh" Apr 16 16:03:18.679480 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:18.679444 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:18.701208 ip-10-0-131-24 
kubenswrapper[2584]: I0416 16:03:18.701183 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:19.198348 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:19.198325 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:36.819663 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:36.819621 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:03:36.820296 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:36.820241 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="prometheus" containerID="cri-o://95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951" gracePeriod=600 Apr 16 16:03:36.820382 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:36.820261 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="kube-rbac-proxy" containerID="cri-o://43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259" gracePeriod=600 Apr 16 16:03:36.820382 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:36.820284 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="kube-rbac-proxy-thanos" containerID="cri-o://cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82" gracePeriod=600 Apr 16 16:03:36.820493 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:36.820380 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="kube-rbac-proxy-web" 
containerID="cri-o://98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1" gracePeriod=600 Apr 16 16:03:36.820493 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:36.820394 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="thanos-sidecar" containerID="cri-o://c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e" gracePeriod=600 Apr 16 16:03:36.820493 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:36.820414 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="config-reloader" containerID="cri-o://1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46" gracePeriod=600 Apr 16 16:03:37.056116 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.056095 2584 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.162039 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.161974 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-prometheus-k8s-rulefiles-0\") pod \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " Apr 16 16:03:37.162039 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.162018 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-metrics-client-certs\") pod \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " Apr 16 16:03:37.162039 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.162036 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-config\") pod \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " Apr 16 16:03:37.162285 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.162057 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e4e5a694-dbdd-44a6-8234-4845bff8c58c-tls-assets\") pod \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " Apr 16 16:03:37.162285 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.162079 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " 
Apr 16 16:03:37.162285 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.162104 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-kube-rbac-proxy\") pod \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " Apr 16 16:03:37.162285 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.162119 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-grpc-tls\") pod \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " Apr 16 16:03:37.162285 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.162148 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-prometheus-trusted-ca-bundle\") pod \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " Apr 16 16:03:37.162285 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.162183 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e4e5a694-dbdd-44a6-8234-4845bff8c58c-config-out\") pod \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " Apr 16 16:03:37.162285 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.162211 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-web-config\") pod \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " Apr 16 16:03:37.162285 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.162242 2584 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e4e5a694-dbdd-44a6-8234-4845bff8c58c-prometheus-k8s-db\") pod \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " Apr 16 16:03:37.162285 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.162279 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-configmap-kubelet-serving-ca-bundle\") pod \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " Apr 16 16:03:37.162874 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.162322 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-configmap-serving-certs-ca-bundle\") pod \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " Apr 16 16:03:37.162874 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.162351 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-configmap-metrics-client-ca\") pod \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " Apr 16 16:03:37.162874 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.162384 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-thanos-prometheus-http-client-file\") pod \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " Apr 16 16:03:37.162874 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.162423 2584 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-prometheus-k8s-tls\") pod \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " Apr 16 16:03:37.162874 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.162450 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kclvc\" (UniqueName: \"kubernetes.io/projected/e4e5a694-dbdd-44a6-8234-4845bff8c58c-kube-api-access-kclvc\") pod \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " Apr 16 16:03:37.162874 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.162481 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\" (UID: \"e4e5a694-dbdd-44a6-8234-4845bff8c58c\") " Apr 16 16:03:37.163696 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.163473 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "e4e5a694-dbdd-44a6-8234-4845bff8c58c" (UID: "e4e5a694-dbdd-44a6-8234-4845bff8c58c"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:03:37.164203 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.163938 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4e5a694-dbdd-44a6-8234-4845bff8c58c-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "e4e5a694-dbdd-44a6-8234-4845bff8c58c" (UID: "e4e5a694-dbdd-44a6-8234-4845bff8c58c"). 
InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:03:37.164203 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.164057 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "e4e5a694-dbdd-44a6-8234-4845bff8c58c" (UID: "e4e5a694-dbdd-44a6-8234-4845bff8c58c"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:03:37.164822 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.164541 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "e4e5a694-dbdd-44a6-8234-4845bff8c58c" (UID: "e4e5a694-dbdd-44a6-8234-4845bff8c58c"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:03:37.165215 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.164930 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "e4e5a694-dbdd-44a6-8234-4845bff8c58c" (UID: "e4e5a694-dbdd-44a6-8234-4845bff8c58c"). InnerVolumeSpecName "configmap-metrics-client-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:03:37.165215 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.165107 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "e4e5a694-dbdd-44a6-8234-4845bff8c58c" (UID: "e4e5a694-dbdd-44a6-8234-4845bff8c58c"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 16:03:37.165742 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.165642 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "e4e5a694-dbdd-44a6-8234-4845bff8c58c" (UID: "e4e5a694-dbdd-44a6-8234-4845bff8c58c"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:03:37.165742 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.165720 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-config" (OuterVolumeSpecName: "config") pod "e4e5a694-dbdd-44a6-8234-4845bff8c58c" (UID: "e4e5a694-dbdd-44a6-8234-4845bff8c58c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:03:37.166124 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.165820 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "e4e5a694-dbdd-44a6-8234-4845bff8c58c" (UID: "e4e5a694-dbdd-44a6-8234-4845bff8c58c"). 
InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:03:37.166124 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.165898 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "e4e5a694-dbdd-44a6-8234-4845bff8c58c" (UID: "e4e5a694-dbdd-44a6-8234-4845bff8c58c"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:03:37.166124 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.165945 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4e5a694-dbdd-44a6-8234-4845bff8c58c-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "e4e5a694-dbdd-44a6-8234-4845bff8c58c" (UID: "e4e5a694-dbdd-44a6-8234-4845bff8c58c"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:03:37.166739 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.166711 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "e4e5a694-dbdd-44a6-8234-4845bff8c58c" (UID: "e4e5a694-dbdd-44a6-8234-4845bff8c58c"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:03:37.166833 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.166781 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "e4e5a694-dbdd-44a6-8234-4845bff8c58c" (UID: "e4e5a694-dbdd-44a6-8234-4845bff8c58c"). InnerVolumeSpecName "secret-grpc-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:03:37.167360 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.167343 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4e5a694-dbdd-44a6-8234-4845bff8c58c-config-out" (OuterVolumeSpecName: "config-out") pod "e4e5a694-dbdd-44a6-8234-4845bff8c58c" (UID: "e4e5a694-dbdd-44a6-8234-4845bff8c58c"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:03:37.167562 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.167545 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "e4e5a694-dbdd-44a6-8234-4845bff8c58c" (UID: "e4e5a694-dbdd-44a6-8234-4845bff8c58c"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:03:37.167916 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.167901 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4e5a694-dbdd-44a6-8234-4845bff8c58c-kube-api-access-kclvc" (OuterVolumeSpecName: "kube-api-access-kclvc") pod "e4e5a694-dbdd-44a6-8234-4845bff8c58c" (UID: "e4e5a694-dbdd-44a6-8234-4845bff8c58c"). InnerVolumeSpecName "kube-api-access-kclvc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:03:37.168503 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.168481 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "e4e5a694-dbdd-44a6-8234-4845bff8c58c" (UID: "e4e5a694-dbdd-44a6-8234-4845bff8c58c"). InnerVolumeSpecName "secret-prometheus-k8s-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:03:37.177829 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.177810 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-web-config" (OuterVolumeSpecName: "web-config") pod "e4e5a694-dbdd-44a6-8234-4845bff8c58c" (UID: "e4e5a694-dbdd-44a6-8234-4845bff8c58c"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:03:37.238475 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.238450 2584 generic.go:358] "Generic (PLEG): container finished" podID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerID="cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82" exitCode=0 Apr 16 16:03:37.238475 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.238470 2584 generic.go:358] "Generic (PLEG): container finished" podID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerID="43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259" exitCode=0 Apr 16 16:03:37.238475 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.238476 2584 generic.go:358] "Generic (PLEG): container finished" podID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerID="98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1" exitCode=0 Apr 16 16:03:37.238617 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.238483 2584 generic.go:358] "Generic (PLEG): container finished" podID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerID="c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e" exitCode=0 Apr 16 16:03:37.238617 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.238489 2584 generic.go:358] "Generic (PLEG): container finished" podID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerID="1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46" exitCode=0 Apr 16 16:03:37.238617 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.238493 2584 generic.go:358] "Generic (PLEG): container 
finished" podID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerID="95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951" exitCode=0 Apr 16 16:03:37.238617 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.238570 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4e5a694-dbdd-44a6-8234-4845bff8c58c","Type":"ContainerDied","Data":"cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82"} Apr 16 16:03:37.238617 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.238601 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4e5a694-dbdd-44a6-8234-4845bff8c58c","Type":"ContainerDied","Data":"43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259"} Apr 16 16:03:37.238617 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.238616 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4e5a694-dbdd-44a6-8234-4845bff8c58c","Type":"ContainerDied","Data":"98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1"} Apr 16 16:03:37.238796 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.238617 2584 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.238796 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.238625 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4e5a694-dbdd-44a6-8234-4845bff8c58c","Type":"ContainerDied","Data":"c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e"} Apr 16 16:03:37.238796 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.238634 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4e5a694-dbdd-44a6-8234-4845bff8c58c","Type":"ContainerDied","Data":"1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46"} Apr 16 16:03:37.238796 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.238643 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4e5a694-dbdd-44a6-8234-4845bff8c58c","Type":"ContainerDied","Data":"95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951"} Apr 16 16:03:37.238796 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.238652 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"e4e5a694-dbdd-44a6-8234-4845bff8c58c","Type":"ContainerDied","Data":"9f5c950111317a90654293203c24e1f7f6364194528ece936a801423c2f7538a"} Apr 16 16:03:37.238796 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.238667 2584 scope.go:117] "RemoveContainer" containerID="cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82" Apr 16 16:03:37.246137 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.246118 2584 scope.go:117] "RemoveContainer" containerID="43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259" Apr 16 16:03:37.253119 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.253103 2584 scope.go:117] "RemoveContainer" containerID="98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1" Apr 16 16:03:37.259003 
ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.258939 2584 scope.go:117] "RemoveContainer" containerID="c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e" Apr 16 16:03:37.263240 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.263204 2584 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-metrics-client-certs\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:03:37.263240 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.263231 2584 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-config\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:03:37.263431 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.263244 2584 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/e4e5a694-dbdd-44a6-8234-4845bff8c58c-tls-assets\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:03:37.263431 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.263258 2584 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:03:37.263431 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.263272 2584 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-kube-rbac-proxy\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:03:37.263431 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.263285 2584 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-grpc-tls\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:03:37.263431 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.263299 2584 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-prometheus-trusted-ca-bundle\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:03:37.263431 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.263312 2584 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/e4e5a694-dbdd-44a6-8234-4845bff8c58c-config-out\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:03:37.263431 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.263326 2584 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-web-config\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:03:37.263431 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.263334 2584 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/e4e5a694-dbdd-44a6-8234-4845bff8c58c-prometheus-k8s-db\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:03:37.263431 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.263345 2584 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:03:37.263431 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.263361 2584 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:03:37.263431 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.263375 2584 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-configmap-metrics-client-ca\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:03:37.263431 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.263389 2584 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-thanos-prometheus-http-client-file\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:03:37.263431 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.263402 2584 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-prometheus-k8s-tls\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:03:37.263431 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.263411 2584 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kclvc\" (UniqueName: \"kubernetes.io/projected/e4e5a694-dbdd-44a6-8234-4845bff8c58c-kube-api-access-kclvc\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:03:37.263431 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.263420 2584 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/e4e5a694-dbdd-44a6-8234-4845bff8c58c-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:03:37.263431 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.263428 2584 reconciler_common.go:299] "Volume detached for volume 
\"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/e4e5a694-dbdd-44a6-8234-4845bff8c58c-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:03:37.264853 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.264831 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:03:37.265747 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.265719 2584 scope.go:117] "RemoveContainer" containerID="1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46" Apr 16 16:03:37.271352 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.271333 2584 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:03:37.272438 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.272425 2584 scope.go:117] "RemoveContainer" containerID="95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951" Apr 16 16:03:37.278795 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.278782 2584 scope.go:117] "RemoveContainer" containerID="753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b" Apr 16 16:03:37.284565 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.284551 2584 scope.go:117] "RemoveContainer" containerID="cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82" Apr 16 16:03:37.284805 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:03:37.284791 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82\": container with ID starting with cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82 not found: ID does not exist" containerID="cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82" Apr 16 16:03:37.284848 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.284813 2584 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82"} err="failed to get container status \"cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82\": rpc error: code = NotFound desc = could not find container \"cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82\": container with ID starting with cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82 not found: ID does not exist" Apr 16 16:03:37.284848 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.284837 2584 scope.go:117] "RemoveContainer" containerID="43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259" Apr 16 16:03:37.285165 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:03:37.285148 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259\": container with ID starting with 43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259 not found: ID does not exist" containerID="43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259" Apr 16 16:03:37.285213 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.285170 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259"} err="failed to get container status \"43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259\": rpc error: code = NotFound desc = could not find container \"43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259\": container with ID starting with 43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259 not found: ID does not exist" Apr 16 16:03:37.285213 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.285187 2584 scope.go:117] "RemoveContainer" containerID="98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1" Apr 16 16:03:37.285436 ip-10-0-131-24 
kubenswrapper[2584]: E0416 16:03:37.285417 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1\": container with ID starting with 98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1 not found: ID does not exist" containerID="98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1" Apr 16 16:03:37.285484 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.285443 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1"} err="failed to get container status \"98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1\": rpc error: code = NotFound desc = could not find container \"98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1\": container with ID starting with 98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1 not found: ID does not exist" Apr 16 16:03:37.285484 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.285458 2584 scope.go:117] "RemoveContainer" containerID="c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e" Apr 16 16:03:37.285681 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:03:37.285667 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e\": container with ID starting with c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e not found: ID does not exist" containerID="c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e" Apr 16 16:03:37.285726 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.285684 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e"} err="failed to 
get container status \"c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e\": rpc error: code = NotFound desc = could not find container \"c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e\": container with ID starting with c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e not found: ID does not exist" Apr 16 16:03:37.285726 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.285699 2584 scope.go:117] "RemoveContainer" containerID="1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46" Apr 16 16:03:37.285926 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:03:37.285912 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46\": container with ID starting with 1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46 not found: ID does not exist" containerID="1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46" Apr 16 16:03:37.285991 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.285930 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46"} err="failed to get container status \"1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46\": rpc error: code = NotFound desc = could not find container \"1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46\": container with ID starting with 1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46 not found: ID does not exist" Apr 16 16:03:37.285991 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.285944 2584 scope.go:117] "RemoveContainer" containerID="95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951" Apr 16 16:03:37.286223 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:03:37.286206 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951\": container with ID starting with 95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951 not found: ID does not exist" containerID="95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951" Apr 16 16:03:37.286269 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.286228 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951"} err="failed to get container status \"95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951\": rpc error: code = NotFound desc = could not find container \"95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951\": container with ID starting with 95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951 not found: ID does not exist" Apr 16 16:03:37.286269 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.286242 2584 scope.go:117] "RemoveContainer" containerID="753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b" Apr 16 16:03:37.286484 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:03:37.286469 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b\": container with ID starting with 753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b not found: ID does not exist" containerID="753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b" Apr 16 16:03:37.286521 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.286488 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b"} err="failed to get container status \"753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b\": rpc error: code = NotFound desc = 
could not find container \"753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b\": container with ID starting with 753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b not found: ID does not exist" Apr 16 16:03:37.286521 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.286501 2584 scope.go:117] "RemoveContainer" containerID="cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82" Apr 16 16:03:37.286699 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.286679 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82"} err="failed to get container status \"cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82\": rpc error: code = NotFound desc = could not find container \"cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82\": container with ID starting with cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82 not found: ID does not exist" Apr 16 16:03:37.286767 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.286702 2584 scope.go:117] "RemoveContainer" containerID="43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259" Apr 16 16:03:37.286926 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.286908 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259"} err="failed to get container status \"43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259\": rpc error: code = NotFound desc = could not find container \"43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259\": container with ID starting with 43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259 not found: ID does not exist" Apr 16 16:03:37.286985 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.286927 2584 scope.go:117] "RemoveContainer" 
containerID="98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1" Apr 16 16:03:37.287150 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.287133 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1"} err="failed to get container status \"98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1\": rpc error: code = NotFound desc = could not find container \"98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1\": container with ID starting with 98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1 not found: ID does not exist" Apr 16 16:03:37.287194 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.287150 2584 scope.go:117] "RemoveContainer" containerID="c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e" Apr 16 16:03:37.287326 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.287307 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e"} err="failed to get container status \"c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e\": rpc error: code = NotFound desc = could not find container \"c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e\": container with ID starting with c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e not found: ID does not exist" Apr 16 16:03:37.287391 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.287327 2584 scope.go:117] "RemoveContainer" containerID="1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46" Apr 16 16:03:37.287544 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.287527 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46"} err="failed to get container status 
\"1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46\": rpc error: code = NotFound desc = could not find container \"1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46\": container with ID starting with 1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46 not found: ID does not exist" Apr 16 16:03:37.287588 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.287546 2584 scope.go:117] "RemoveContainer" containerID="95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951" Apr 16 16:03:37.287743 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.287726 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951"} err="failed to get container status \"95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951\": rpc error: code = NotFound desc = could not find container \"95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951\": container with ID starting with 95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951 not found: ID does not exist" Apr 16 16:03:37.287786 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.287743 2584 scope.go:117] "RemoveContainer" containerID="753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b" Apr 16 16:03:37.287993 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.287976 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b"} err="failed to get container status \"753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b\": rpc error: code = NotFound desc = could not find container \"753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b\": container with ID starting with 753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b not found: ID does not exist" Apr 16 16:03:37.288063 ip-10-0-131-24 
kubenswrapper[2584]: I0416 16:03:37.287994 2584 scope.go:117] "RemoveContainer" containerID="cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82" Apr 16 16:03:37.288203 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.288184 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82"} err="failed to get container status \"cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82\": rpc error: code = NotFound desc = could not find container \"cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82\": container with ID starting with cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82 not found: ID does not exist" Apr 16 16:03:37.288274 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.288203 2584 scope.go:117] "RemoveContainer" containerID="43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259" Apr 16 16:03:37.288451 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.288431 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259"} err="failed to get container status \"43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259\": rpc error: code = NotFound desc = could not find container \"43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259\": container with ID starting with 43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259 not found: ID does not exist" Apr 16 16:03:37.288553 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.288452 2584 scope.go:117] "RemoveContainer" containerID="98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1" Apr 16 16:03:37.288728 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.288708 2584 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1"} err="failed to get container status \"98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1\": rpc error: code = NotFound desc = could not find container \"98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1\": container with ID starting with 98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1 not found: ID does not exist" Apr 16 16:03:37.288728 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.288726 2584 scope.go:117] "RemoveContainer" containerID="c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e" Apr 16 16:03:37.289015 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.288996 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e"} err="failed to get container status \"c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e\": rpc error: code = NotFound desc = could not find container \"c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e\": container with ID starting with c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e not found: ID does not exist" Apr 16 16:03:37.289076 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.289016 2584 scope.go:117] "RemoveContainer" containerID="1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46" Apr 16 16:03:37.289263 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.289246 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46"} err="failed to get container status \"1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46\": rpc error: code = NotFound desc = could not find container \"1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46\": container with ID starting with 
1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46 not found: ID does not exist" Apr 16 16:03:37.289303 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.289265 2584 scope.go:117] "RemoveContainer" containerID="95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951" Apr 16 16:03:37.289463 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.289446 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951"} err="failed to get container status \"95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951\": rpc error: code = NotFound desc = could not find container \"95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951\": container with ID starting with 95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951 not found: ID does not exist" Apr 16 16:03:37.289496 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.289464 2584 scope.go:117] "RemoveContainer" containerID="753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b" Apr 16 16:03:37.289659 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.289636 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b"} err="failed to get container status \"753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b\": rpc error: code = NotFound desc = could not find container \"753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b\": container with ID starting with 753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b not found: ID does not exist" Apr 16 16:03:37.289724 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.289662 2584 scope.go:117] "RemoveContainer" containerID="cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82" Apr 16 16:03:37.289897 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.289881 2584 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82"} err="failed to get container status \"cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82\": rpc error: code = NotFound desc = could not find container \"cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82\": container with ID starting with cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82 not found: ID does not exist" Apr 16 16:03:37.289941 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.289899 2584 scope.go:117] "RemoveContainer" containerID="43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259" Apr 16 16:03:37.290109 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.290090 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259"} err="failed to get container status \"43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259\": rpc error: code = NotFound desc = could not find container \"43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259\": container with ID starting with 43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259 not found: ID does not exist" Apr 16 16:03:37.290151 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.290110 2584 scope.go:117] "RemoveContainer" containerID="98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1" Apr 16 16:03:37.290320 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.290304 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1"} err="failed to get container status \"98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1\": rpc error: code = NotFound desc = could not find container 
\"98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1\": container with ID starting with 98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1 not found: ID does not exist" Apr 16 16:03:37.290360 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.290320 2584 scope.go:117] "RemoveContainer" containerID="c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e" Apr 16 16:03:37.290498 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.290482 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e"} err="failed to get container status \"c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e\": rpc error: code = NotFound desc = could not find container \"c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e\": container with ID starting with c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e not found: ID does not exist" Apr 16 16:03:37.290544 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.290498 2584 scope.go:117] "RemoveContainer" containerID="1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46" Apr 16 16:03:37.290689 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.290673 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46"} err="failed to get container status \"1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46\": rpc error: code = NotFound desc = could not find container \"1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46\": container with ID starting with 1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46 not found: ID does not exist" Apr 16 16:03:37.290732 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.290691 2584 scope.go:117] "RemoveContainer" 
containerID="95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951" Apr 16 16:03:37.290886 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.290867 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951"} err="failed to get container status \"95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951\": rpc error: code = NotFound desc = could not find container \"95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951\": container with ID starting with 95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951 not found: ID does not exist" Apr 16 16:03:37.290975 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.290888 2584 scope.go:117] "RemoveContainer" containerID="753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b" Apr 16 16:03:37.291104 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.291083 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b"} err="failed to get container status \"753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b\": rpc error: code = NotFound desc = could not find container \"753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b\": container with ID starting with 753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b not found: ID does not exist" Apr 16 16:03:37.291149 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.291107 2584 scope.go:117] "RemoveContainer" containerID="cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82" Apr 16 16:03:37.291285 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.291271 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82"} err="failed to get container status 
\"cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82\": rpc error: code = NotFound desc = could not find container \"cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82\": container with ID starting with cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82 not found: ID does not exist" Apr 16 16:03:37.291327 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.291286 2584 scope.go:117] "RemoveContainer" containerID="43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259" Apr 16 16:03:37.291502 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.291482 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259"} err="failed to get container status \"43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259\": rpc error: code = NotFound desc = could not find container \"43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259\": container with ID starting with 43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259 not found: ID does not exist" Apr 16 16:03:37.291542 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.291505 2584 scope.go:117] "RemoveContainer" containerID="98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1" Apr 16 16:03:37.291710 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.291693 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1"} err="failed to get container status \"98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1\": rpc error: code = NotFound desc = could not find container \"98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1\": container with ID starting with 98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1 not found: ID does not exist" Apr 16 16:03:37.291750 ip-10-0-131-24 
kubenswrapper[2584]: I0416 16:03:37.291713 2584 scope.go:117] "RemoveContainer" containerID="c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e" Apr 16 16:03:37.291897 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.291881 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e"} err="failed to get container status \"c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e\": rpc error: code = NotFound desc = could not find container \"c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e\": container with ID starting with c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e not found: ID does not exist" Apr 16 16:03:37.291936 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.291898 2584 scope.go:117] "RemoveContainer" containerID="1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46" Apr 16 16:03:37.292091 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.292075 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46"} err="failed to get container status \"1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46\": rpc error: code = NotFound desc = could not find container \"1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46\": container with ID starting with 1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46 not found: ID does not exist" Apr 16 16:03:37.292146 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.292092 2584 scope.go:117] "RemoveContainer" containerID="95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951" Apr 16 16:03:37.292270 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.292254 2584 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951"} err="failed to get container status \"95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951\": rpc error: code = NotFound desc = could not find container \"95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951\": container with ID starting with 95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951 not found: ID does not exist" Apr 16 16:03:37.292308 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.292270 2584 scope.go:117] "RemoveContainer" containerID="753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b" Apr 16 16:03:37.292428 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.292414 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b"} err="failed to get container status \"753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b\": rpc error: code = NotFound desc = could not find container \"753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b\": container with ID starting with 753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b not found: ID does not exist" Apr 16 16:03:37.292471 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.292427 2584 scope.go:117] "RemoveContainer" containerID="cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82" Apr 16 16:03:37.292623 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.292609 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82"} err="failed to get container status \"cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82\": rpc error: code = NotFound desc = could not find container \"cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82\": container with ID starting with 
cb931fd858314ff8116708e115c2e9952f1f4987d33400557a8d3ae343b20c82 not found: ID does not exist" Apr 16 16:03:37.292671 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.292623 2584 scope.go:117] "RemoveContainer" containerID="43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259" Apr 16 16:03:37.292806 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.292788 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259"} err="failed to get container status \"43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259\": rpc error: code = NotFound desc = could not find container \"43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259\": container with ID starting with 43d34500976fb88f25277a577672ed2e98cbae68e12955035cfb55681d504259 not found: ID does not exist" Apr 16 16:03:37.296649 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.292807 2584 scope.go:117] "RemoveContainer" containerID="98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1" Apr 16 16:03:37.297170 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.297141 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1"} err="failed to get container status \"98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1\": rpc error: code = NotFound desc = could not find container \"98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1\": container with ID starting with 98ef6e7ce026db8d4eda6062b295b420581e83b556f12bbd5fc19ce0e5b81fd1 not found: ID does not exist" Apr 16 16:03:37.297265 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.297172 2584 scope.go:117] "RemoveContainer" containerID="c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e" Apr 16 16:03:37.297456 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.297427 2584 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e"} err="failed to get container status \"c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e\": rpc error: code = NotFound desc = could not find container \"c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e\": container with ID starting with c0d59a712962e66fd169c370b87846123d6952481d945cff40efbc8055b2e37e not found: ID does not exist" Apr 16 16:03:37.297456 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.297453 2584 scope.go:117] "RemoveContainer" containerID="1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46" Apr 16 16:03:37.297753 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.297728 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46"} err="failed to get container status \"1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46\": rpc error: code = NotFound desc = could not find container \"1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46\": container with ID starting with 1bdd38d911edc8d2c679ede84019a1b3e96eb8fce185e27c6daa3776fb790f46 not found: ID does not exist" Apr 16 16:03:37.297753 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.297753 2584 scope.go:117] "RemoveContainer" containerID="95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951" Apr 16 16:03:37.298191 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.298166 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951"} err="failed to get container status \"95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951\": rpc error: code = NotFound desc = could not find container 
\"95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951\": container with ID starting with 95c82eb06f278378206498ce79cf74a37606985cd0c39e5b5fada3178184e951 not found: ID does not exist" Apr 16 16:03:37.298286 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.298194 2584 scope.go:117] "RemoveContainer" containerID="753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b" Apr 16 16:03:37.298607 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.298583 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b"} err="failed to get container status \"753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b\": rpc error: code = NotFound desc = could not find container \"753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b\": container with ID starting with 753307e7d410f1ccf4b130182a81bd946762d009efe447df7e7903dd5661d41b not found: ID does not exist" Apr 16 16:03:37.299731 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.299714 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:03:37.300028 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300014 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="init-config-reloader" Apr 16 16:03:37.300064 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300031 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="init-config-reloader" Apr 16 16:03:37.300064 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300041 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9" containerName="console" Apr 16 16:03:37.300064 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300047 2584 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9" containerName="console" Apr 16 16:03:37.300064 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300054 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="thanos-sidecar" Apr 16 16:03:37.300064 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300059 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="thanos-sidecar" Apr 16 16:03:37.300256 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300070 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="prometheus" Apr 16 16:03:37.300256 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300076 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="prometheus" Apr 16 16:03:37.300256 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300085 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="config-reloader" Apr 16 16:03:37.300256 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300090 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="config-reloader" Apr 16 16:03:37.300256 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300097 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="kube-rbac-proxy-web" Apr 16 16:03:37.300256 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300101 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="kube-rbac-proxy-web" Apr 16 16:03:37.300256 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300106 2584 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="kube-rbac-proxy-thanos" Apr 16 16:03:37.300256 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300111 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="kube-rbac-proxy-thanos" Apr 16 16:03:37.300256 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300118 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="kube-rbac-proxy" Apr 16 16:03:37.300256 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300123 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="kube-rbac-proxy" Apr 16 16:03:37.300256 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300161 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="thanos-sidecar" Apr 16 16:03:37.300256 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300167 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="kube-rbac-proxy-thanos" Apr 16 16:03:37.300256 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300174 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="kube-rbac-proxy" Apr 16 16:03:37.300256 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300181 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d3e0abd-e178-44c5-9a7d-f7d9ba1954f9" containerName="console" Apr 16 16:03:37.300256 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300186 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="prometheus" Apr 16 16:03:37.300256 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300192 2584 memory_manager.go:356] 
"RemoveStaleState removing state" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="config-reloader" Apr 16 16:03:37.300256 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.300198 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" containerName="kube-rbac-proxy-web" Apr 16 16:03:37.305187 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.305171 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.308088 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.308042 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 16:03:37.308203 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.308089 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 16:03:37.308203 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.308186 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 16:03:37.308363 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.308312 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 16:03:37.308510 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.308378 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 16 16:03:37.308740 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.308723 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 16:03:37.308740 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.308736 2584 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 16:03:37.308942 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.308727 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-d2t5a52ihtgro\"" Apr 16 16:03:37.308942 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.308723 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 16:03:37.308942 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.308829 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 16:03:37.308942 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.308853 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 16:03:37.309276 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.309029 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-99mkn\"" Apr 16 16:03:37.309276 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.309125 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 16:03:37.311896 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.311858 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 16:03:37.316034 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.316015 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 16:03:37.320719 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.320697 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:03:37.363706 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.363683 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-config\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.363791 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.363709 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/231bc417-6470-4556-9dbe-67ed2c3f2063-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.363791 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.363730 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.363791 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.363746 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/231bc417-6470-4556-9dbe-67ed2c3f2063-config-out\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.363992 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.363793 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xc5tj\" (UniqueName: 
\"kubernetes.io/projected/231bc417-6470-4556-9dbe-67ed2c3f2063-kube-api-access-xc5tj\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.363992 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.363859 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/231bc417-6470-4556-9dbe-67ed2c3f2063-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.363992 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.363887 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/231bc417-6470-4556-9dbe-67ed2c3f2063-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.363992 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.363919 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/231bc417-6470-4556-9dbe-67ed2c3f2063-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.363992 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.363979 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-web-config\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.364193 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.364002 2584 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.364193 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.364047 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.364193 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.364077 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/231bc417-6470-4556-9dbe-67ed2c3f2063-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.364193 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.364109 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.364193 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.364136 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-secret-prometheus-k8s-tls\") pod 
\"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.364193 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.364164 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.364193 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.364185 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/231bc417-6470-4556-9dbe-67ed2c3f2063-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.364417 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.364202 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/231bc417-6470-4556-9dbe-67ed2c3f2063-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.364417 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.364225 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.464455 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.464430 2584 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.464551 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.464459 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/231bc417-6470-4556-9dbe-67ed2c3f2063-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.464551 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.464476 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.464551 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.464493 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.464551 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.464511 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.464551 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.464540 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/231bc417-6470-4556-9dbe-67ed2c3f2063-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.464771 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.464563 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/231bc417-6470-4556-9dbe-67ed2c3f2063-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.464771 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.464597 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.464771 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.464641 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-config\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.464771 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.464664 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/231bc417-6470-4556-9dbe-67ed2c3f2063-configmap-serving-certs-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.464771 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.464692 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.464771 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.464716 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/231bc417-6470-4556-9dbe-67ed2c3f2063-config-out\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.464771 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.464738 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xc5tj\" (UniqueName: \"kubernetes.io/projected/231bc417-6470-4556-9dbe-67ed2c3f2063-kube-api-access-xc5tj\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.464771 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.464767 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/231bc417-6470-4556-9dbe-67ed2c3f2063-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.465178 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.464789 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/231bc417-6470-4556-9dbe-67ed2c3f2063-prometheus-k8s-db\") pod 
\"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.465178 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.464822 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/231bc417-6470-4556-9dbe-67ed2c3f2063-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.465178 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.464858 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-web-config\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.465178 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.464882 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.465541 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.465280 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/231bc417-6470-4556-9dbe-67ed2c3f2063-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.465541 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.465460 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/231bc417-6470-4556-9dbe-67ed2c3f2063-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.466237 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.465907 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/231bc417-6470-4556-9dbe-67ed2c3f2063-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.467882 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.467855 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/231bc417-6470-4556-9dbe-67ed2c3f2063-config-out\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.468000 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.467930 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.468000 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.467858 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.468112 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.468072 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" 
(UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.468112 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.468071 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.468542 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.468441 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-config\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.468542 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.467858 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.468742 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.468718 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/231bc417-6470-4556-9dbe-67ed2c3f2063-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.468897 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.468874 2584 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/231bc417-6470-4556-9dbe-67ed2c3f2063-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.468990 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.468936 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/231bc417-6470-4556-9dbe-67ed2c3f2063-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.470098 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.470077 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-web-config\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.470967 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.470937 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/231bc417-6470-4556-9dbe-67ed2c3f2063-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.471018 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.470983 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.471059 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.471043 2584 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/231bc417-6470-4556-9dbe-67ed2c3f2063-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.474529 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.474509 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xc5tj\" (UniqueName: \"kubernetes.io/projected/231bc417-6470-4556-9dbe-67ed2c3f2063-kube-api-access-xc5tj\") pod \"prometheus-k8s-0\" (UID: \"231bc417-6470-4556-9dbe-67ed2c3f2063\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.614581 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.614546 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:03:37.747859 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:37.747731 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 16:03:37.751835 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:03:37.751803 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod231bc417_6470_4556_9dbe_67ed2c3f2063.slice/crio-b4039830ac3df74055326e183040222e3e7ac2f317586628390f4400f2b7af87 WatchSource:0}: Error finding container b4039830ac3df74055326e183040222e3e7ac2f317586628390f4400f2b7af87: Status 404 returned error can't find the container with id b4039830ac3df74055326e183040222e3e7ac2f317586628390f4400f2b7af87 Apr 16 16:03:38.243692 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:38.243656 2584 generic.go:358] "Generic (PLEG): container finished" podID="231bc417-6470-4556-9dbe-67ed2c3f2063" containerID="44a19183b09c8c1fa399c5bf4399ced57f4ca99f997e6c0617e6bc1b086bc80b" exitCode=0 Apr 16 16:03:38.244067 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:38.243748 2584 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"231bc417-6470-4556-9dbe-67ed2c3f2063","Type":"ContainerDied","Data":"44a19183b09c8c1fa399c5bf4399ced57f4ca99f997e6c0617e6bc1b086bc80b"} Apr 16 16:03:38.244067 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:38.243787 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"231bc417-6470-4556-9dbe-67ed2c3f2063","Type":"ContainerStarted","Data":"b4039830ac3df74055326e183040222e3e7ac2f317586628390f4400f2b7af87"} Apr 16 16:03:38.699767 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:38.699729 2584 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4e5a694-dbdd-44a6-8234-4845bff8c58c" path="/var/lib/kubelet/pods/e4e5a694-dbdd-44a6-8234-4845bff8c58c/volumes" Apr 16 16:03:39.249696 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:39.249666 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"231bc417-6470-4556-9dbe-67ed2c3f2063","Type":"ContainerStarted","Data":"50c1424fdc83bee43820d2175aeabaa974d548c36c2b3335e0ea8e31df9e1a91"} Apr 16 16:03:39.250055 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:39.249702 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"231bc417-6470-4556-9dbe-67ed2c3f2063","Type":"ContainerStarted","Data":"4b9b17d42f8b4b597d678e7f0c020ba12cd477d382d615783f0ec4440a255041"} Apr 16 16:03:39.250055 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:39.249712 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"231bc417-6470-4556-9dbe-67ed2c3f2063","Type":"ContainerStarted","Data":"da493ac59a2652ff124c6c2f45a6bcd99c21bb755e41beeb47fb2161fc192dd6"} Apr 16 16:03:39.250055 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:39.249722 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"231bc417-6470-4556-9dbe-67ed2c3f2063","Type":"ContainerStarted","Data":"54c102e8e7cb7c40cdd066556e247a76cc968076e6fc6d1c4139e7f58bd65bb9"} Apr 16 16:03:39.250055 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:39.249730 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"231bc417-6470-4556-9dbe-67ed2c3f2063","Type":"ContainerStarted","Data":"5543958ee58be424c7ddffd4ae6585d4ee094e4dc71dc4ae623ce26cbbb74b0c"} Apr 16 16:03:39.250055 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:39.249738 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"231bc417-6470-4556-9dbe-67ed2c3f2063","Type":"ContainerStarted","Data":"aede852db193720e8c2cb630dfda550784a9381e80cbf79fcc7d6cc5285abd30"} Apr 16 16:03:39.287462 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:39.287418 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.287405457 podStartE2EDuration="2.287405457s" podCreationTimestamp="2026-04-16 16:03:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:03:39.284827286 +0000 UTC m=+147.245019010" watchObservedRunningTime="2026-04-16 16:03:39.287405457 +0000 UTC m=+147.247597181" Apr 16 16:03:42.615310 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:03:42.615276 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:04:37.614790 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:04:37.614755 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:04:37.631432 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:04:37.631403 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:04:38.438739 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:04:38.438714 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 16:06:12.579869 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:06:12.579842 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/ovn-acl-logging/0.log" Apr 16 16:06:12.580414 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:06:12.579947 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/ovn-acl-logging/0.log" Apr 16 16:06:12.585473 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:06:12.585457 2584 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 16:11:12.598868 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:11:12.598841 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/ovn-acl-logging/0.log" Apr 16 16:11:12.599754 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:11:12.599736 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/ovn-acl-logging/0.log" Apr 16 16:15:25.340319 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:25.340288 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-6mqs7"] Apr 16 16:15:25.343493 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:25.343475 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-6mqs7" Apr 16 16:15:25.346922 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:25.346897 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-d7dg7\"" Apr 16 16:15:25.346922 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:25.346906 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 16:15:25.346922 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:25.346918 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 16:15:25.352665 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:25.352644 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-6mqs7"] Apr 16 16:15:25.440877 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:25.440856 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/734558b6-d016-423d-bb66-e0f3daaa8faf-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-6mqs7\" (UID: \"734558b6-d016-423d-bb66-e0f3daaa8faf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-6mqs7" Apr 16 16:15:25.441002 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:25.440913 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x64th\" (UniqueName: \"kubernetes.io/projected/734558b6-d016-423d-bb66-e0f3daaa8faf-kube-api-access-x64th\") pod \"cert-manager-webhook-597b96b99b-6mqs7\" (UID: \"734558b6-d016-423d-bb66-e0f3daaa8faf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-6mqs7" Apr 16 16:15:25.541684 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:25.541656 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x64th\" 
(UniqueName: \"kubernetes.io/projected/734558b6-d016-423d-bb66-e0f3daaa8faf-kube-api-access-x64th\") pod \"cert-manager-webhook-597b96b99b-6mqs7\" (UID: \"734558b6-d016-423d-bb66-e0f3daaa8faf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-6mqs7" Apr 16 16:15:25.541774 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:25.541696 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/734558b6-d016-423d-bb66-e0f3daaa8faf-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-6mqs7\" (UID: \"734558b6-d016-423d-bb66-e0f3daaa8faf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-6mqs7" Apr 16 16:15:25.553404 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:25.553386 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/734558b6-d016-423d-bb66-e0f3daaa8faf-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-6mqs7\" (UID: \"734558b6-d016-423d-bb66-e0f3daaa8faf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-6mqs7" Apr 16 16:15:25.553549 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:25.553531 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x64th\" (UniqueName: \"kubernetes.io/projected/734558b6-d016-423d-bb66-e0f3daaa8faf-kube-api-access-x64th\") pod \"cert-manager-webhook-597b96b99b-6mqs7\" (UID: \"734558b6-d016-423d-bb66-e0f3daaa8faf\") " pod="cert-manager/cert-manager-webhook-597b96b99b-6mqs7" Apr 16 16:15:25.664711 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:25.664688 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-6mqs7" Apr 16 16:15:25.786243 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:25.786212 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-6mqs7"] Apr 16 16:15:25.789292 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:15:25.789254 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod734558b6_d016_423d_bb66_e0f3daaa8faf.slice/crio-00359850bb2e770378b6c5e670f26f907f8519f3623d755e041dfa0eb2403f16 WatchSource:0}: Error finding container 00359850bb2e770378b6c5e670f26f907f8519f3623d755e041dfa0eb2403f16: Status 404 returned error can't find the container with id 00359850bb2e770378b6c5e670f26f907f8519f3623d755e041dfa0eb2403f16 Apr 16 16:15:25.791124 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:25.791108 2584 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:15:26.226895 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:26.226868 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-6mqs7" event={"ID":"734558b6-d016-423d-bb66-e0f3daaa8faf","Type":"ContainerStarted","Data":"00359850bb2e770378b6c5e670f26f907f8519f3623d755e041dfa0eb2403f16"} Apr 16 16:15:29.237181 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:29.237150 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-6mqs7" event={"ID":"734558b6-d016-423d-bb66-e0f3daaa8faf","Type":"ContainerStarted","Data":"59f413ec6a2882ef5a627b02591ef5f4bc288044ed80f819ea9736cc038f2b23"} Apr 16 16:15:29.237537 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:29.237269 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-6mqs7" Apr 16 16:15:29.253228 ip-10-0-131-24 kubenswrapper[2584]: I0416 
16:15:29.253178 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-6mqs7" podStartSLOduration=1.178974895 podStartE2EDuration="4.25316417s" podCreationTimestamp="2026-04-16 16:15:25 +0000 UTC" firstStartedPulling="2026-04-16 16:15:25.791238627 +0000 UTC m=+853.751430329" lastFinishedPulling="2026-04-16 16:15:28.865427902 +0000 UTC m=+856.825619604" observedRunningTime="2026-04-16 16:15:29.251767673 +0000 UTC m=+857.211959397" watchObservedRunningTime="2026-04-16 16:15:29.25316417 +0000 UTC m=+857.213355894" Apr 16 16:15:35.242870 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:35.242836 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-6mqs7" Apr 16 16:15:35.435547 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:35.435514 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-2rjnl"] Apr 16 16:15:35.438552 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:35.438535 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-2rjnl" Apr 16 16:15:35.440887 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:35.440865 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-zjtn9\"" Apr 16 16:15:35.445658 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:35.445636 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-2rjnl"] Apr 16 16:15:35.617150 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:35.617083 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfj22\" (UniqueName: \"kubernetes.io/projected/7dcfba3f-9868-41d8-afb8-f7e968cbcc1b-kube-api-access-lfj22\") pod \"cert-manager-759f64656b-2rjnl\" (UID: \"7dcfba3f-9868-41d8-afb8-f7e968cbcc1b\") " pod="cert-manager/cert-manager-759f64656b-2rjnl" Apr 16 16:15:35.617150 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:35.617128 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7dcfba3f-9868-41d8-afb8-f7e968cbcc1b-bound-sa-token\") pod \"cert-manager-759f64656b-2rjnl\" (UID: \"7dcfba3f-9868-41d8-afb8-f7e968cbcc1b\") " pod="cert-manager/cert-manager-759f64656b-2rjnl" Apr 16 16:15:35.717789 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:35.717752 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lfj22\" (UniqueName: \"kubernetes.io/projected/7dcfba3f-9868-41d8-afb8-f7e968cbcc1b-kube-api-access-lfj22\") pod \"cert-manager-759f64656b-2rjnl\" (UID: \"7dcfba3f-9868-41d8-afb8-f7e968cbcc1b\") " pod="cert-manager/cert-manager-759f64656b-2rjnl" Apr 16 16:15:35.717895 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:35.717815 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/7dcfba3f-9868-41d8-afb8-f7e968cbcc1b-bound-sa-token\") pod \"cert-manager-759f64656b-2rjnl\" (UID: \"7dcfba3f-9868-41d8-afb8-f7e968cbcc1b\") " pod="cert-manager/cert-manager-759f64656b-2rjnl" Apr 16 16:15:35.725924 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:35.725899 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7dcfba3f-9868-41d8-afb8-f7e968cbcc1b-bound-sa-token\") pod \"cert-manager-759f64656b-2rjnl\" (UID: \"7dcfba3f-9868-41d8-afb8-f7e968cbcc1b\") " pod="cert-manager/cert-manager-759f64656b-2rjnl" Apr 16 16:15:35.726147 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:35.726123 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfj22\" (UniqueName: \"kubernetes.io/projected/7dcfba3f-9868-41d8-afb8-f7e968cbcc1b-kube-api-access-lfj22\") pod \"cert-manager-759f64656b-2rjnl\" (UID: \"7dcfba3f-9868-41d8-afb8-f7e968cbcc1b\") " pod="cert-manager/cert-manager-759f64656b-2rjnl" Apr 16 16:15:35.748910 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:35.748887 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-759f64656b-2rjnl" Apr 16 16:15:35.863700 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:35.863678 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-2rjnl"] Apr 16 16:15:35.866057 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:15:35.866030 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7dcfba3f_9868_41d8_afb8_f7e968cbcc1b.slice/crio-cd18403b1208bda66afd64455a5672f943755e41f7e843965bb1e2fd1e9f8b77 WatchSource:0}: Error finding container cd18403b1208bda66afd64455a5672f943755e41f7e843965bb1e2fd1e9f8b77: Status 404 returned error can't find the container with id cd18403b1208bda66afd64455a5672f943755e41f7e843965bb1e2fd1e9f8b77 Apr 16 16:15:36.257830 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:36.257799 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-2rjnl" event={"ID":"7dcfba3f-9868-41d8-afb8-f7e968cbcc1b","Type":"ContainerStarted","Data":"dbd4e76d18baf77c367fed9ece3eaea0931afce2a66a554972c522919ec1f312"} Apr 16 16:15:36.257830 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:36.257834 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-2rjnl" event={"ID":"7dcfba3f-9868-41d8-afb8-f7e968cbcc1b","Type":"ContainerStarted","Data":"cd18403b1208bda66afd64455a5672f943755e41f7e843965bb1e2fd1e9f8b77"} Apr 16 16:15:36.272380 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:15:36.272322 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-2rjnl" podStartSLOduration=1.272309318 podStartE2EDuration="1.272309318s" podCreationTimestamp="2026-04-16 16:15:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:15:36.271553839 +0000 UTC m=+864.231745585" 
watchObservedRunningTime="2026-04-16 16:15:36.272309318 +0000 UTC m=+864.232501041" Apr 16 16:16:03.768437 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:03.768349 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5879d548d6-72fnv"] Apr 16 16:16:03.770602 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:03.770572 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5879d548d6-72fnv" Apr 16 16:16:03.773822 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:03.773799 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 16 16:16:03.773822 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:03.773816 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 16 16:16:03.774030 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:03.773799 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 16 16:16:03.774030 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:03.773841 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 16:16:03.774030 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:03.773808 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 16:16:03.774030 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:03.773802 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-5gld5\"" Apr 16 16:16:03.794107 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:03.794058 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-lws-operator/lws-controller-manager-5879d548d6-72fnv"] Apr 16 16:16:03.823421 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:03.823394 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf26s\" (UniqueName: \"kubernetes.io/projected/320a3360-ff52-4838-9486-9d08a32bda77-kube-api-access-zf26s\") pod \"lws-controller-manager-5879d548d6-72fnv\" (UID: \"320a3360-ff52-4838-9486-9d08a32bda77\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-72fnv" Apr 16 16:16:03.823528 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:03.823449 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/320a3360-ff52-4838-9486-9d08a32bda77-manager-config\") pod \"lws-controller-manager-5879d548d6-72fnv\" (UID: \"320a3360-ff52-4838-9486-9d08a32bda77\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-72fnv" Apr 16 16:16:03.823528 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:03.823477 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/320a3360-ff52-4838-9486-9d08a32bda77-cert\") pod \"lws-controller-manager-5879d548d6-72fnv\" (UID: \"320a3360-ff52-4838-9486-9d08a32bda77\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-72fnv" Apr 16 16:16:03.823528 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:03.823507 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/320a3360-ff52-4838-9486-9d08a32bda77-metrics-cert\") pod \"lws-controller-manager-5879d548d6-72fnv\" (UID: \"320a3360-ff52-4838-9486-9d08a32bda77\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-72fnv" Apr 16 16:16:03.923821 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:03.923796 2584 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/320a3360-ff52-4838-9486-9d08a32bda77-manager-config\") pod \"lws-controller-manager-5879d548d6-72fnv\" (UID: \"320a3360-ff52-4838-9486-9d08a32bda77\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-72fnv" Apr 16 16:16:03.923917 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:03.923825 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/320a3360-ff52-4838-9486-9d08a32bda77-cert\") pod \"lws-controller-manager-5879d548d6-72fnv\" (UID: \"320a3360-ff52-4838-9486-9d08a32bda77\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-72fnv" Apr 16 16:16:03.923917 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:03.923847 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/320a3360-ff52-4838-9486-9d08a32bda77-metrics-cert\") pod \"lws-controller-manager-5879d548d6-72fnv\" (UID: \"320a3360-ff52-4838-9486-9d08a32bda77\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-72fnv" Apr 16 16:16:03.923917 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:03.923887 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zf26s\" (UniqueName: \"kubernetes.io/projected/320a3360-ff52-4838-9486-9d08a32bda77-kube-api-access-zf26s\") pod \"lws-controller-manager-5879d548d6-72fnv\" (UID: \"320a3360-ff52-4838-9486-9d08a32bda77\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-72fnv" Apr 16 16:16:03.924506 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:03.924473 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/320a3360-ff52-4838-9486-9d08a32bda77-manager-config\") pod \"lws-controller-manager-5879d548d6-72fnv\" (UID: 
\"320a3360-ff52-4838-9486-9d08a32bda77\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-72fnv" Apr 16 16:16:03.926318 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:03.926299 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/320a3360-ff52-4838-9486-9d08a32bda77-cert\") pod \"lws-controller-manager-5879d548d6-72fnv\" (UID: \"320a3360-ff52-4838-9486-9d08a32bda77\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-72fnv" Apr 16 16:16:03.926419 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:03.926363 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/320a3360-ff52-4838-9486-9d08a32bda77-metrics-cert\") pod \"lws-controller-manager-5879d548d6-72fnv\" (UID: \"320a3360-ff52-4838-9486-9d08a32bda77\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-72fnv" Apr 16 16:16:03.932743 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:03.932720 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf26s\" (UniqueName: \"kubernetes.io/projected/320a3360-ff52-4838-9486-9d08a32bda77-kube-api-access-zf26s\") pod \"lws-controller-manager-5879d548d6-72fnv\" (UID: \"320a3360-ff52-4838-9486-9d08a32bda77\") " pod="openshift-lws-operator/lws-controller-manager-5879d548d6-72fnv" Apr 16 16:16:04.107122 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:04.107056 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5879d548d6-72fnv" Apr 16 16:16:04.228879 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:04.228856 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5879d548d6-72fnv"] Apr 16 16:16:04.231340 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:16:04.231310 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod320a3360_ff52_4838_9486_9d08a32bda77.slice/crio-0e0dcb97005ce6fdd2c02965dc304fcf5d2bc9a083fc5bf8abf671297cd9c6dc WatchSource:0}: Error finding container 0e0dcb97005ce6fdd2c02965dc304fcf5d2bc9a083fc5bf8abf671297cd9c6dc: Status 404 returned error can't find the container with id 0e0dcb97005ce6fdd2c02965dc304fcf5d2bc9a083fc5bf8abf671297cd9c6dc Apr 16 16:16:04.334684 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:04.334654 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5879d548d6-72fnv" event={"ID":"320a3360-ff52-4838-9486-9d08a32bda77","Type":"ContainerStarted","Data":"0e0dcb97005ce6fdd2c02965dc304fcf5d2bc9a083fc5bf8abf671297cd9c6dc"} Apr 16 16:16:07.346274 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:07.346232 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5879d548d6-72fnv" event={"ID":"320a3360-ff52-4838-9486-9d08a32bda77","Type":"ContainerStarted","Data":"fa46e1bff90cf74bba6ab909ce93c8e13dd70ccd38f40ae9ff4fa77b55b09a1c"} Apr 16 16:16:07.346990 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:07.346942 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5879d548d6-72fnv" Apr 16 16:16:07.366334 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:07.366290 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-5879d548d6-72fnv" podStartSLOduration=1.826131038 podStartE2EDuration="4.366279193s" podCreationTimestamp="2026-04-16 16:16:03 +0000 UTC" firstStartedPulling="2026-04-16 16:16:04.233126735 +0000 UTC m=+892.193318437" lastFinishedPulling="2026-04-16 16:16:06.773274887 +0000 UTC m=+894.733466592" observedRunningTime="2026-04-16 16:16:07.36474781 +0000 UTC m=+895.324939531" watchObservedRunningTime="2026-04-16 16:16:07.366279193 +0000 UTC m=+895.326470917" Apr 16 16:16:12.619655 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:12.619629 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/ovn-acl-logging/0.log" Apr 16 16:16:12.621109 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:12.621084 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/ovn-acl-logging/0.log" Apr 16 16:16:19.354091 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:19.354062 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5879d548d6-72fnv" Apr 16 16:16:24.762589 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.762556 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc"] Apr 16 16:16:24.764895 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.764875 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.767283 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.767263 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 16:16:24.767474 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.767450 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 16:16:24.767590 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.767494 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-trd7k\"" Apr 16 16:16:24.767865 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.767605 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 16:16:24.775823 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.775784 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc"] Apr 16 16:16:24.880720 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.880689 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.880822 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.880736 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fm4x\" (UniqueName: \"kubernetes.io/projected/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-kube-api-access-6fm4x\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.880822 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.880759 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.880822 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.880777 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.880822 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.880800 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.880977 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.880831 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-istio-data\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.880977 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.880881 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.880977 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.880903 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.880977 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.880940 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.981496 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.981470 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" 
(UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.981604 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.981521 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.981653 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.981634 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.981704 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.981666 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.981755 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.981728 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.981810 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.981751 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6fm4x\" (UniqueName: \"kubernetes.io/projected/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-kube-api-access-6fm4x\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.981810 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.981781 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.981810 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.981785 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.981939 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.981870 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.982122 
ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.982077 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.982204 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.982127 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.982204 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.982170 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.982391 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.982370 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.982449 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.982434 2584 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.983975 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.983933 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.984379 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.984361 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.992379 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.992361 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:24.992457 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:24.992444 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fm4x\" (UniqueName: 
\"kubernetes.io/projected/632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f-kube-api-access-6fm4x\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-wv6xc\" (UID: \"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:25.077158 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:25.077105 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:25.198020 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:25.197872 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc"] Apr 16 16:16:25.200491 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:16:25.200460 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod632fe0a6_d8a9_4cf2_b1bd_e086ee0e738f.slice/crio-865318b2f1464a60e1848f7fdf38774d3adbf9dab0ea317d688f527409dd67f1 WatchSource:0}: Error finding container 865318b2f1464a60e1848f7fdf38774d3adbf9dab0ea317d688f527409dd67f1: Status 404 returned error can't find the container with id 865318b2f1464a60e1848f7fdf38774d3adbf9dab0ea317d688f527409dd67f1 Apr 16 16:16:25.397386 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:25.397324 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" event={"ID":"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f","Type":"ContainerStarted","Data":"865318b2f1464a60e1848f7fdf38774d3adbf9dab0ea317d688f527409dd67f1"} Apr 16 16:16:27.446049 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:27.446009 2584 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 16:16:27.446322 ip-10-0-131-24 kubenswrapper[2584]: I0416 
16:16:27.446081 2584 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 16:16:27.446322 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:27.446126 2584 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 16:16:28.407668 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:28.407630 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" event={"ID":"632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f","Type":"ContainerStarted","Data":"631f5c0403a763e4b94e3cb63bcfd30fefb3f8f4951e3a06a35490ffd17ae80b"} Apr 16 16:16:28.436165 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:28.436115 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" podStartSLOduration=2.19283225 podStartE2EDuration="4.436097792s" podCreationTimestamp="2026-04-16 16:16:24 +0000 UTC" firstStartedPulling="2026-04-16 16:16:25.202482224 +0000 UTC m=+913.162673929" lastFinishedPulling="2026-04-16 16:16:27.445747768 +0000 UTC m=+915.405939471" observedRunningTime="2026-04-16 16:16:28.434909359 +0000 UTC m=+916.395101126" watchObservedRunningTime="2026-04-16 16:16:28.436097792 +0000 UTC m=+916.396289516" Apr 16 16:16:29.078377 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:29.078295 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:29.083617 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:29.083594 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 
16:16:29.410476 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:29.410400 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:29.411272 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:29.411254 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-wv6xc" Apr 16 16:16:51.686832 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:51.686786 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-skggk"] Apr 16 16:16:51.690942 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:51.690914 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-skggk" Apr 16 16:16:51.693651 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:51.693623 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 16:16:51.693769 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:51.693681 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-wrwwf\"" Apr 16 16:16:51.693824 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:51.693801 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 16:16:51.698896 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:51.698875 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-skggk"] Apr 16 16:16:51.781192 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:51.781167 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjcmf\" (UniqueName: 
\"kubernetes.io/projected/cfb990b1-22b4-4fc6-be44-09cd1c9aeadb-kube-api-access-vjcmf\") pod \"authorino-operator-7587b89b76-skggk\" (UID: \"cfb990b1-22b4-4fc6-be44-09cd1c9aeadb\") " pod="kuadrant-system/authorino-operator-7587b89b76-skggk" Apr 16 16:16:51.881480 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:51.881437 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjcmf\" (UniqueName: \"kubernetes.io/projected/cfb990b1-22b4-4fc6-be44-09cd1c9aeadb-kube-api-access-vjcmf\") pod \"authorino-operator-7587b89b76-skggk\" (UID: \"cfb990b1-22b4-4fc6-be44-09cd1c9aeadb\") " pod="kuadrant-system/authorino-operator-7587b89b76-skggk" Apr 16 16:16:51.890080 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:51.890051 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjcmf\" (UniqueName: \"kubernetes.io/projected/cfb990b1-22b4-4fc6-be44-09cd1c9aeadb-kube-api-access-vjcmf\") pod \"authorino-operator-7587b89b76-skggk\" (UID: \"cfb990b1-22b4-4fc6-be44-09cd1c9aeadb\") " pod="kuadrant-system/authorino-operator-7587b89b76-skggk" Apr 16 16:16:52.002485 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:52.002420 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-skggk" Apr 16 16:16:52.128871 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:52.128826 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-skggk"] Apr 16 16:16:52.131150 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:16:52.131120 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcfb990b1_22b4_4fc6_be44_09cd1c9aeadb.slice/crio-989f75e12babd93f7ba49f3705689e76ce8c612b267c108bb00ff4265d28cd83 WatchSource:0}: Error finding container 989f75e12babd93f7ba49f3705689e76ce8c612b267c108bb00ff4265d28cd83: Status 404 returned error can't find the container with id 989f75e12babd93f7ba49f3705689e76ce8c612b267c108bb00ff4265d28cd83 Apr 16 16:16:52.473007 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:52.472974 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-skggk" event={"ID":"cfb990b1-22b4-4fc6-be44-09cd1c9aeadb","Type":"ContainerStarted","Data":"989f75e12babd93f7ba49f3705689e76ce8c612b267c108bb00ff4265d28cd83"} Apr 16 16:16:54.479565 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:54.479530 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-skggk" event={"ID":"cfb990b1-22b4-4fc6-be44-09cd1c9aeadb","Type":"ContainerStarted","Data":"d0b16cfecca6ece3262e80113d90a16cf794d58ea2db8979e50c1b6737fe8847"} Apr 16 16:16:54.479896 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:54.479591 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-skggk" Apr 16 16:16:54.496545 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:16:54.496498 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-skggk" podStartSLOduration=1.477420467 
podStartE2EDuration="3.496485566s" podCreationTimestamp="2026-04-16 16:16:51 +0000 UTC" firstStartedPulling="2026-04-16 16:16:52.133294565 +0000 UTC m=+940.093486267" lastFinishedPulling="2026-04-16 16:16:54.152359663 +0000 UTC m=+942.112551366" observedRunningTime="2026-04-16 16:16:54.495280286 +0000 UTC m=+942.455472007" watchObservedRunningTime="2026-04-16 16:16:54.496485566 +0000 UTC m=+942.456677290" Apr 16 16:17:05.484677 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:05.484642 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-skggk" Apr 16 16:17:38.577421 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:38.577333 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-ghlqv"] Apr 16 16:17:38.580803 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:38.580779 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-ghlqv" Apr 16 16:17:38.583187 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:38.583158 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-b8gtt\"" Apr 16 16:17:38.585171 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:38.585141 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-ghlqv"] Apr 16 16:17:38.625127 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:38.625093 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2lp6z\" (UniqueName: \"kubernetes.io/projected/2d7043ed-c152-4bcd-9570-d655d4232a7c-kube-api-access-2lp6z\") pod \"authorino-674b59b84c-ghlqv\" (UID: \"2d7043ed-c152-4bcd-9570-d655d4232a7c\") " pod="kuadrant-system/authorino-674b59b84c-ghlqv" Apr 16 16:17:38.725770 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:38.725743 2584 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2lp6z\" (UniqueName: \"kubernetes.io/projected/2d7043ed-c152-4bcd-9570-d655d4232a7c-kube-api-access-2lp6z\") pod \"authorino-674b59b84c-ghlqv\" (UID: \"2d7043ed-c152-4bcd-9570-d655d4232a7c\") " pod="kuadrant-system/authorino-674b59b84c-ghlqv" Apr 16 16:17:38.733503 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:38.733480 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2lp6z\" (UniqueName: \"kubernetes.io/projected/2d7043ed-c152-4bcd-9570-d655d4232a7c-kube-api-access-2lp6z\") pod \"authorino-674b59b84c-ghlqv\" (UID: \"2d7043ed-c152-4bcd-9570-d655d4232a7c\") " pod="kuadrant-system/authorino-674b59b84c-ghlqv" Apr 16 16:17:38.770421 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:38.770399 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8ssxd"] Apr 16 16:17:38.773394 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:38.773379 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-8ssxd" Apr 16 16:17:38.785723 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:38.785703 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8ssxd"] Apr 16 16:17:38.826425 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:38.826397 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhdwd\" (UniqueName: \"kubernetes.io/projected/429c2dd4-a743-4491-b304-4d344f0b0d54-kube-api-access-fhdwd\") pod \"authorino-79cbc94b89-8ssxd\" (UID: \"429c2dd4-a743-4491-b304-4d344f0b0d54\") " pod="kuadrant-system/authorino-79cbc94b89-8ssxd" Apr 16 16:17:38.891066 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:38.891015 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-ghlqv" Apr 16 16:17:38.927189 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:38.927075 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fhdwd\" (UniqueName: \"kubernetes.io/projected/429c2dd4-a743-4491-b304-4d344f0b0d54-kube-api-access-fhdwd\") pod \"authorino-79cbc94b89-8ssxd\" (UID: \"429c2dd4-a743-4491-b304-4d344f0b0d54\") " pod="kuadrant-system/authorino-79cbc94b89-8ssxd" Apr 16 16:17:38.935704 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:38.935681 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhdwd\" (UniqueName: \"kubernetes.io/projected/429c2dd4-a743-4491-b304-4d344f0b0d54-kube-api-access-fhdwd\") pod \"authorino-79cbc94b89-8ssxd\" (UID: \"429c2dd4-a743-4491-b304-4d344f0b0d54\") " pod="kuadrant-system/authorino-79cbc94b89-8ssxd" Apr 16 16:17:39.006894 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:39.006870 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-ghlqv"] Apr 16 16:17:39.009136 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:17:39.009111 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d7043ed_c152_4bcd_9570_d655d4232a7c.slice/crio-0bfc7528a44bd54e91ee7e1cc536ce45ea155d1e9a36f50f66e34938185cec12 WatchSource:0}: Error finding container 0bfc7528a44bd54e91ee7e1cc536ce45ea155d1e9a36f50f66e34938185cec12: Status 404 returned error can't find the container with id 0bfc7528a44bd54e91ee7e1cc536ce45ea155d1e9a36f50f66e34938185cec12 Apr 16 16:17:39.081690 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:39.081668 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-8ssxd" Apr 16 16:17:39.197278 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:39.197245 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8ssxd"] Apr 16 16:17:39.201016 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:17:39.200993 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod429c2dd4_a743_4491_b304_4d344f0b0d54.slice/crio-e8eed961604bdcab169964e6b8588604ecc13a4004ec12146f206fe5936d9cb4 WatchSource:0}: Error finding container e8eed961604bdcab169964e6b8588604ecc13a4004ec12146f206fe5936d9cb4: Status 404 returned error can't find the container with id e8eed961604bdcab169964e6b8588604ecc13a4004ec12146f206fe5936d9cb4 Apr 16 16:17:39.618272 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:39.618238 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-8ssxd" event={"ID":"429c2dd4-a743-4491-b304-4d344f0b0d54","Type":"ContainerStarted","Data":"e8eed961604bdcab169964e6b8588604ecc13a4004ec12146f206fe5936d9cb4"} Apr 16 16:17:39.619183 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:39.619162 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-ghlqv" event={"ID":"2d7043ed-c152-4bcd-9570-d655d4232a7c","Type":"ContainerStarted","Data":"0bfc7528a44bd54e91ee7e1cc536ce45ea155d1e9a36f50f66e34938185cec12"} Apr 16 16:17:43.632441 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:43.632403 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-8ssxd" event={"ID":"429c2dd4-a743-4491-b304-4d344f0b0d54","Type":"ContainerStarted","Data":"8d2741bee4a34b9044e36304bd81d1c645951e46631e1bea056db97229749b4c"} Apr 16 16:17:43.633849 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:43.633821 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/authorino-674b59b84c-ghlqv" event={"ID":"2d7043ed-c152-4bcd-9570-d655d4232a7c","Type":"ContainerStarted","Data":"eb2c8d005ebfa108f11f0ca5a792b03c08e54affe56f219d2a6802b9fb8b46ee"} Apr 16 16:17:43.647858 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:43.647808 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-8ssxd" podStartSLOduration=1.9434178119999999 podStartE2EDuration="5.647795238s" podCreationTimestamp="2026-04-16 16:17:38 +0000 UTC" firstStartedPulling="2026-04-16 16:17:39.202312878 +0000 UTC m=+987.162504583" lastFinishedPulling="2026-04-16 16:17:42.906690293 +0000 UTC m=+990.866882009" observedRunningTime="2026-04-16 16:17:43.646026756 +0000 UTC m=+991.606218480" watchObservedRunningTime="2026-04-16 16:17:43.647795238 +0000 UTC m=+991.607986962" Apr 16 16:17:43.661397 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:43.661350 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-ghlqv" podStartSLOduration=1.8328576349999999 podStartE2EDuration="5.661334483s" podCreationTimestamp="2026-04-16 16:17:38 +0000 UTC" firstStartedPulling="2026-04-16 16:17:39.010437016 +0000 UTC m=+986.970628719" lastFinishedPulling="2026-04-16 16:17:42.838913866 +0000 UTC m=+990.799105567" observedRunningTime="2026-04-16 16:17:43.660553567 +0000 UTC m=+991.620745310" watchObservedRunningTime="2026-04-16 16:17:43.661334483 +0000 UTC m=+991.621526207" Apr 16 16:17:43.685446 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:43.685419 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-ghlqv"] Apr 16 16:17:45.641298 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:45.641235 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-ghlqv" podUID="2d7043ed-c152-4bcd-9570-d655d4232a7c" containerName="authorino" 
containerID="cri-o://eb2c8d005ebfa108f11f0ca5a792b03c08e54affe56f219d2a6802b9fb8b46ee" gracePeriod=30 Apr 16 16:17:45.880893 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:45.880873 2584 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-ghlqv" Apr 16 16:17:45.988342 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:45.988317 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2lp6z\" (UniqueName: \"kubernetes.io/projected/2d7043ed-c152-4bcd-9570-d655d4232a7c-kube-api-access-2lp6z\") pod \"2d7043ed-c152-4bcd-9570-d655d4232a7c\" (UID: \"2d7043ed-c152-4bcd-9570-d655d4232a7c\") " Apr 16 16:17:45.990550 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:45.990527 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d7043ed-c152-4bcd-9570-d655d4232a7c-kube-api-access-2lp6z" (OuterVolumeSpecName: "kube-api-access-2lp6z") pod "2d7043ed-c152-4bcd-9570-d655d4232a7c" (UID: "2d7043ed-c152-4bcd-9570-d655d4232a7c"). InnerVolumeSpecName "kube-api-access-2lp6z". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:17:46.089090 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:46.089053 2584 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-2lp6z\" (UniqueName: \"kubernetes.io/projected/2d7043ed-c152-4bcd-9570-d655d4232a7c-kube-api-access-2lp6z\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:17:46.645927 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:46.645891 2584 generic.go:358] "Generic (PLEG): container finished" podID="2d7043ed-c152-4bcd-9570-d655d4232a7c" containerID="eb2c8d005ebfa108f11f0ca5a792b03c08e54affe56f219d2a6802b9fb8b46ee" exitCode=0 Apr 16 16:17:46.646430 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:46.645942 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-ghlqv" event={"ID":"2d7043ed-c152-4bcd-9570-d655d4232a7c","Type":"ContainerDied","Data":"eb2c8d005ebfa108f11f0ca5a792b03c08e54affe56f219d2a6802b9fb8b46ee"} Apr 16 16:17:46.646430 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:46.645966 2584 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-ghlqv" Apr 16 16:17:46.646430 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:46.645996 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-ghlqv" event={"ID":"2d7043ed-c152-4bcd-9570-d655d4232a7c","Type":"ContainerDied","Data":"0bfc7528a44bd54e91ee7e1cc536ce45ea155d1e9a36f50f66e34938185cec12"} Apr 16 16:17:46.646430 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:46.646014 2584 scope.go:117] "RemoveContainer" containerID="eb2c8d005ebfa108f11f0ca5a792b03c08e54affe56f219d2a6802b9fb8b46ee" Apr 16 16:17:46.654541 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:46.654519 2584 scope.go:117] "RemoveContainer" containerID="eb2c8d005ebfa108f11f0ca5a792b03c08e54affe56f219d2a6802b9fb8b46ee" Apr 16 16:17:46.654827 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:17:46.654807 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb2c8d005ebfa108f11f0ca5a792b03c08e54affe56f219d2a6802b9fb8b46ee\": container with ID starting with eb2c8d005ebfa108f11f0ca5a792b03c08e54affe56f219d2a6802b9fb8b46ee not found: ID does not exist" containerID="eb2c8d005ebfa108f11f0ca5a792b03c08e54affe56f219d2a6802b9fb8b46ee" Apr 16 16:17:46.654886 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:46.654837 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb2c8d005ebfa108f11f0ca5a792b03c08e54affe56f219d2a6802b9fb8b46ee"} err="failed to get container status \"eb2c8d005ebfa108f11f0ca5a792b03c08e54affe56f219d2a6802b9fb8b46ee\": rpc error: code = NotFound desc = could not find container \"eb2c8d005ebfa108f11f0ca5a792b03c08e54affe56f219d2a6802b9fb8b46ee\": container with ID starting with eb2c8d005ebfa108f11f0ca5a792b03c08e54affe56f219d2a6802b9fb8b46ee not found: ID does not exist" Apr 16 16:17:46.666759 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:46.666730 2584 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-ghlqv"] Apr 16 16:17:46.669741 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:46.669716 2584 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-ghlqv"] Apr 16 16:17:46.698193 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:46.698163 2584 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d7043ed-c152-4bcd-9570-d655d4232a7c" path="/var/lib/kubelet/pods/2d7043ed-c152-4bcd-9570-d655d4232a7c/volumes" Apr 16 16:17:59.303745 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:59.303714 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-bkrvd"] Apr 16 16:17:59.304147 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:59.304027 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2d7043ed-c152-4bcd-9570-d655d4232a7c" containerName="authorino" Apr 16 16:17:59.304147 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:59.304038 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d7043ed-c152-4bcd-9570-d655d4232a7c" containerName="authorino" Apr 16 16:17:59.304147 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:59.304106 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d7043ed-c152-4bcd-9570-d655d4232a7c" containerName="authorino" Apr 16 16:17:59.308500 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:59.308482 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-bkrvd" Apr 16 16:17:59.311050 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:59.311016 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 16 16:17:59.313853 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:59.313831 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-bkrvd"] Apr 16 16:17:59.383129 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:59.383101 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4jtf\" (UniqueName: \"kubernetes.io/projected/d741c8d3-f544-45b6-95fe-2fab2f9d8bf6-kube-api-access-v4jtf\") pod \"authorino-68bd676465-bkrvd\" (UID: \"d741c8d3-f544-45b6-95fe-2fab2f9d8bf6\") " pod="kuadrant-system/authorino-68bd676465-bkrvd" Apr 16 16:17:59.383231 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:59.383147 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/d741c8d3-f544-45b6-95fe-2fab2f9d8bf6-tls-cert\") pod \"authorino-68bd676465-bkrvd\" (UID: \"d741c8d3-f544-45b6-95fe-2fab2f9d8bf6\") " pod="kuadrant-system/authorino-68bd676465-bkrvd" Apr 16 16:17:59.484381 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:59.484330 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/d741c8d3-f544-45b6-95fe-2fab2f9d8bf6-tls-cert\") pod \"authorino-68bd676465-bkrvd\" (UID: \"d741c8d3-f544-45b6-95fe-2fab2f9d8bf6\") " pod="kuadrant-system/authorino-68bd676465-bkrvd" Apr 16 16:17:59.484482 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:59.484422 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4jtf\" (UniqueName: 
\"kubernetes.io/projected/d741c8d3-f544-45b6-95fe-2fab2f9d8bf6-kube-api-access-v4jtf\") pod \"authorino-68bd676465-bkrvd\" (UID: \"d741c8d3-f544-45b6-95fe-2fab2f9d8bf6\") " pod="kuadrant-system/authorino-68bd676465-bkrvd" Apr 16 16:17:59.486993 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:59.486971 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/d741c8d3-f544-45b6-95fe-2fab2f9d8bf6-tls-cert\") pod \"authorino-68bd676465-bkrvd\" (UID: \"d741c8d3-f544-45b6-95fe-2fab2f9d8bf6\") " pod="kuadrant-system/authorino-68bd676465-bkrvd" Apr 16 16:17:59.491872 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:59.491845 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4jtf\" (UniqueName: \"kubernetes.io/projected/d741c8d3-f544-45b6-95fe-2fab2f9d8bf6-kube-api-access-v4jtf\") pod \"authorino-68bd676465-bkrvd\" (UID: \"d741c8d3-f544-45b6-95fe-2fab2f9d8bf6\") " pod="kuadrant-system/authorino-68bd676465-bkrvd" Apr 16 16:17:59.618609 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:59.618551 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-bkrvd" Apr 16 16:17:59.736844 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:17:59.736820 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-bkrvd"] Apr 16 16:17:59.738644 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:17:59.738619 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd741c8d3_f544_45b6_95fe_2fab2f9d8bf6.slice/crio-a1584c145c234ff6bb64aa51f139432828b57024723f9f3be6dd6330706e028a WatchSource:0}: Error finding container a1584c145c234ff6bb64aa51f139432828b57024723f9f3be6dd6330706e028a: Status 404 returned error can't find the container with id a1584c145c234ff6bb64aa51f139432828b57024723f9f3be6dd6330706e028a Apr 16 16:18:00.689032 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:18:00.688995 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-bkrvd" event={"ID":"d741c8d3-f544-45b6-95fe-2fab2f9d8bf6","Type":"ContainerStarted","Data":"b71d43de3a950089c0a902ed9d537dee796b8f9dcfb1518f11c90f9b84809724"} Apr 16 16:18:00.689376 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:18:00.689039 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-bkrvd" event={"ID":"d741c8d3-f544-45b6-95fe-2fab2f9d8bf6","Type":"ContainerStarted","Data":"a1584c145c234ff6bb64aa51f139432828b57024723f9f3be6dd6330706e028a"} Apr 16 16:18:00.705006 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:18:00.704944 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-bkrvd" podStartSLOduration=1.234934784 podStartE2EDuration="1.704931106s" podCreationTimestamp="2026-04-16 16:17:59 +0000 UTC" firstStartedPulling="2026-04-16 16:17:59.739988585 +0000 UTC m=+1007.700180288" lastFinishedPulling="2026-04-16 16:18:00.209984908 +0000 UTC m=+1008.170176610" 
observedRunningTime="2026-04-16 16:18:00.703227427 +0000 UTC m=+1008.663419157" watchObservedRunningTime="2026-04-16 16:18:00.704931106 +0000 UTC m=+1008.665122830" Apr 16 16:18:00.727771 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:18:00.727745 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8ssxd"] Apr 16 16:18:00.727922 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:18:00.727904 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-8ssxd" podUID="429c2dd4-a743-4491-b304-4d344f0b0d54" containerName="authorino" containerID="cri-o://8d2741bee4a34b9044e36304bd81d1c645951e46631e1bea056db97229749b4c" gracePeriod=30 Apr 16 16:18:00.962794 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:18:00.962774 2584 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-8ssxd" Apr 16 16:18:01.100987 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:18:01.100938 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fhdwd\" (UniqueName: \"kubernetes.io/projected/429c2dd4-a743-4491-b304-4d344f0b0d54-kube-api-access-fhdwd\") pod \"429c2dd4-a743-4491-b304-4d344f0b0d54\" (UID: \"429c2dd4-a743-4491-b304-4d344f0b0d54\") " Apr 16 16:18:01.103148 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:18:01.103120 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/429c2dd4-a743-4491-b304-4d344f0b0d54-kube-api-access-fhdwd" (OuterVolumeSpecName: "kube-api-access-fhdwd") pod "429c2dd4-a743-4491-b304-4d344f0b0d54" (UID: "429c2dd4-a743-4491-b304-4d344f0b0d54"). InnerVolumeSpecName "kube-api-access-fhdwd". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:18:01.201777 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:18:01.201722 2584 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fhdwd\" (UniqueName: \"kubernetes.io/projected/429c2dd4-a743-4491-b304-4d344f0b0d54-kube-api-access-fhdwd\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:18:01.696168 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:18:01.696139 2584 generic.go:358] "Generic (PLEG): container finished" podID="429c2dd4-a743-4491-b304-4d344f0b0d54" containerID="8d2741bee4a34b9044e36304bd81d1c645951e46631e1bea056db97229749b4c" exitCode=0 Apr 16 16:18:01.696605 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:18:01.696195 2584 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-8ssxd" Apr 16 16:18:01.696605 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:18:01.696227 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-8ssxd" event={"ID":"429c2dd4-a743-4491-b304-4d344f0b0d54","Type":"ContainerDied","Data":"8d2741bee4a34b9044e36304bd81d1c645951e46631e1bea056db97229749b4c"} Apr 16 16:18:01.696605 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:18:01.696255 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-8ssxd" event={"ID":"429c2dd4-a743-4491-b304-4d344f0b0d54","Type":"ContainerDied","Data":"e8eed961604bdcab169964e6b8588604ecc13a4004ec12146f206fe5936d9cb4"} Apr 16 16:18:01.696605 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:18:01.696270 2584 scope.go:117] "RemoveContainer" containerID="8d2741bee4a34b9044e36304bd81d1c645951e46631e1bea056db97229749b4c" Apr 16 16:18:01.704309 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:18:01.704293 2584 scope.go:117] "RemoveContainer" containerID="8d2741bee4a34b9044e36304bd81d1c645951e46631e1bea056db97229749b4c" Apr 16 16:18:01.704547 ip-10-0-131-24 kubenswrapper[2584]: E0416 
16:18:01.704529 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d2741bee4a34b9044e36304bd81d1c645951e46631e1bea056db97229749b4c\": container with ID starting with 8d2741bee4a34b9044e36304bd81d1c645951e46631e1bea056db97229749b4c not found: ID does not exist" containerID="8d2741bee4a34b9044e36304bd81d1c645951e46631e1bea056db97229749b4c" Apr 16 16:18:01.704606 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:18:01.704557 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d2741bee4a34b9044e36304bd81d1c645951e46631e1bea056db97229749b4c"} err="failed to get container status \"8d2741bee4a34b9044e36304bd81d1c645951e46631e1bea056db97229749b4c\": rpc error: code = NotFound desc = could not find container \"8d2741bee4a34b9044e36304bd81d1c645951e46631e1bea056db97229749b4c\": container with ID starting with 8d2741bee4a34b9044e36304bd81d1c645951e46631e1bea056db97229749b4c not found: ID does not exist" Apr 16 16:18:01.718080 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:18:01.718056 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8ssxd"] Apr 16 16:18:01.719343 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:18:01.719324 2584 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-8ssxd"] Apr 16 16:18:02.697917 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:18:02.697890 2584 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="429c2dd4-a743-4491-b304-4d344f0b0d54" path="/var/lib/kubelet/pods/429c2dd4-a743-4491-b304-4d344f0b0d54/volumes" Apr 16 16:19:48.816909 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:48.816875 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-tx69r"] Apr 16 16:19:48.819450 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:48.817212 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="429c2dd4-a743-4491-b304-4d344f0b0d54" containerName="authorino" Apr 16 16:19:48.819450 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:48.817225 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="429c2dd4-a743-4491-b304-4d344f0b0d54" containerName="authorino" Apr 16 16:19:48.819450 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:48.817305 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="429c2dd4-a743-4491-b304-4d344f0b0d54" containerName="authorino" Apr 16 16:19:48.820220 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:48.820198 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-tx69r" Apr 16 16:19:48.822702 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:48.822560 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 16 16:19:48.822702 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:48.822587 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 16:19:48.822702 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:48.822592 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 16:19:48.823420 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:48.823398 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-bl9k5\"" Apr 16 16:19:48.829131 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:48.829102 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-tx69r"] Apr 16 16:19:48.980781 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:48.980753 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twpkl\" (UniqueName: \"kubernetes.io/projected/990a703c-862e-42b0-a6c3-16cf98da639f-kube-api-access-twpkl\") pod \"s3-init-tx69r\" (UID: 
\"990a703c-862e-42b0-a6c3-16cf98da639f\") " pod="kserve/s3-init-tx69r" Apr 16 16:19:49.081269 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:49.081210 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twpkl\" (UniqueName: \"kubernetes.io/projected/990a703c-862e-42b0-a6c3-16cf98da639f-kube-api-access-twpkl\") pod \"s3-init-tx69r\" (UID: \"990a703c-862e-42b0-a6c3-16cf98da639f\") " pod="kserve/s3-init-tx69r" Apr 16 16:19:49.088667 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:49.088643 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twpkl\" (UniqueName: \"kubernetes.io/projected/990a703c-862e-42b0-a6c3-16cf98da639f-kube-api-access-twpkl\") pod \"s3-init-tx69r\" (UID: \"990a703c-862e-42b0-a6c3-16cf98da639f\") " pod="kserve/s3-init-tx69r" Apr 16 16:19:49.130534 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:49.130514 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-tx69r" Apr 16 16:19:49.245879 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:49.245852 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-tx69r"] Apr 16 16:19:49.251438 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:19:49.248214 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod990a703c_862e_42b0_a6c3_16cf98da639f.slice/crio-3b2e05c1fd531a798b99e90e8e1d7728afd32e851115c6a721ee86d772f72d8f WatchSource:0}: Error finding container 3b2e05c1fd531a798b99e90e8e1d7728afd32e851115c6a721ee86d772f72d8f: Status 404 returned error can't find the container with id 3b2e05c1fd531a798b99e90e8e1d7728afd32e851115c6a721ee86d772f72d8f Apr 16 16:19:50.010602 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:50.010563 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-tx69r" 
event={"ID":"990a703c-862e-42b0-a6c3-16cf98da639f","Type":"ContainerStarted","Data":"3b2e05c1fd531a798b99e90e8e1d7728afd32e851115c6a721ee86d772f72d8f"} Apr 16 16:19:54.025720 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:54.025638 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-tx69r" event={"ID":"990a703c-862e-42b0-a6c3-16cf98da639f","Type":"ContainerStarted","Data":"e6304438e869f6f0e23cadd75cad708d38beb4769a39a8a89b1b3829acdabe2d"} Apr 16 16:19:54.040091 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:54.040033 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-tx69r" podStartSLOduration=1.537291416 podStartE2EDuration="6.040014851s" podCreationTimestamp="2026-04-16 16:19:48 +0000 UTC" firstStartedPulling="2026-04-16 16:19:49.251779495 +0000 UTC m=+1117.211971210" lastFinishedPulling="2026-04-16 16:19:53.754502939 +0000 UTC m=+1121.714694645" observedRunningTime="2026-04-16 16:19:54.039619697 +0000 UTC m=+1121.999811423" watchObservedRunningTime="2026-04-16 16:19:54.040014851 +0000 UTC m=+1122.000206577" Apr 16 16:19:57.035866 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:57.035776 2584 generic.go:358] "Generic (PLEG): container finished" podID="990a703c-862e-42b0-a6c3-16cf98da639f" containerID="e6304438e869f6f0e23cadd75cad708d38beb4769a39a8a89b1b3829acdabe2d" exitCode=0 Apr 16 16:19:57.036253 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:57.035856 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-tx69r" event={"ID":"990a703c-862e-42b0-a6c3-16cf98da639f","Type":"ContainerDied","Data":"e6304438e869f6f0e23cadd75cad708d38beb4769a39a8a89b1b3829acdabe2d"} Apr 16 16:19:58.164557 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:58.164533 2584 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-tx69r" Apr 16 16:19:58.259017 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:58.258990 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twpkl\" (UniqueName: \"kubernetes.io/projected/990a703c-862e-42b0-a6c3-16cf98da639f-kube-api-access-twpkl\") pod \"990a703c-862e-42b0-a6c3-16cf98da639f\" (UID: \"990a703c-862e-42b0-a6c3-16cf98da639f\") " Apr 16 16:19:58.261210 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:58.261182 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/990a703c-862e-42b0-a6c3-16cf98da639f-kube-api-access-twpkl" (OuterVolumeSpecName: "kube-api-access-twpkl") pod "990a703c-862e-42b0-a6c3-16cf98da639f" (UID: "990a703c-862e-42b0-a6c3-16cf98da639f"). InnerVolumeSpecName "kube-api-access-twpkl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:19:58.360263 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:58.360208 2584 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-twpkl\" (UniqueName: \"kubernetes.io/projected/990a703c-862e-42b0-a6c3-16cf98da639f-kube-api-access-twpkl\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:19:59.042819 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:59.042743 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-tx69r" event={"ID":"990a703c-862e-42b0-a6c3-16cf98da639f","Type":"ContainerDied","Data":"3b2e05c1fd531a798b99e90e8e1d7728afd32e851115c6a721ee86d772f72d8f"} Apr 16 16:19:59.042819 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:59.042785 2584 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3b2e05c1fd531a798b99e90e8e1d7728afd32e851115c6a721ee86d772f72d8f" Apr 16 16:19:59.042819 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:19:59.042759 2584 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-tx69r" Apr 16 16:20:09.176171 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.176142 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw"] Apr 16 16:20:09.182121 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.182089 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="990a703c-862e-42b0-a6c3-16cf98da639f" containerName="s3-init" Apr 16 16:20:09.182299 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.182287 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="990a703c-862e-42b0-a6c3-16cf98da639f" containerName="s3-init" Apr 16 16:20:09.182572 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.182558 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="990a703c-862e-42b0-a6c3-16cf98da639f" containerName="s3-init" Apr 16 16:20:09.185459 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.185440 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.188806 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.188789 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 16 16:20:09.188912 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.188873 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 16:20:09.188983 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.188938 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 16:20:09.189078 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.189059 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-lxrsc\"" Apr 16 16:20:09.194622 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.194602 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw"] Apr 16 16:20:09.341914 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.341878 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/74131d38-6ea6-4653-9bc8-d1804ed203fc-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.342142 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.341948 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gfdxf\" (UniqueName: \"kubernetes.io/projected/74131d38-6ea6-4653-9bc8-d1804ed203fc-kube-api-access-gfdxf\") pod 
\"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.342142 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.342066 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/74131d38-6ea6-4653-9bc8-d1804ed203fc-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.342142 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.342129 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/74131d38-6ea6-4653-9bc8-d1804ed203fc-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.342290 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.342163 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/74131d38-6ea6-4653-9bc8-d1804ed203fc-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.342290 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.342188 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/74131d38-6ea6-4653-9bc8-d1804ed203fc-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: 
\"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.342290 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.342256 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/74131d38-6ea6-4653-9bc8-d1804ed203fc-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.342402 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.342287 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/74131d38-6ea6-4653-9bc8-d1804ed203fc-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.342402 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.342316 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/74131d38-6ea6-4653-9bc8-d1804ed203fc-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.443520 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.443450 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/74131d38-6ea6-4653-9bc8-d1804ed203fc-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.443520 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.443498 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gfdxf\" (UniqueName: \"kubernetes.io/projected/74131d38-6ea6-4653-9bc8-d1804ed203fc-kube-api-access-gfdxf\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.443696 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.443539 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/74131d38-6ea6-4653-9bc8-d1804ed203fc-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.443696 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.443586 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/74131d38-6ea6-4653-9bc8-d1804ed203fc-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.443696 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.443615 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/74131d38-6ea6-4653-9bc8-d1804ed203fc-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.443696 ip-10-0-131-24 kubenswrapper[2584]: 
I0416 16:20:09.443642 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/74131d38-6ea6-4653-9bc8-d1804ed203fc-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.443904 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.443722 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/74131d38-6ea6-4653-9bc8-d1804ed203fc-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.443904 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.443754 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/74131d38-6ea6-4653-9bc8-d1804ed203fc-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.443904 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.443792 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/74131d38-6ea6-4653-9bc8-d1804ed203fc-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.444083 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.443914 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: 
\"kubernetes.io/empty-dir/74131d38-6ea6-4653-9bc8-d1804ed203fc-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.444083 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.444013 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/74131d38-6ea6-4653-9bc8-d1804ed203fc-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.444083 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.444043 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/74131d38-6ea6-4653-9bc8-d1804ed203fc-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.444192 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.444120 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/74131d38-6ea6-4653-9bc8-d1804ed203fc-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.444513 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.444423 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/74131d38-6ea6-4653-9bc8-d1804ed203fc-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: 
\"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.445940 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.445912 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/74131d38-6ea6-4653-9bc8-d1804ed203fc-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.446269 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.446248 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/74131d38-6ea6-4653-9bc8-d1804ed203fc-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.452659 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.452634 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/74131d38-6ea6-4653-9bc8-d1804ed203fc-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.452879 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.452860 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gfdxf\" (UniqueName: \"kubernetes.io/projected/74131d38-6ea6-4653-9bc8-d1804ed203fc-kube-api-access-gfdxf\") pod \"router-gateway-1-openshift-default-6c59fbf55c-4lthw\" (UID: \"74131d38-6ea6-4653-9bc8-d1804ed203fc\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.498309 ip-10-0-131-24 kubenswrapper[2584]: 
I0416 16:20:09.498283 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:09.620590 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.620549 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw"] Apr 16 16:20:09.623764 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:20:09.623729 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74131d38_6ea6_4653_9bc8_d1804ed203fc.slice/crio-fe68d3372dc74e355ece57a45740ff41c60a3a735681145e22a33b69b8b13c03 WatchSource:0}: Error finding container fe68d3372dc74e355ece57a45740ff41c60a3a735681145e22a33b69b8b13c03: Status 404 returned error can't find the container with id fe68d3372dc74e355ece57a45740ff41c60a3a735681145e22a33b69b8b13c03 Apr 16 16:20:09.626209 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.626168 2584 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 16:20:09.626295 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.626237 2584 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 16:20:09.626295 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:09.626268 2584 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"} Apr 16 16:20:10.075634 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:10.075598 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" 
event={"ID":"74131d38-6ea6-4653-9bc8-d1804ed203fc","Type":"ContainerStarted","Data":"880cba2f3f2bbf651c13aea405143b33a8f7d4fa276500312183cbbdc3b522c0"} Apr 16 16:20:10.075634 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:10.075636 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" event={"ID":"74131d38-6ea6-4653-9bc8-d1804ed203fc","Type":"ContainerStarted","Data":"fe68d3372dc74e355ece57a45740ff41c60a3a735681145e22a33b69b8b13c03"} Apr 16 16:20:10.093973 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:10.093845 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" podStartSLOduration=1.093827467 podStartE2EDuration="1.093827467s" podCreationTimestamp="2026-04-16 16:20:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:20:10.093194243 +0000 UTC m=+1138.053385968" watchObservedRunningTime="2026-04-16 16:20:10.093827467 +0000 UTC m=+1138.054019193" Apr 16 16:20:10.499345 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:10.499313 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:10.504019 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:10.503993 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:11.079618 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:11.079566 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:11.080820 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:11.080790 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-4lthw" Apr 16 16:20:20.865731 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:20.865700 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh"] Apr 16 16:20:20.868413 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:20.868397 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:20.871887 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:20.871866 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 16 16:20:20.871887 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:20.871872 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-2fdlg\"" Apr 16 16:20:20.878628 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:20.878609 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh"] Apr 16 16:20:20.924801 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:20.924777 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-dshm\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:20.924882 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:20.924820 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-home\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh\" (UID: 
\"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:20.924882 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:20.924846 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-model-cache\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:20.924986 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:20.924901 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pztkb\" (UniqueName: \"kubernetes.io/projected/c77ea171-cea0-48ef-8a62-1dd7c6893863-kube-api-access-pztkb\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:20.924986 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:20.924928 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:20.925062 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:20.924987 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c77ea171-cea0-48ef-8a62-1dd7c6893863-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:21.025467 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:21.025444 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-dshm\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:21.025566 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:21.025482 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-home\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:21.025566 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:21.025505 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-model-cache\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:21.025566 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:21.025534 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pztkb\" (UniqueName: \"kubernetes.io/projected/c77ea171-cea0-48ef-8a62-1dd7c6893863-kube-api-access-pztkb\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:21.025566 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:21.025558 2584 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:21.025757 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:21.025578 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c77ea171-cea0-48ef-8a62-1dd7c6893863-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:21.025936 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:21.025908 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-home\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:21.026071 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:21.026001 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:21.026071 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:21.026052 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-model-cache\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:21.027998 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:21.027978 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-dshm\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:21.028123 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:21.028107 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c77ea171-cea0-48ef-8a62-1dd7c6893863-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:21.032757 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:21.032730 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pztkb\" (UniqueName: \"kubernetes.io/projected/c77ea171-cea0-48ef-8a62-1dd7c6893863-kube-api-access-pztkb\") pod \"scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:21.179203 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:21.179179 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:21.305885 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:21.305853 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh"] Apr 16 16:20:21.309225 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:20:21.309197 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc77ea171_cea0_48ef_8a62_1dd7c6893863.slice/crio-5eaf5ab522dc6db327d0710c9b0a4d0761851c90d4951b15393be8c256f5fa4c WatchSource:0}: Error finding container 5eaf5ab522dc6db327d0710c9b0a4d0761851c90d4951b15393be8c256f5fa4c: Status 404 returned error can't find the container with id 5eaf5ab522dc6db327d0710c9b0a4d0761851c90d4951b15393be8c256f5fa4c Apr 16 16:20:22.119167 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:22.119125 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" event={"ID":"c77ea171-cea0-48ef-8a62-1dd7c6893863","Type":"ContainerStarted","Data":"5eaf5ab522dc6db327d0710c9b0a4d0761851c90d4951b15393be8c256f5fa4c"} Apr 16 16:20:26.137311 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:26.137267 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" event={"ID":"c77ea171-cea0-48ef-8a62-1dd7c6893863","Type":"ContainerStarted","Data":"1933cb47fa2b9294c1cb8377ad7a15d4dad95ee67198b229d405d8fc33fed905"} Apr 16 16:20:30.152793 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:30.152706 2584 generic.go:358] "Generic (PLEG): container finished" podID="c77ea171-cea0-48ef-8a62-1dd7c6893863" containerID="1933cb47fa2b9294c1cb8377ad7a15d4dad95ee67198b229d405d8fc33fed905" exitCode=0 Apr 16 16:20:30.153166 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:30.152784 2584 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" event={"ID":"c77ea171-cea0-48ef-8a62-1dd7c6893863","Type":"ContainerDied","Data":"1933cb47fa2b9294c1cb8377ad7a15d4dad95ee67198b229d405d8fc33fed905"} Apr 16 16:20:30.153977 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:30.153946 2584 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:20:32.161348 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:32.161316 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" event={"ID":"c77ea171-cea0-48ef-8a62-1dd7c6893863","Type":"ContainerStarted","Data":"349992abf8ff19cc1b469706495175298961d271f8fefa8674605089816b5272"} Apr 16 16:20:32.179562 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:32.179519 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" podStartSLOduration=2.07754953 podStartE2EDuration="12.179507661s" podCreationTimestamp="2026-04-16 16:20:20 +0000 UTC" firstStartedPulling="2026-04-16 16:20:21.311679072 +0000 UTC m=+1149.271870789" lastFinishedPulling="2026-04-16 16:20:31.413637218 +0000 UTC m=+1159.373828920" observedRunningTime="2026-04-16 16:20:32.177919128 +0000 UTC m=+1160.138110853" watchObservedRunningTime="2026-04-16 16:20:32.179507661 +0000 UTC m=+1160.139699384" Apr 16 16:20:41.179665 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:41.179621 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:41.179665 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:41.179665 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:41.192003 ip-10-0-131-24 
kubenswrapper[2584]: I0416 16:20:41.191981 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:20:41.203550 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:20:41.203531 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:21:12.585020 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:12.584983 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh"] Apr 16 16:21:12.585500 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:12.585371 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" podUID="c77ea171-cea0-48ef-8a62-1dd7c6893863" containerName="main" containerID="cri-o://349992abf8ff19cc1b469706495175298961d271f8fefa8674605089816b5272" gracePeriod=30 Apr 16 16:21:12.646782 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:12.646755 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/ovn-acl-logging/0.log" Apr 16 16:21:12.647545 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:12.647518 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/ovn-acl-logging/0.log" Apr 16 16:21:12.822066 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:12.822043 2584 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:21:12.918277 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:12.918252 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-home\") pod \"c77ea171-cea0-48ef-8a62-1dd7c6893863\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " Apr 16 16:21:12.918426 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:12.918306 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c77ea171-cea0-48ef-8a62-1dd7c6893863-tls-certs\") pod \"c77ea171-cea0-48ef-8a62-1dd7c6893863\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " Apr 16 16:21:12.918426 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:12.918383 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pztkb\" (UniqueName: \"kubernetes.io/projected/c77ea171-cea0-48ef-8a62-1dd7c6893863-kube-api-access-pztkb\") pod \"c77ea171-cea0-48ef-8a62-1dd7c6893863\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " Apr 16 16:21:12.918513 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:12.918432 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-kserve-provision-location\") pod \"c77ea171-cea0-48ef-8a62-1dd7c6893863\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " Apr 16 16:21:12.918513 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:12.918455 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-dshm\") pod \"c77ea171-cea0-48ef-8a62-1dd7c6893863\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " Apr 16 16:21:12.918513 ip-10-0-131-24 
kubenswrapper[2584]: I0416 16:21:12.918492 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-model-cache\") pod \"c77ea171-cea0-48ef-8a62-1dd7c6893863\" (UID: \"c77ea171-cea0-48ef-8a62-1dd7c6893863\") " Apr 16 16:21:12.918660 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:12.918533 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-home" (OuterVolumeSpecName: "home") pod "c77ea171-cea0-48ef-8a62-1dd7c6893863" (UID: "c77ea171-cea0-48ef-8a62-1dd7c6893863"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:21:12.918734 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:12.918709 2584 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-home\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:21:12.918790 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:12.918760 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-model-cache" (OuterVolumeSpecName: "model-cache") pod "c77ea171-cea0-48ef-8a62-1dd7c6893863" (UID: "c77ea171-cea0-48ef-8a62-1dd7c6893863"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:21:12.920937 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:12.920914 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-dshm" (OuterVolumeSpecName: "dshm") pod "c77ea171-cea0-48ef-8a62-1dd7c6893863" (UID: "c77ea171-cea0-48ef-8a62-1dd7c6893863"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:21:12.921065 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:12.921040 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c77ea171-cea0-48ef-8a62-1dd7c6893863-kube-api-access-pztkb" (OuterVolumeSpecName: "kube-api-access-pztkb") pod "c77ea171-cea0-48ef-8a62-1dd7c6893863" (UID: "c77ea171-cea0-48ef-8a62-1dd7c6893863"). InnerVolumeSpecName "kube-api-access-pztkb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:21:12.921108 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:12.921074 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c77ea171-cea0-48ef-8a62-1dd7c6893863-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c77ea171-cea0-48ef-8a62-1dd7c6893863" (UID: "c77ea171-cea0-48ef-8a62-1dd7c6893863"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:21:12.980825 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:12.980794 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c77ea171-cea0-48ef-8a62-1dd7c6893863" (UID: "c77ea171-cea0-48ef-8a62-1dd7c6893863"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:21:13.019886 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:13.019856 2584 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c77ea171-cea0-48ef-8a62-1dd7c6893863-tls-certs\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:21:13.019886 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:13.019884 2584 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pztkb\" (UniqueName: \"kubernetes.io/projected/c77ea171-cea0-48ef-8a62-1dd7c6893863-kube-api-access-pztkb\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:21:13.020055 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:13.019899 2584 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-kserve-provision-location\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:21:13.020055 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:13.019912 2584 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-dshm\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:21:13.020055 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:13.019925 2584 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c77ea171-cea0-48ef-8a62-1dd7c6893863-model-cache\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:21:13.294779 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:13.294714 2584 generic.go:358] "Generic (PLEG): container finished" podID="c77ea171-cea0-48ef-8a62-1dd7c6893863" containerID="349992abf8ff19cc1b469706495175298961d271f8fefa8674605089816b5272" exitCode=0 Apr 16 16:21:13.294779 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:13.294750 2584 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" event={"ID":"c77ea171-cea0-48ef-8a62-1dd7c6893863","Type":"ContainerDied","Data":"349992abf8ff19cc1b469706495175298961d271f8fefa8674605089816b5272"} Apr 16 16:21:13.294779 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:13.294773 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" event={"ID":"c77ea171-cea0-48ef-8a62-1dd7c6893863","Type":"ContainerDied","Data":"5eaf5ab522dc6db327d0710c9b0a4d0761851c90d4951b15393be8c256f5fa4c"} Apr 16 16:21:13.295007 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:13.294791 2584 scope.go:117] "RemoveContainer" containerID="349992abf8ff19cc1b469706495175298961d271f8fefa8674605089816b5272" Apr 16 16:21:13.295007 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:13.294798 2584 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh" Apr 16 16:21:13.304128 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:13.304109 2584 scope.go:117] "RemoveContainer" containerID="1933cb47fa2b9294c1cb8377ad7a15d4dad95ee67198b229d405d8fc33fed905" Apr 16 16:21:13.317807 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:13.317782 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh"] Apr 16 16:21:13.321193 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:13.321171 2584 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-848465d4f4-qsjmh"] Apr 16 16:21:13.321739 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:13.321715 2584 scope.go:117] "RemoveContainer" containerID="349992abf8ff19cc1b469706495175298961d271f8fefa8674605089816b5272" Apr 16 16:21:13.322024 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:21:13.321999 2584 log.go:32] "ContainerStatus from runtime 
service failed" err="rpc error: code = NotFound desc = could not find container \"349992abf8ff19cc1b469706495175298961d271f8fefa8674605089816b5272\": container with ID starting with 349992abf8ff19cc1b469706495175298961d271f8fefa8674605089816b5272 not found: ID does not exist" containerID="349992abf8ff19cc1b469706495175298961d271f8fefa8674605089816b5272" Apr 16 16:21:13.322150 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:13.322030 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"349992abf8ff19cc1b469706495175298961d271f8fefa8674605089816b5272"} err="failed to get container status \"349992abf8ff19cc1b469706495175298961d271f8fefa8674605089816b5272\": rpc error: code = NotFound desc = could not find container \"349992abf8ff19cc1b469706495175298961d271f8fefa8674605089816b5272\": container with ID starting with 349992abf8ff19cc1b469706495175298961d271f8fefa8674605089816b5272 not found: ID does not exist" Apr 16 16:21:13.322150 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:13.322048 2584 scope.go:117] "RemoveContainer" containerID="1933cb47fa2b9294c1cb8377ad7a15d4dad95ee67198b229d405d8fc33fed905" Apr 16 16:21:13.322287 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:21:13.322270 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1933cb47fa2b9294c1cb8377ad7a15d4dad95ee67198b229d405d8fc33fed905\": container with ID starting with 1933cb47fa2b9294c1cb8377ad7a15d4dad95ee67198b229d405d8fc33fed905 not found: ID does not exist" containerID="1933cb47fa2b9294c1cb8377ad7a15d4dad95ee67198b229d405d8fc33fed905" Apr 16 16:21:13.322326 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:13.322292 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1933cb47fa2b9294c1cb8377ad7a15d4dad95ee67198b229d405d8fc33fed905"} err="failed to get container status 
\"1933cb47fa2b9294c1cb8377ad7a15d4dad95ee67198b229d405d8fc33fed905\": rpc error: code = NotFound desc = could not find container \"1933cb47fa2b9294c1cb8377ad7a15d4dad95ee67198b229d405d8fc33fed905\": container with ID starting with 1933cb47fa2b9294c1cb8377ad7a15d4dad95ee67198b229d405d8fc33fed905 not found: ID does not exist" Apr 16 16:21:14.698765 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:14.698730 2584 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c77ea171-cea0-48ef-8a62-1dd7c6893863" path="/var/lib/kubelet/pods/c77ea171-cea0-48ef-8a62-1dd7c6893863/volumes" Apr 16 16:21:17.788410 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.788374 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v"] Apr 16 16:21:17.788775 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.788685 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c77ea171-cea0-48ef-8a62-1dd7c6893863" containerName="main" Apr 16 16:21:17.788775 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.788696 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="c77ea171-cea0-48ef-8a62-1dd7c6893863" containerName="main" Apr 16 16:21:17.788775 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.788706 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c77ea171-cea0-48ef-8a62-1dd7c6893863" containerName="storage-initializer" Apr 16 16:21:17.788775 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.788712 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="c77ea171-cea0-48ef-8a62-1dd7c6893863" containerName="storage-initializer" Apr 16 16:21:17.788775 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.788774 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="c77ea171-cea0-48ef-8a62-1dd7c6893863" containerName="main" Apr 16 16:21:17.793568 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.793550 2584 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:17.796673 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.796654 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-2fdlg\"" Apr 16 16:21:17.797584 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.797561 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 16 16:21:17.802482 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.802460 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v"] Apr 16 16:21:17.858328 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.858303 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:17.858438 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.858343 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-dshm\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:17.858438 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.858359 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-model-cache\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:17.858438 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.858404 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7zd5t\" (UniqueName: \"kubernetes.io/projected/eafd52ef-5377-43b9-86e1-e200da4052cb-kube-api-access-7zd5t\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:17.858544 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.858496 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-home\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:17.858544 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.858532 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eafd52ef-5377-43b9-86e1-e200da4052cb-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:17.959886 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.959858 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-kserve-provision-location\") pod 
\"scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:17.960042 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.959902 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-dshm\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:17.960042 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.959921 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-model-cache\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:17.960042 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.959947 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7zd5t\" (UniqueName: \"kubernetes.io/projected/eafd52ef-5377-43b9-86e1-e200da4052cb-kube-api-access-7zd5t\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:17.960042 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.960036 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-home\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 
16:21:17.960248 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.960082 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eafd52ef-5377-43b9-86e1-e200da4052cb-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:17.960304 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.960262 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:17.960409 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.960390 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-model-cache\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:17.960409 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.960401 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-home\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:17.962254 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.962228 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-dshm\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:17.962535 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.962516 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eafd52ef-5377-43b9-86e1-e200da4052cb-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:17.968774 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:17.968757 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7zd5t\" (UniqueName: \"kubernetes.io/projected/eafd52ef-5377-43b9-86e1-e200da4052cb-kube-api-access-7zd5t\") pod \"scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:18.104653 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.104575 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:18.121172 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.121147 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf"] Apr 16 16:21:18.125761 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.125738 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:21:18.128511 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.128408 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-5fq82\"" Apr 16 16:21:18.137248 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.137221 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf"] Apr 16 16:21:18.238847 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.238822 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v"] Apr 16 16:21:18.240652 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:21:18.240620 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeafd52ef_5377_43b9_86e1_e200da4052cb.slice/crio-b6b31aad987d8d7f1630f30df3f82b6de3a9bb683cda1ba12ba2c0bdcf0c5832 WatchSource:0}: Error finding container b6b31aad987d8d7f1630f30df3f82b6de3a9bb683cda1ba12ba2c0bdcf0c5832: Status 404 returned error can't find the container with id b6b31aad987d8d7f1630f30df3f82b6de3a9bb683cda1ba12ba2c0bdcf0c5832 Apr 16 16:21:18.262123 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.262096 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf\" (UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:21:18.262211 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.262146 2584 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf\" (UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:21:18.262211 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.262203 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf\" (UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:21:18.262294 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.262254 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9107edf-1fd8-4238-971a-7d681557d119-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf\" (UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:21:18.262333 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.262310 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8ddh\" (UniqueName: \"kubernetes.io/projected/d9107edf-1fd8-4238-971a-7d681557d119-kube-api-access-x8ddh\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf\" (UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:21:18.262377 ip-10-0-131-24 kubenswrapper[2584]: I0416 
16:21:18.262359 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf\" (UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:21:18.311874 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.311846 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" event={"ID":"eafd52ef-5377-43b9-86e1-e200da4052cb","Type":"ContainerStarted","Data":"9a1119a61d4ed1d9983de117ece865ff2902fff2f10b753e55896f68566afa43"} Apr 16 16:21:18.312017 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.311881 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" event={"ID":"eafd52ef-5377-43b9-86e1-e200da4052cb","Type":"ContainerStarted","Data":"b6b31aad987d8d7f1630f30df3f82b6de3a9bb683cda1ba12ba2c0bdcf0c5832"} Apr 16 16:21:18.363614 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.363526 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf\" (UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:21:18.363614 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.363593 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf\" 
(UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:21:18.363614 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.363615 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf\" (UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:21:18.363887 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.363638 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf\" (UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:21:18.363887 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.363667 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9107edf-1fd8-4238-971a-7d681557d119-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf\" (UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:21:18.363887 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.363736 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x8ddh\" (UniqueName: \"kubernetes.io/projected/d9107edf-1fd8-4238-971a-7d681557d119-kube-api-access-x8ddh\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf\" (UID: 
\"d9107edf-1fd8-4238-971a-7d681557d119\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:21:18.364081 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.363993 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf\" (UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:21:18.364081 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.364036 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf\" (UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:21:18.364192 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.364170 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf\" (UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:21:18.364192 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.364179 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf\" (UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:21:18.366196 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.366175 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9107edf-1fd8-4238-971a-7d681557d119-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf\" (UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:21:18.371357 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.371335 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8ddh\" (UniqueName: \"kubernetes.io/projected/d9107edf-1fd8-4238-971a-7d681557d119-kube-api-access-x8ddh\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf\" (UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:21:18.437356 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.437322 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:21:18.583127 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:18.583102 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf"] Apr 16 16:21:18.586200 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:21:18.586166 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9107edf_1fd8_4238_971a_7d681557d119.slice/crio-7ff0e115532d335762ae0a9c9c88d460eb0c8b4b3c8e82f4d504acb7a4261b07 WatchSource:0}: Error finding container 7ff0e115532d335762ae0a9c9c88d460eb0c8b4b3c8e82f4d504acb7a4261b07: Status 404 returned error can't find the container with id 7ff0e115532d335762ae0a9c9c88d460eb0c8b4b3c8e82f4d504acb7a4261b07 Apr 16 16:21:19.316104 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:19.316067 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" event={"ID":"d9107edf-1fd8-4238-971a-7d681557d119","Type":"ContainerStarted","Data":"2c2c44e8b67731b780bda7b03f605b3bd7df81c00725d7e203cfc80200a56a4f"} Apr 16 16:21:19.316104 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:19.316104 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" event={"ID":"d9107edf-1fd8-4238-971a-7d681557d119","Type":"ContainerStarted","Data":"7ff0e115532d335762ae0a9c9c88d460eb0c8b4b3c8e82f4d504acb7a4261b07"} Apr 16 16:21:20.320724 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:20.320686 2584 generic.go:358] "Generic (PLEG): container finished" podID="d9107edf-1fd8-4238-971a-7d681557d119" containerID="2c2c44e8b67731b780bda7b03f605b3bd7df81c00725d7e203cfc80200a56a4f" exitCode=0 Apr 16 16:21:20.320724 ip-10-0-131-24 kubenswrapper[2584]: I0416 
16:21:20.320725 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" event={"ID":"d9107edf-1fd8-4238-971a-7d681557d119","Type":"ContainerDied","Data":"2c2c44e8b67731b780bda7b03f605b3bd7df81c00725d7e203cfc80200a56a4f"} Apr 16 16:21:22.331652 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:22.331582 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" event={"ID":"d9107edf-1fd8-4238-971a-7d681557d119","Type":"ContainerStarted","Data":"1a271ca13d56c0743566b3f3375cb55be04ade947911486f91eaad3b5f6db4e1"} Apr 16 16:21:23.338366 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:23.338279 2584 generic.go:358] "Generic (PLEG): container finished" podID="eafd52ef-5377-43b9-86e1-e200da4052cb" containerID="9a1119a61d4ed1d9983de117ece865ff2902fff2f10b753e55896f68566afa43" exitCode=0 Apr 16 16:21:23.338366 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:23.338347 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" event={"ID":"eafd52ef-5377-43b9-86e1-e200da4052cb","Type":"ContainerDied","Data":"9a1119a61d4ed1d9983de117ece865ff2902fff2f10b753e55896f68566afa43"} Apr 16 16:21:24.344618 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:24.344576 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" event={"ID":"eafd52ef-5377-43b9-86e1-e200da4052cb","Type":"ContainerStarted","Data":"908f723a9235ac33833f605f9e7f174bcfa829f62d197f79692eee5f12e14bc0"} Apr 16 16:21:24.364748 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:24.364694 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" podStartSLOduration=7.364679908 podStartE2EDuration="7.364679908s" 
podCreationTimestamp="2026-04-16 16:21:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:21:24.363307377 +0000 UTC m=+1212.323499102" watchObservedRunningTime="2026-04-16 16:21:24.364679908 +0000 UTC m=+1212.324871633" Apr 16 16:21:28.104899 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:28.104863 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:28.105352 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:28.104916 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:28.120308 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:28.120280 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:28.372999 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:28.372904 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:50.908761 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:50.908717 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v"] Apr 16 16:21:50.909307 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:50.909048 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" podUID="eafd52ef-5377-43b9-86e1-e200da4052cb" containerName="main" containerID="cri-o://908f723a9235ac33833f605f9e7f174bcfa829f62d197f79692eee5f12e14bc0" gracePeriod=30 Apr 16 16:21:50.916897 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:50.916829 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf"] Apr 16 16:21:51.231281 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.231255 2584 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:51.259277 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.259255 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7zd5t\" (UniqueName: \"kubernetes.io/projected/eafd52ef-5377-43b9-86e1-e200da4052cb-kube-api-access-7zd5t\") pod \"eafd52ef-5377-43b9-86e1-e200da4052cb\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " Apr 16 16:21:51.259432 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.259314 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-kserve-provision-location\") pod \"eafd52ef-5377-43b9-86e1-e200da4052cb\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " Apr 16 16:21:51.259498 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.259417 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eafd52ef-5377-43b9-86e1-e200da4052cb-tls-certs\") pod \"eafd52ef-5377-43b9-86e1-e200da4052cb\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " Apr 16 16:21:51.259780 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.259617 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-model-cache\") pod \"eafd52ef-5377-43b9-86e1-e200da4052cb\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " Apr 16 16:21:51.259780 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.259674 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"dshm\" (UniqueName: \"kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-dshm\") pod \"eafd52ef-5377-43b9-86e1-e200da4052cb\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " Apr 16 16:21:51.259780 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.259701 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-home\") pod \"eafd52ef-5377-43b9-86e1-e200da4052cb\" (UID: \"eafd52ef-5377-43b9-86e1-e200da4052cb\") " Apr 16 16:21:51.260052 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.259826 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-model-cache" (OuterVolumeSpecName: "model-cache") pod "eafd52ef-5377-43b9-86e1-e200da4052cb" (UID: "eafd52ef-5377-43b9-86e1-e200da4052cb"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:21:51.260052 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.260034 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-home" (OuterVolumeSpecName: "home") pod "eafd52ef-5377-43b9-86e1-e200da4052cb" (UID: "eafd52ef-5377-43b9-86e1-e200da4052cb"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:21:51.260161 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.260064 2584 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-model-cache\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:21:51.262026 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.261998 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eafd52ef-5377-43b9-86e1-e200da4052cb-kube-api-access-7zd5t" (OuterVolumeSpecName: "kube-api-access-7zd5t") pod "eafd52ef-5377-43b9-86e1-e200da4052cb" (UID: "eafd52ef-5377-43b9-86e1-e200da4052cb"). InnerVolumeSpecName "kube-api-access-7zd5t". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:21:51.262331 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.262308 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eafd52ef-5377-43b9-86e1-e200da4052cb-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "eafd52ef-5377-43b9-86e1-e200da4052cb" (UID: "eafd52ef-5377-43b9-86e1-e200da4052cb"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:21:51.262331 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.262322 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-dshm" (OuterVolumeSpecName: "dshm") pod "eafd52ef-5377-43b9-86e1-e200da4052cb" (UID: "eafd52ef-5377-43b9-86e1-e200da4052cb"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:21:51.323781 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.323754 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "eafd52ef-5377-43b9-86e1-e200da4052cb" (UID: "eafd52ef-5377-43b9-86e1-e200da4052cb"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:21:51.361084 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.361060 2584 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-home\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:21:51.361187 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.361103 2584 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7zd5t\" (UniqueName: \"kubernetes.io/projected/eafd52ef-5377-43b9-86e1-e200da4052cb-kube-api-access-7zd5t\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:21:51.361187 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.361121 2584 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-kserve-provision-location\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:21:51.361187 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.361134 2584 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eafd52ef-5377-43b9-86e1-e200da4052cb-tls-certs\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:21:51.361187 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.361148 2584 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/eafd52ef-5377-43b9-86e1-e200da4052cb-dshm\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:21:51.454023 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.453991 2584 generic.go:358] "Generic (PLEG): container finished" podID="eafd52ef-5377-43b9-86e1-e200da4052cb" containerID="908f723a9235ac33833f605f9e7f174bcfa829f62d197f79692eee5f12e14bc0" exitCode=0 Apr 16 16:21:51.454154 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.454064 2584 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" Apr 16 16:21:51.454154 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.454063 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" event={"ID":"eafd52ef-5377-43b9-86e1-e200da4052cb","Type":"ContainerDied","Data":"908f723a9235ac33833f605f9e7f174bcfa829f62d197f79692eee5f12e14bc0"} Apr 16 16:21:51.454154 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.454112 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v" event={"ID":"eafd52ef-5377-43b9-86e1-e200da4052cb","Type":"ContainerDied","Data":"b6b31aad987d8d7f1630f30df3f82b6de3a9bb683cda1ba12ba2c0bdcf0c5832"} Apr 16 16:21:51.454154 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.454138 2584 scope.go:117] "RemoveContainer" containerID="908f723a9235ac33833f605f9e7f174bcfa829f62d197f79692eee5f12e14bc0" Apr 16 16:21:51.456406 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.456385 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" event={"ID":"d9107edf-1fd8-4238-971a-7d681557d119","Type":"ContainerStarted","Data":"5d50c2097716cee2bb75a86d0cdf5bf99e5d04b425ad4e914c912f3e4c88ad42"} Apr 16 16:21:51.456622 ip-10-0-131-24 kubenswrapper[2584]: 
I0416 16:21:51.456583 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" podUID="d9107edf-1fd8-4238-971a-7d681557d119" containerName="tokenizer" containerID="cri-o://5d50c2097716cee2bb75a86d0cdf5bf99e5d04b425ad4e914c912f3e4c88ad42" gracePeriod=30 Apr 16 16:21:51.456714 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.456636 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:21:51.456714 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.456567 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" podUID="d9107edf-1fd8-4238-971a-7d681557d119" containerName="main" containerID="cri-o://1a271ca13d56c0743566b3f3375cb55be04ade947911486f91eaad3b5f6db4e1" gracePeriod=30 Apr 16 16:21:51.459472 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.459423 2584 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" podUID="d9107edf-1fd8-4238-971a-7d681557d119" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 16 16:21:51.464023 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.464006 2584 scope.go:117] "RemoveContainer" containerID="9a1119a61d4ed1d9983de117ece865ff2902fff2f10b753e55896f68566afa43" Apr 16 16:21:51.474511 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.474497 2584 scope.go:117] "RemoveContainer" containerID="908f723a9235ac33833f605f9e7f174bcfa829f62d197f79692eee5f12e14bc0" Apr 16 16:21:51.474764 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:21:51.474747 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"908f723a9235ac33833f605f9e7f174bcfa829f62d197f79692eee5f12e14bc0\": container with ID starting with 908f723a9235ac33833f605f9e7f174bcfa829f62d197f79692eee5f12e14bc0 not found: ID does not exist" containerID="908f723a9235ac33833f605f9e7f174bcfa829f62d197f79692eee5f12e14bc0" Apr 16 16:21:51.474819 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.474770 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"908f723a9235ac33833f605f9e7f174bcfa829f62d197f79692eee5f12e14bc0"} err="failed to get container status \"908f723a9235ac33833f605f9e7f174bcfa829f62d197f79692eee5f12e14bc0\": rpc error: code = NotFound desc = could not find container \"908f723a9235ac33833f605f9e7f174bcfa829f62d197f79692eee5f12e14bc0\": container with ID starting with 908f723a9235ac33833f605f9e7f174bcfa829f62d197f79692eee5f12e14bc0 not found: ID does not exist" Apr 16 16:21:51.474819 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.474785 2584 scope.go:117] "RemoveContainer" containerID="9a1119a61d4ed1d9983de117ece865ff2902fff2f10b753e55896f68566afa43" Apr 16 16:21:51.475014 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:21:51.474994 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a1119a61d4ed1d9983de117ece865ff2902fff2f10b753e55896f68566afa43\": container with ID starting with 9a1119a61d4ed1d9983de117ece865ff2902fff2f10b753e55896f68566afa43 not found: ID does not exist" containerID="9a1119a61d4ed1d9983de117ece865ff2902fff2f10b753e55896f68566afa43" Apr 16 16:21:51.475071 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.475023 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a1119a61d4ed1d9983de117ece865ff2902fff2f10b753e55896f68566afa43"} err="failed to get container status \"9a1119a61d4ed1d9983de117ece865ff2902fff2f10b753e55896f68566afa43\": rpc error: code = NotFound desc = could not find container 
\"9a1119a61d4ed1d9983de117ece865ff2902fff2f10b753e55896f68566afa43\": container with ID starting with 9a1119a61d4ed1d9983de117ece865ff2902fff2f10b753e55896f68566afa43 not found: ID does not exist" Apr 16 16:21:51.484024 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.483983 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" podStartSLOduration=2.501407384 podStartE2EDuration="33.483972146s" podCreationTimestamp="2026-04-16 16:21:18 +0000 UTC" firstStartedPulling="2026-04-16 16:21:20.321918491 +0000 UTC m=+1208.282110193" lastFinishedPulling="2026-04-16 16:21:51.30448325 +0000 UTC m=+1239.264674955" observedRunningTime="2026-04-16 16:21:51.482038894 +0000 UTC m=+1239.442230620" watchObservedRunningTime="2026-04-16 16:21:51.483972146 +0000 UTC m=+1239.444163870" Apr 16 16:21:51.496335 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.496312 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v"] Apr 16 16:21:51.504389 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:51.504365 2584 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-59f6d9dbc4-mkt7v"] Apr 16 16:21:52.462197 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:52.462164 2584 generic.go:358] "Generic (PLEG): container finished" podID="d9107edf-1fd8-4238-971a-7d681557d119" containerID="1a271ca13d56c0743566b3f3375cb55be04ade947911486f91eaad3b5f6db4e1" exitCode=0 Apr 16 16:21:52.462550 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:52.462238 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" event={"ID":"d9107edf-1fd8-4238-971a-7d681557d119","Type":"ContainerDied","Data":"1a271ca13d56c0743566b3f3375cb55be04ade947911486f91eaad3b5f6db4e1"} Apr 16 16:21:52.699136 ip-10-0-131-24 
kubenswrapper[2584]: I0416 16:21:52.699104 2584 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eafd52ef-5377-43b9-86e1-e200da4052cb" path="/var/lib/kubelet/pods/eafd52ef-5377-43b9-86e1-e200da4052cb/volumes" Apr 16 16:21:58.438202 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:21:58.438161 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:22:01.457573 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:22:01.457542 2584 logging.go:55] [core] [Channel #20 SubChannel #21]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.29:9003", ServerName: "10.133.0.29:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.29:9003: connect: connection refused" Apr 16 16:22:02.457796 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:02.457751 2584 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" podUID="d9107edf-1fd8-4238-971a-7d681557d119" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.29:9003\" within 1s: context deadline exceeded" Apr 16 16:22:11.457296 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:22:11.457268 2584 logging.go:55] [core] [Channel #22 SubChannel #23]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.29:9003", ServerName: "10.133.0.29:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.29:9003: connect: connection refused" Apr 16 16:22:12.457154 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:12.457106 2584 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" podUID="d9107edf-1fd8-4238-971a-7d681557d119" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.29:9003\" within 1s: context deadline exceeded" Apr 16 16:22:21.457444 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:22:21.457414 2584 logging.go:55] [core] [Channel #24 SubChannel #25]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.29:9003", ServerName: "10.133.0.29:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.29:9003: connect: connection refused" Apr 16 16:22:21.557032 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:21.557007 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf_d9107edf-1fd8-4238-971a-7d681557d119/tokenizer/0.log" Apr 16 16:22:21.557622 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:21.557600 2584 generic.go:358] "Generic (PLEG): container finished" podID="d9107edf-1fd8-4238-971a-7d681557d119" containerID="5d50c2097716cee2bb75a86d0cdf5bf99e5d04b425ad4e914c912f3e4c88ad42" exitCode=137 Apr 16 16:22:21.557677 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:21.557641 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" event={"ID":"d9107edf-1fd8-4238-971a-7d681557d119","Type":"ContainerDied","Data":"5d50c2097716cee2bb75a86d0cdf5bf99e5d04b425ad4e914c912f3e4c88ad42"} Apr 16 16:22:22.106362 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.106341 2584 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf_d9107edf-1fd8-4238-971a-7d681557d119/tokenizer/0.log" Apr 16 16:22:22.107057 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.107038 2584 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:22:22.202669 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.202645 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9107edf-1fd8-4238-971a-7d681557d119-tls-certs\") pod \"d9107edf-1fd8-4238-971a-7d681557d119\" (UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " Apr 16 16:22:22.202799 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.202691 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-tokenizer-tmp\") pod \"d9107edf-1fd8-4238-971a-7d681557d119\" (UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " Apr 16 16:22:22.202799 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.202716 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-kserve-provision-location\") pod \"d9107edf-1fd8-4238-971a-7d681557d119\" (UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " Apr 16 16:22:22.202799 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.202763 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-tokenizer-cache\") pod \"d9107edf-1fd8-4238-971a-7d681557d119\" (UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " Apr 16 16:22:22.202993 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.202803 2584 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8ddh\" (UniqueName: \"kubernetes.io/projected/d9107edf-1fd8-4238-971a-7d681557d119-kube-api-access-x8ddh\") pod \"d9107edf-1fd8-4238-971a-7d681557d119\" (UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " Apr 16 16:22:22.202993 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.202825 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-tokenizer-uds\") pod \"d9107edf-1fd8-4238-971a-7d681557d119\" (UID: \"d9107edf-1fd8-4238-971a-7d681557d119\") " Apr 16 16:22:22.203114 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.203089 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "d9107edf-1fd8-4238-971a-7d681557d119" (UID: "d9107edf-1fd8-4238-971a-7d681557d119"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:22:22.203114 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.203101 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "d9107edf-1fd8-4238-971a-7d681557d119" (UID: "d9107edf-1fd8-4238-971a-7d681557d119"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:22:22.203294 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.203254 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "d9107edf-1fd8-4238-971a-7d681557d119" (UID: "d9107edf-1fd8-4238-971a-7d681557d119"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:22:22.203591 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.203570 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d9107edf-1fd8-4238-971a-7d681557d119" (UID: "d9107edf-1fd8-4238-971a-7d681557d119"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:22:22.205003 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.204977 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9107edf-1fd8-4238-971a-7d681557d119-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d9107edf-1fd8-4238-971a-7d681557d119" (UID: "d9107edf-1fd8-4238-971a-7d681557d119"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:22:22.205003 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.204988 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9107edf-1fd8-4238-971a-7d681557d119-kube-api-access-x8ddh" (OuterVolumeSpecName: "kube-api-access-x8ddh") pod "d9107edf-1fd8-4238-971a-7d681557d119" (UID: "d9107edf-1fd8-4238-971a-7d681557d119"). InnerVolumeSpecName "kube-api-access-x8ddh". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:22:22.304038 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.303974 2584 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-x8ddh\" (UniqueName: \"kubernetes.io/projected/d9107edf-1fd8-4238-971a-7d681557d119-kube-api-access-x8ddh\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:22:22.304038 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.304006 2584 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-tokenizer-uds\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:22:22.304038 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.304016 2584 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9107edf-1fd8-4238-971a-7d681557d119-tls-certs\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:22:22.304038 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.304025 2584 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-tokenizer-tmp\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:22:22.304038 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.304033 2584 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-kserve-provision-location\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:22:22.304038 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.304041 2584 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9107edf-1fd8-4238-971a-7d681557d119-tokenizer-cache\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:22:22.458004 ip-10-0-131-24 kubenswrapper[2584]: 
I0416 16:22:22.457973 2584 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" podUID="d9107edf-1fd8-4238-971a-7d681557d119" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.29:9003\" within 1s: context deadline exceeded" Apr 16 16:22:22.563031 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.562949 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf_d9107edf-1fd8-4238-971a-7d681557d119/tokenizer/0.log" Apr 16 16:22:22.563717 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.563687 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" event={"ID":"d9107edf-1fd8-4238-971a-7d681557d119","Type":"ContainerDied","Data":"7ff0e115532d335762ae0a9c9c88d460eb0c8b4b3c8e82f4d504acb7a4261b07"} Apr 16 16:22:22.563822 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.563739 2584 scope.go:117] "RemoveContainer" containerID="5d50c2097716cee2bb75a86d0cdf5bf99e5d04b425ad4e914c912f3e4c88ad42" Apr 16 16:22:22.563822 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.563747 2584 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf" Apr 16 16:22:22.571816 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.571648 2584 scope.go:117] "RemoveContainer" containerID="1a271ca13d56c0743566b3f3375cb55be04ade947911486f91eaad3b5f6db4e1" Apr 16 16:22:22.578903 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.578882 2584 scope.go:117] "RemoveContainer" containerID="2c2c44e8b67731b780bda7b03f605b3bd7df81c00725d7e203cfc80200a56a4f" Apr 16 16:22:22.586752 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.586733 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf"] Apr 16 16:22:22.590085 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.590062 2584 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7df4bf9crddf"] Apr 16 16:22:22.702790 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:22.702756 2584 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9107edf-1fd8-4238-971a-7d681557d119" path="/var/lib/kubelet/pods/d9107edf-1fd8-4238-971a-7d681557d119/volumes" Apr 16 16:22:48.659077 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.659036 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k"] Apr 16 16:22:48.659657 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.659375 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9107edf-1fd8-4238-971a-7d681557d119" containerName="storage-initializer" Apr 16 16:22:48.659657 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.659385 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9107edf-1fd8-4238-971a-7d681557d119" containerName="storage-initializer" Apr 16 16:22:48.659657 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.659394 2584 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="d9107edf-1fd8-4238-971a-7d681557d119" containerName="main" Apr 16 16:22:48.659657 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.659399 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9107edf-1fd8-4238-971a-7d681557d119" containerName="main" Apr 16 16:22:48.659657 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.659413 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eafd52ef-5377-43b9-86e1-e200da4052cb" containerName="main" Apr 16 16:22:48.659657 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.659418 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="eafd52ef-5377-43b9-86e1-e200da4052cb" containerName="main" Apr 16 16:22:48.659657 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.659426 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eafd52ef-5377-43b9-86e1-e200da4052cb" containerName="storage-initializer" Apr 16 16:22:48.659657 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.659432 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="eafd52ef-5377-43b9-86e1-e200da4052cb" containerName="storage-initializer" Apr 16 16:22:48.659657 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.659439 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9107edf-1fd8-4238-971a-7d681557d119" containerName="tokenizer" Apr 16 16:22:48.659657 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.659444 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9107edf-1fd8-4238-971a-7d681557d119" containerName="tokenizer" Apr 16 16:22:48.659657 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.659490 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="eafd52ef-5377-43b9-86e1-e200da4052cb" containerName="main" Apr 16 16:22:48.659657 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.659498 2584 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="d9107edf-1fd8-4238-971a-7d681557d119" containerName="tokenizer" Apr 16 16:22:48.659657 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.659507 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9107edf-1fd8-4238-971a-7d681557d119" containerName="main" Apr 16 16:22:48.661598 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.661578 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" Apr 16 16:22:48.664025 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.664003 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-2fdlg\"" Apr 16 16:22:48.664141 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.664043 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"conv-test-lora-crit-kserve-self-signed-certs\"" Apr 16 16:22:48.672693 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.672672 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k"] Apr 16 16:22:48.796871 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.796839 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkpc2\" (UniqueName: \"kubernetes.io/projected/51e4e537-20b9-4094-8fc0-f5892380c80d-kube-api-access-vkpc2\") pod \"conv-test-lora-crit-kserve-8465f78db8-n6z4k\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" Apr 16 16:22:48.797019 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.796883 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-kserve-provision-location\") pod 
\"conv-test-lora-crit-kserve-8465f78db8-n6z4k\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" Apr 16 16:22:48.797019 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.796940 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51e4e537-20b9-4094-8fc0-f5892380c80d-tls-certs\") pod \"conv-test-lora-crit-kserve-8465f78db8-n6z4k\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" Apr 16 16:22:48.797019 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.796992 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-dshm\") pod \"conv-test-lora-crit-kserve-8465f78db8-n6z4k\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" Apr 16 16:22:48.797124 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.797041 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-home\") pod \"conv-test-lora-crit-kserve-8465f78db8-n6z4k\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" Apr 16 16:22:48.797162 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.797117 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-model-cache\") pod \"conv-test-lora-crit-kserve-8465f78db8-n6z4k\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" Apr 16 16:22:48.898323 
ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.898292 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-model-cache\") pod \"conv-test-lora-crit-kserve-8465f78db8-n6z4k\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" Apr 16 16:22:48.898475 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.898338 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vkpc2\" (UniqueName: \"kubernetes.io/projected/51e4e537-20b9-4094-8fc0-f5892380c80d-kube-api-access-vkpc2\") pod \"conv-test-lora-crit-kserve-8465f78db8-n6z4k\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" Apr 16 16:22:48.898475 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.898368 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-kserve-provision-location\") pod \"conv-test-lora-crit-kserve-8465f78db8-n6z4k\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" Apr 16 16:22:48.898475 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.898390 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51e4e537-20b9-4094-8fc0-f5892380c80d-tls-certs\") pod \"conv-test-lora-crit-kserve-8465f78db8-n6z4k\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" Apr 16 16:22:48.898475 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.898418 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-dshm\") pod \"conv-test-lora-crit-kserve-8465f78db8-n6z4k\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" Apr 16 16:22:48.898738 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.898474 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-home\") pod \"conv-test-lora-crit-kserve-8465f78db8-n6z4k\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" Apr 16 16:22:48.898738 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.898661 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-model-cache\") pod \"conv-test-lora-crit-kserve-8465f78db8-n6z4k\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" Apr 16 16:22:48.898854 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.898791 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-home\") pod \"conv-test-lora-crit-kserve-8465f78db8-n6z4k\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" Apr 16 16:22:48.898921 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.898902 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-kserve-provision-location\") pod \"conv-test-lora-crit-kserve-8465f78db8-n6z4k\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" Apr 16 
16:22:48.900730 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.900707 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-dshm\") pod \"conv-test-lora-crit-kserve-8465f78db8-n6z4k\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" Apr 16 16:22:48.900919 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.900899 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51e4e537-20b9-4094-8fc0-f5892380c80d-tls-certs\") pod \"conv-test-lora-crit-kserve-8465f78db8-n6z4k\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" Apr 16 16:22:48.906760 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.906739 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkpc2\" (UniqueName: \"kubernetes.io/projected/51e4e537-20b9-4094-8fc0-f5892380c80d-kube-api-access-vkpc2\") pod \"conv-test-lora-crit-kserve-8465f78db8-n6z4k\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" Apr 16 16:22:48.973238 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:48.973215 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" Apr 16 16:22:49.106176 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:49.106143 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k"] Apr 16 16:22:49.109748 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:22:49.109714 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51e4e537_20b9_4094_8fc0_f5892380c80d.slice/crio-e46a4a146da7d85b36c6fc65f24c593e40a50b51e488bddb7cb082d4dc21f012 WatchSource:0}: Error finding container e46a4a146da7d85b36c6fc65f24c593e40a50b51e488bddb7cb082d4dc21f012: Status 404 returned error can't find the container with id e46a4a146da7d85b36c6fc65f24c593e40a50b51e488bddb7cb082d4dc21f012 Apr 16 16:22:49.654862 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:49.654828 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" event={"ID":"51e4e537-20b9-4094-8fc0-f5892380c80d","Type":"ContainerStarted","Data":"23e3098a1b4b1cbf52db6ca92dfc490d7846188d2463760bbd09719f3083274f"} Apr 16 16:22:49.655058 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:49.654868 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" event={"ID":"51e4e537-20b9-4094-8fc0-f5892380c80d","Type":"ContainerStarted","Data":"e46a4a146da7d85b36c6fc65f24c593e40a50b51e488bddb7cb082d4dc21f012"} Apr 16 16:22:50.659799 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:50.659764 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-8465f78db8-n6z4k_51e4e537-20b9-4094-8fc0-f5892380c80d/storage-initializer/0.log" Apr 16 16:22:50.660217 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:50.659809 2584 generic.go:358] "Generic (PLEG): container finished" 
podID="51e4e537-20b9-4094-8fc0-f5892380c80d" containerID="23e3098a1b4b1cbf52db6ca92dfc490d7846188d2463760bbd09719f3083274f" exitCode=1 Apr 16 16:22:50.660217 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:50.659848 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" event={"ID":"51e4e537-20b9-4094-8fc0-f5892380c80d","Type":"ContainerDied","Data":"23e3098a1b4b1cbf52db6ca92dfc490d7846188d2463760bbd09719f3083274f"} Apr 16 16:22:51.664357 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:51.664325 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-8465f78db8-n6z4k_51e4e537-20b9-4094-8fc0-f5892380c80d/storage-initializer/1.log" Apr 16 16:22:51.664771 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:51.664673 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-8465f78db8-n6z4k_51e4e537-20b9-4094-8fc0-f5892380c80d/storage-initializer/0.log" Apr 16 16:22:51.664771 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:51.664708 2584 generic.go:358] "Generic (PLEG): container finished" podID="51e4e537-20b9-4094-8fc0-f5892380c80d" containerID="6f72c1633f7c51085c5aa189f0d03af9083282e7e7e6735f2c39b172d603a36a" exitCode=1 Apr 16 16:22:51.664843 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:51.664770 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" event={"ID":"51e4e537-20b9-4094-8fc0-f5892380c80d","Type":"ContainerDied","Data":"6f72c1633f7c51085c5aa189f0d03af9083282e7e7e6735f2c39b172d603a36a"} Apr 16 16:22:51.664843 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:51.664808 2584 scope.go:117] "RemoveContainer" containerID="23e3098a1b4b1cbf52db6ca92dfc490d7846188d2463760bbd09719f3083274f" Apr 16 16:22:51.665079 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:51.665057 2584 scope.go:117] "RemoveContainer" 
containerID="23e3098a1b4b1cbf52db6ca92dfc490d7846188d2463760bbd09719f3083274f" Apr 16 16:22:51.674603 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:22:51.674574 2584 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_conv-test-lora-crit-kserve-8465f78db8-n6z4k_kserve-ci-e2e-test_51e4e537-20b9-4094-8fc0-f5892380c80d_0 in pod sandbox e46a4a146da7d85b36c6fc65f24c593e40a50b51e488bddb7cb082d4dc21f012 from index: no such id: '23e3098a1b4b1cbf52db6ca92dfc490d7846188d2463760bbd09719f3083274f'" containerID="23e3098a1b4b1cbf52db6ca92dfc490d7846188d2463760bbd09719f3083274f" Apr 16 16:22:51.674661 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:51.674617 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"23e3098a1b4b1cbf52db6ca92dfc490d7846188d2463760bbd09719f3083274f"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_conv-test-lora-crit-kserve-8465f78db8-n6z4k_kserve-ci-e2e-test_51e4e537-20b9-4094-8fc0-f5892380c80d_0 in pod sandbox e46a4a146da7d85b36c6fc65f24c593e40a50b51e488bddb7cb082d4dc21f012 from index: no such id: '23e3098a1b4b1cbf52db6ca92dfc490d7846188d2463760bbd09719f3083274f'" Apr 16 16:22:51.674790 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:22:51.674771 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=conv-test-lora-crit-kserve-8465f78db8-n6z4k_kserve-ci-e2e-test(51e4e537-20b9-4094-8fc0-f5892380c80d)\"" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" podUID="51e4e537-20b9-4094-8fc0-f5892380c80d" Apr 16 16:22:52.438660 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.438630 2584 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm"] Apr 16 16:22:52.441839 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.441818 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:22:52.444645 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.444624 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-m6d5g\"" Apr 16 16:22:52.444737 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.444664 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 16:22:52.457582 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.457560 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm"] Apr 16 16:22:52.626154 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.626120 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:22:52.626545 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.626182 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 
16:22:52.626545 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.626220 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:22:52.626545 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.626250 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0938d657-15ac-4b62-91d5-1e0f3aa54647-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:22:52.626545 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.626276 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:22:52.626545 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.626374 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqd76\" (UniqueName: \"kubernetes.io/projected/0938d657-15ac-4b62-91d5-1e0f3aa54647-kube-api-access-mqd76\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" 
Apr 16 16:22:52.670402 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.670379 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-8465f78db8-n6z4k_51e4e537-20b9-4094-8fc0-f5892380c80d/storage-initializer/1.log" Apr 16 16:22:52.670912 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:22:52.670893 2584 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=conv-test-lora-crit-kserve-8465f78db8-n6z4k_kserve-ci-e2e-test(51e4e537-20b9-4094-8fc0-f5892380c80d)\"" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" podUID="51e4e537-20b9-4094-8fc0-f5892380c80d" Apr 16 16:22:52.727328 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.727261 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:22:52.727328 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.727296 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:22:52.727502 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.727325 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:22:52.727502 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.727354 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0938d657-15ac-4b62-91d5-1e0f3aa54647-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:22:52.727502 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.727378 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:22:52.727502 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.727435 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mqd76\" (UniqueName: \"kubernetes.io/projected/0938d657-15ac-4b62-91d5-1e0f3aa54647-kube-api-access-mqd76\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:22:52.727696 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.727662 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-tokenizer-tmp\") pod 
\"stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:22:52.727759 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.727712 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:22:52.727759 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.727744 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:22:52.727868 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.727842 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:22:52.729833 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.729814 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0938d657-15ac-4b62-91d5-1e0f3aa54647-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:22:52.735935 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.735904 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqd76\" (UniqueName: \"kubernetes.io/projected/0938d657-15ac-4b62-91d5-1e0f3aa54647-kube-api-access-mqd76\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:22:52.751179 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.751155 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:22:52.878330 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:52.878309 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm"] Apr 16 16:22:52.880493 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:22:52.880465 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0938d657_15ac_4b62_91d5_1e0f3aa54647.slice/crio-aebe46304faf2a0f8a681c2c001e5c3e50240a42a6358cf819d1667e578c644e WatchSource:0}: Error finding container aebe46304faf2a0f8a681c2c001e5c3e50240a42a6358cf819d1667e578c644e: Status 404 returned error can't find the container with id aebe46304faf2a0f8a681c2c001e5c3e50240a42a6358cf819d1667e578c644e Apr 16 16:22:53.674996 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:53.674948 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" event={"ID":"0938d657-15ac-4b62-91d5-1e0f3aa54647","Type":"ContainerStarted","Data":"f61cee7c762906349a4427718d9c0c2ad2854481b897a440e9e299dd957edd3e"} Apr 16 16:22:53.674996 ip-10-0-131-24 
kubenswrapper[2584]: I0416 16:22:53.675001 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" event={"ID":"0938d657-15ac-4b62-91d5-1e0f3aa54647","Type":"ContainerStarted","Data":"aebe46304faf2a0f8a681c2c001e5c3e50240a42a6358cf819d1667e578c644e"} Apr 16 16:22:54.054149 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.054071 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k"] Apr 16 16:22:54.184043 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.184024 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-8465f78db8-n6z4k_51e4e537-20b9-4094-8fc0-f5892380c80d/storage-initializer/1.log" Apr 16 16:22:54.184149 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.184088 2584 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" Apr 16 16:22:54.343594 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.343495 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkpc2\" (UniqueName: \"kubernetes.io/projected/51e4e537-20b9-4094-8fc0-f5892380c80d-kube-api-access-vkpc2\") pod \"51e4e537-20b9-4094-8fc0-f5892380c80d\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " Apr 16 16:22:54.343594 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.343553 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-kserve-provision-location\") pod \"51e4e537-20b9-4094-8fc0-f5892380c80d\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " Apr 16 16:22:54.343594 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.343581 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-model-cache\") pod \"51e4e537-20b9-4094-8fc0-f5892380c80d\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " Apr 16 16:22:54.344021 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.343633 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-home\") pod \"51e4e537-20b9-4094-8fc0-f5892380c80d\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " Apr 16 16:22:54.344021 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.343698 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51e4e537-20b9-4094-8fc0-f5892380c80d-tls-certs\") pod \"51e4e537-20b9-4094-8fc0-f5892380c80d\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " Apr 16 16:22:54.344021 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.343758 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-dshm\") pod \"51e4e537-20b9-4094-8fc0-f5892380c80d\" (UID: \"51e4e537-20b9-4094-8fc0-f5892380c80d\") " Apr 16 16:22:54.344021 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.343835 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-model-cache" (OuterVolumeSpecName: "model-cache") pod "51e4e537-20b9-4094-8fc0-f5892380c80d" (UID: "51e4e537-20b9-4094-8fc0-f5892380c80d"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:22:54.344021 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.343914 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-home" (OuterVolumeSpecName: "home") pod "51e4e537-20b9-4094-8fc0-f5892380c80d" (UID: "51e4e537-20b9-4094-8fc0-f5892380c80d"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:22:54.344021 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.344005 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "51e4e537-20b9-4094-8fc0-f5892380c80d" (UID: "51e4e537-20b9-4094-8fc0-f5892380c80d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:22:54.344491 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.344157 2584 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-kserve-provision-location\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:22:54.344491 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.344174 2584 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-model-cache\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:22:54.344491 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.344188 2584 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-home\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:22:54.345901 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.345876 2584 
operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51e4e537-20b9-4094-8fc0-f5892380c80d-kube-api-access-vkpc2" (OuterVolumeSpecName: "kube-api-access-vkpc2") pod "51e4e537-20b9-4094-8fc0-f5892380c80d" (UID: "51e4e537-20b9-4094-8fc0-f5892380c80d"). InnerVolumeSpecName "kube-api-access-vkpc2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:22:54.346021 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.345987 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-dshm" (OuterVolumeSpecName: "dshm") pod "51e4e537-20b9-4094-8fc0-f5892380c80d" (UID: "51e4e537-20b9-4094-8fc0-f5892380c80d"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:22:54.346091 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.346075 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/51e4e537-20b9-4094-8fc0-f5892380c80d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "51e4e537-20b9-4094-8fc0-f5892380c80d" (UID: "51e4e537-20b9-4094-8fc0-f5892380c80d"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:22:54.445426 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.445403 2584 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/51e4e537-20b9-4094-8fc0-f5892380c80d-dshm\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:22:54.445426 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.445425 2584 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vkpc2\" (UniqueName: \"kubernetes.io/projected/51e4e537-20b9-4094-8fc0-f5892380c80d-kube-api-access-vkpc2\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:22:54.445544 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.445435 2584 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/51e4e537-20b9-4094-8fc0-f5892380c80d-tls-certs\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:22:54.679801 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.679778 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-8465f78db8-n6z4k_51e4e537-20b9-4094-8fc0-f5892380c80d/storage-initializer/1.log" Apr 16 16:22:54.680204 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.679868 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" event={"ID":"51e4e537-20b9-4094-8fc0-f5892380c80d","Type":"ContainerDied","Data":"e46a4a146da7d85b36c6fc65f24c593e40a50b51e488bddb7cb082d4dc21f012"} Apr 16 16:22:54.680204 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.679892 2584 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k" Apr 16 16:22:54.680204 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.679900 2584 scope.go:117] "RemoveContainer" containerID="6f72c1633f7c51085c5aa189f0d03af9083282e7e7e6735f2c39b172d603a36a" Apr 16 16:22:54.681531 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.681503 2584 generic.go:358] "Generic (PLEG): container finished" podID="0938d657-15ac-4b62-91d5-1e0f3aa54647" containerID="f61cee7c762906349a4427718d9c0c2ad2854481b897a440e9e299dd957edd3e" exitCode=0 Apr 16 16:22:54.681678 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.681547 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" event={"ID":"0938d657-15ac-4b62-91d5-1e0f3aa54647","Type":"ContainerDied","Data":"f61cee7c762906349a4427718d9c0c2ad2854481b897a440e9e299dd957edd3e"} Apr 16 16:22:54.738969 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.738933 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k"] Apr 16 16:22:54.752015 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:54.751989 2584 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-8465f78db8-n6z4k"] Apr 16 16:22:55.687618 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:55.687569 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" event={"ID":"0938d657-15ac-4b62-91d5-1e0f3aa54647","Type":"ContainerStarted","Data":"d47605192a2ae0e4a2dcd66998f70eb37c1fa94c0d871ed91b4503698857c07b"} Apr 16 16:22:55.687618 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:55.687616 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" 
event={"ID":"0938d657-15ac-4b62-91d5-1e0f3aa54647","Type":"ContainerStarted","Data":"9498491551ed4d9df7fc11ae5037b115c4fd4011fbb8b30ab485184c7cf01dbf"} Apr 16 16:22:55.688046 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:55.687742 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:22:55.709911 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:55.709857 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" podStartSLOduration=3.709840792 podStartE2EDuration="3.709840792s" podCreationTimestamp="2026-04-16 16:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:22:55.708683333 +0000 UTC m=+1303.668875057" watchObservedRunningTime="2026-04-16 16:22:55.709840792 +0000 UTC m=+1303.670032519" Apr 16 16:22:56.699214 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:22:56.699182 2584 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51e4e537-20b9-4094-8fc0-f5892380c80d" path="/var/lib/kubelet/pods/51e4e537-20b9-4094-8fc0-f5892380c80d/volumes" Apr 16 16:23:02.751377 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:23:02.751342 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:23:02.751770 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:23:02.751390 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:23:02.753914 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:23:02.753893 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:23:03.716933 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:23:03.716902 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:23:24.720891 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:23:24.720851 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:25:14.144367 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:14.144295 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm"] Apr 16 16:25:14.145996 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:14.144610 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" podUID="0938d657-15ac-4b62-91d5-1e0f3aa54647" containerName="main" containerID="cri-o://9498491551ed4d9df7fc11ae5037b115c4fd4011fbb8b30ab485184c7cf01dbf" gracePeriod=30 Apr 16 16:25:14.145996 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:14.144680 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" podUID="0938d657-15ac-4b62-91d5-1e0f3aa54647" containerName="tokenizer" containerID="cri-o://d47605192a2ae0e4a2dcd66998f70eb37c1fa94c0d871ed91b4503698857c07b" gracePeriod=30 Apr 16 16:25:14.720233 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:25:14.720203 2584 logging.go:55] [core] [Channel #101 SubChannel #102]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.31:9003", ServerName: "10.133.0.31:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.31:9003: connect: connection refused" Apr 16 16:25:15.165051 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.165020 2584 generic.go:358] "Generic (PLEG): container finished" podID="0938d657-15ac-4b62-91d5-1e0f3aa54647" containerID="d47605192a2ae0e4a2dcd66998f70eb37c1fa94c0d871ed91b4503698857c07b" exitCode=0 Apr 16 16:25:15.165051 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.165049 2584 generic.go:358] "Generic (PLEG): container finished" podID="0938d657-15ac-4b62-91d5-1e0f3aa54647" containerID="9498491551ed4d9df7fc11ae5037b115c4fd4011fbb8b30ab485184c7cf01dbf" exitCode=0 Apr 16 16:25:15.165367 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.165088 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" event={"ID":"0938d657-15ac-4b62-91d5-1e0f3aa54647","Type":"ContainerDied","Data":"d47605192a2ae0e4a2dcd66998f70eb37c1fa94c0d871ed91b4503698857c07b"} Apr 16 16:25:15.165367 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.165123 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" event={"ID":"0938d657-15ac-4b62-91d5-1e0f3aa54647","Type":"ContainerDied","Data":"9498491551ed4d9df7fc11ae5037b115c4fd4011fbb8b30ab485184c7cf01dbf"} Apr 16 16:25:15.292783 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.292763 2584 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:25:15.395949 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.395881 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-kserve-provision-location\") pod \"0938d657-15ac-4b62-91d5-1e0f3aa54647\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " Apr 16 16:25:15.395949 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.395947 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-tokenizer-tmp\") pod \"0938d657-15ac-4b62-91d5-1e0f3aa54647\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " Apr 16 16:25:15.396154 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.395984 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqd76\" (UniqueName: \"kubernetes.io/projected/0938d657-15ac-4b62-91d5-1e0f3aa54647-kube-api-access-mqd76\") pod \"0938d657-15ac-4b62-91d5-1e0f3aa54647\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " Apr 16 16:25:15.396154 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.396023 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0938d657-15ac-4b62-91d5-1e0f3aa54647-tls-certs\") pod \"0938d657-15ac-4b62-91d5-1e0f3aa54647\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " Apr 16 16:25:15.396154 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.396045 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-tokenizer-cache\") pod \"0938d657-15ac-4b62-91d5-1e0f3aa54647\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " Apr 16 
16:25:15.396154 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.396086 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-tokenizer-uds\") pod \"0938d657-15ac-4b62-91d5-1e0f3aa54647\" (UID: \"0938d657-15ac-4b62-91d5-1e0f3aa54647\") " Apr 16 16:25:15.396390 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.396340 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "0938d657-15ac-4b62-91d5-1e0f3aa54647" (UID: "0938d657-15ac-4b62-91d5-1e0f3aa54647"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:25:15.396458 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.396386 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "0938d657-15ac-4b62-91d5-1e0f3aa54647" (UID: "0938d657-15ac-4b62-91d5-1e0f3aa54647"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:25:15.396458 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.396403 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "0938d657-15ac-4b62-91d5-1e0f3aa54647" (UID: "0938d657-15ac-4b62-91d5-1e0f3aa54647"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:25:15.396698 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.396663 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0938d657-15ac-4b62-91d5-1e0f3aa54647" (UID: "0938d657-15ac-4b62-91d5-1e0f3aa54647"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:25:15.398310 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.398290 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0938d657-15ac-4b62-91d5-1e0f3aa54647-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0938d657-15ac-4b62-91d5-1e0f3aa54647" (UID: "0938d657-15ac-4b62-91d5-1e0f3aa54647"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:25:15.398414 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.398391 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0938d657-15ac-4b62-91d5-1e0f3aa54647-kube-api-access-mqd76" (OuterVolumeSpecName: "kube-api-access-mqd76") pod "0938d657-15ac-4b62-91d5-1e0f3aa54647" (UID: "0938d657-15ac-4b62-91d5-1e0f3aa54647"). InnerVolumeSpecName "kube-api-access-mqd76". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:25:15.497460 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.497440 2584 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0938d657-15ac-4b62-91d5-1e0f3aa54647-tls-certs\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:25:15.497460 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.497463 2584 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-tokenizer-cache\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:25:15.497601 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.497472 2584 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-tokenizer-uds\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:25:15.497601 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.497482 2584 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-kserve-provision-location\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:25:15.497601 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.497492 2584 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0938d657-15ac-4b62-91d5-1e0f3aa54647-tokenizer-tmp\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:25:15.497601 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:15.497501 2584 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mqd76\" (UniqueName: \"kubernetes.io/projected/0938d657-15ac-4b62-91d5-1e0f3aa54647-kube-api-access-mqd76\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:25:15.720915 ip-10-0-131-24 kubenswrapper[2584]: 
I0416 16:25:15.720866 2584 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" podUID="0938d657-15ac-4b62-91d5-1e0f3aa54647" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.31:9003\" within 1s: context deadline exceeded" Apr 16 16:25:15.721049 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:25:15.720889 2584 logging.go:55] [core] [Channel #101 SubChannel #102]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.31:9003", ServerName: "10.133.0.31:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.31:9003: operation was canceled" Apr 16 16:25:16.170409 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:16.170377 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" event={"ID":"0938d657-15ac-4b62-91d5-1e0f3aa54647","Type":"ContainerDied","Data":"aebe46304faf2a0f8a681c2c001e5c3e50240a42a6358cf819d1667e578c644e"} Apr 16 16:25:16.170797 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:16.170426 2584 scope.go:117] "RemoveContainer" containerID="d47605192a2ae0e4a2dcd66998f70eb37c1fa94c0d871ed91b4503698857c07b" Apr 16 16:25:16.170797 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:16.170441 2584 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm" Apr 16 16:25:16.179206 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:16.179189 2584 scope.go:117] "RemoveContainer" containerID="9498491551ed4d9df7fc11ae5037b115c4fd4011fbb8b30ab485184c7cf01dbf" Apr 16 16:25:16.186607 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:16.186590 2584 scope.go:117] "RemoveContainer" containerID="f61cee7c762906349a4427718d9c0c2ad2854481b897a440e9e299dd957edd3e" Apr 16 16:25:16.192435 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:16.192400 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm"] Apr 16 16:25:16.196425 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:16.196401 2584 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-gxssm"] Apr 16 16:25:16.698218 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:16.698190 2584 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0938d657-15ac-4b62-91d5-1e0f3aa54647" path="/var/lib/kubelet/pods/0938d657-15ac-4b62-91d5-1e0f3aa54647/volumes" Apr 16 16:25:43.634767 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.634683 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6"] Apr 16 16:25:43.635287 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.635152 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0938d657-15ac-4b62-91d5-1e0f3aa54647" containerName="main" Apr 16 16:25:43.635287 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.635170 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="0938d657-15ac-4b62-91d5-1e0f3aa54647" containerName="main" Apr 16 16:25:43.635287 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.635196 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="0938d657-15ac-4b62-91d5-1e0f3aa54647" containerName="tokenizer" Apr 16 16:25:43.635287 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.635204 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="0938d657-15ac-4b62-91d5-1e0f3aa54647" containerName="tokenizer" Apr 16 16:25:43.635287 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.635222 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51e4e537-20b9-4094-8fc0-f5892380c80d" containerName="storage-initializer" Apr 16 16:25:43.635287 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.635232 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="51e4e537-20b9-4094-8fc0-f5892380c80d" containerName="storage-initializer" Apr 16 16:25:43.635287 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.635248 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="51e4e537-20b9-4094-8fc0-f5892380c80d" containerName="storage-initializer" Apr 16 16:25:43.635287 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.635258 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="51e4e537-20b9-4094-8fc0-f5892380c80d" containerName="storage-initializer" Apr 16 16:25:43.635287 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.635270 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0938d657-15ac-4b62-91d5-1e0f3aa54647" containerName="storage-initializer" Apr 16 16:25:43.635287 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.635277 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="0938d657-15ac-4b62-91d5-1e0f3aa54647" containerName="storage-initializer" Apr 16 16:25:43.635643 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.635359 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="0938d657-15ac-4b62-91d5-1e0f3aa54647" containerName="main" Apr 16 16:25:43.635643 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.635368 2584 memory_manager.go:356] "RemoveStaleState 
removing state" podUID="51e4e537-20b9-4094-8fc0-f5892380c80d" containerName="storage-initializer" Apr 16 16:25:43.635643 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.635377 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="0938d657-15ac-4b62-91d5-1e0f3aa54647" containerName="tokenizer" Apr 16 16:25:43.635643 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.635470 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="51e4e537-20b9-4094-8fc0-f5892380c80d" containerName="storage-initializer" Apr 16 16:25:43.637309 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.637293 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:43.639703 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.639676 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-txknl\"" Apr 16 16:25:43.639811 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.639728 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-2fdlg\"" Apr 16 16:25:43.639892 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.639879 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 16 16:25:43.648403 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.648382 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6"] Apr 16 16:25:43.706302 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.706272 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-tokenizer-cache\") pod 
\"stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:43.706412 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.706311 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:43.706412 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.706349 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrrvb\" (UniqueName: \"kubernetes.io/projected/817a4b9f-3120-4e4b-9863-e3c57358e135-kube-api-access-xrrvb\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:43.706412 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.706407 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/817a4b9f-3120-4e4b-9863-e3c57358e135-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:43.706576 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.706437 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-tokenizer-tmp\") pod 
\"stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:43.706576 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.706484 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:43.807029 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.807001 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/817a4b9f-3120-4e4b-9863-e3c57358e135-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:43.807137 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.807039 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:43.807137 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.807080 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-kserve-provision-location\") pod 
\"stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:43.807260 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.807153 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:43.807260 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.807200 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:43.807260 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.807241 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrrvb\" (UniqueName: \"kubernetes.io/projected/817a4b9f-3120-4e4b-9863-e3c57358e135-kube-api-access-xrrvb\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:43.807435 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.807416 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6\" (UID: 
\"817a4b9f-3120-4e4b-9863-e3c57358e135\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:43.807524 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.807500 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:43.807575 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.807540 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:43.807610 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.807578 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:43.809870 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.809849 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/817a4b9f-3120-4e4b-9863-e3c57358e135-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:43.815310 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.815290 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrrvb\" (UniqueName: \"kubernetes.io/projected/817a4b9f-3120-4e4b-9863-e3c57358e135-kube-api-access-xrrvb\") pod \"stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:43.946604 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:43.946576 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:44.075776 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:44.075535 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6"] Apr 16 16:25:44.078442 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:25:44.078412 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod817a4b9f_3120_4e4b_9863_e3c57358e135.slice/crio-0e802df240358c95d960735c7fad995f2eda940cfbb1b8638259d19cd9b4b172 WatchSource:0}: Error finding container 0e802df240358c95d960735c7fad995f2eda940cfbb1b8638259d19cd9b4b172: Status 404 returned error can't find the container with id 0e802df240358c95d960735c7fad995f2eda940cfbb1b8638259d19cd9b4b172 Apr 16 16:25:44.080930 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:44.080802 2584 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:25:44.266245 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:44.266122 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" 
event={"ID":"817a4b9f-3120-4e4b-9863-e3c57358e135","Type":"ContainerStarted","Data":"c15a5b771d8644287f90a88bc7410682ac6fe6e932542ac3cdd9a02823b10826"} Apr 16 16:25:44.266245 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:44.266162 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" event={"ID":"817a4b9f-3120-4e4b-9863-e3c57358e135","Type":"ContainerStarted","Data":"0e802df240358c95d960735c7fad995f2eda940cfbb1b8638259d19cd9b4b172"} Apr 16 16:25:45.271481 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:45.271393 2584 generic.go:358] "Generic (PLEG): container finished" podID="817a4b9f-3120-4e4b-9863-e3c57358e135" containerID="c15a5b771d8644287f90a88bc7410682ac6fe6e932542ac3cdd9a02823b10826" exitCode=0 Apr 16 16:25:45.271481 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:45.271439 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" event={"ID":"817a4b9f-3120-4e4b-9863-e3c57358e135","Type":"ContainerDied","Data":"c15a5b771d8644287f90a88bc7410682ac6fe6e932542ac3cdd9a02823b10826"} Apr 16 16:25:46.276573 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:46.276539 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" event={"ID":"817a4b9f-3120-4e4b-9863-e3c57358e135","Type":"ContainerStarted","Data":"b6a3fff16ef9d8f00dc246afbd4b56a148584c39684ccd2375350a1d7f2a112a"} Apr 16 16:25:46.277002 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:46.276579 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" event={"ID":"817a4b9f-3120-4e4b-9863-e3c57358e135","Type":"ContainerStarted","Data":"b675e9f50de03ddc7a926c1921f57403e5a748c286cae57a40972177e89986a3"} Apr 16 16:25:46.277002 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:46.276743 2584 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:46.297543 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:46.297494 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" podStartSLOduration=3.297477165 podStartE2EDuration="3.297477165s" podCreationTimestamp="2026-04-16 16:25:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:25:46.297012218 +0000 UTC m=+1474.257203944" watchObservedRunningTime="2026-04-16 16:25:46.297477165 +0000 UTC m=+1474.257668891" Apr 16 16:25:53.947041 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:53.946998 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:53.947041 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:53.947052 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:53.949696 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:53.949669 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:25:54.303902 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:25:54.303825 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:26:12.669424 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:26:12.669393 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/ovn-acl-logging/0.log" Apr 
16 16:26:12.671939 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:26:12.671918 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/ovn-acl-logging/0.log" Apr 16 16:26:15.308115 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:26:15.308079 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:27:24.920095 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:24.920048 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6"] Apr 16 16:27:24.920614 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:24.920396 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" podUID="817a4b9f-3120-4e4b-9863-e3c57358e135" containerName="main" containerID="cri-o://b675e9f50de03ddc7a926c1921f57403e5a748c286cae57a40972177e89986a3" gracePeriod=30 Apr 16 16:27:24.920614 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:24.920449 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" podUID="817a4b9f-3120-4e4b-9863-e3c57358e135" containerName="tokenizer" containerID="cri-o://b6a3fff16ef9d8f00dc246afbd4b56a148584c39684ccd2375350a1d7f2a112a" gracePeriod=30 Apr 16 16:27:25.306835 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:27:25.306746 2584 logging.go:55] [core] [Channel #154 SubChannel #155]grpc: addrConn.createTransport failed to connect to {Addr: "10.133.0.32:9003", ServerName: "10.133.0.32:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.133.0.32:9003: connect: connection refused" Apr 16 16:27:25.609298 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:25.609205 2584 generic.go:358] "Generic (PLEG): container finished" podID="817a4b9f-3120-4e4b-9863-e3c57358e135" containerID="b675e9f50de03ddc7a926c1921f57403e5a748c286cae57a40972177e89986a3" exitCode=0 Apr 16 16:27:25.609298 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:25.609282 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" event={"ID":"817a4b9f-3120-4e4b-9863-e3c57358e135","Type":"ContainerDied","Data":"b675e9f50de03ddc7a926c1921f57403e5a748c286cae57a40972177e89986a3"} Apr 16 16:27:26.068552 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.068529 2584 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:27:26.187015 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.186987 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/817a4b9f-3120-4e4b-9863-e3c57358e135-tls-certs\") pod \"817a4b9f-3120-4e4b-9863-e3c57358e135\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " Apr 16 16:27:26.187140 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.187053 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-tokenizer-cache\") pod \"817a4b9f-3120-4e4b-9863-e3c57358e135\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " Apr 16 16:27:26.187140 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.187131 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-tokenizer-uds\") pod \"817a4b9f-3120-4e4b-9863-e3c57358e135\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " Apr 16 16:27:26.187237 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.187187 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-tokenizer-tmp\") pod \"817a4b9f-3120-4e4b-9863-e3c57358e135\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " Apr 16 16:27:26.187237 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.187229 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrrvb\" (UniqueName: \"kubernetes.io/projected/817a4b9f-3120-4e4b-9863-e3c57358e135-kube-api-access-xrrvb\") pod \"817a4b9f-3120-4e4b-9863-e3c57358e135\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " Apr 16 16:27:26.187318 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.187255 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-kserve-provision-location\") pod \"817a4b9f-3120-4e4b-9863-e3c57358e135\" (UID: \"817a4b9f-3120-4e4b-9863-e3c57358e135\") " Apr 16 16:27:26.187369 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.187317 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "817a4b9f-3120-4e4b-9863-e3c57358e135" (UID: "817a4b9f-3120-4e4b-9863-e3c57358e135"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:27:26.187422 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.187398 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "817a4b9f-3120-4e4b-9863-e3c57358e135" (UID: "817a4b9f-3120-4e4b-9863-e3c57358e135"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:27:26.187501 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.187476 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "817a4b9f-3120-4e4b-9863-e3c57358e135" (UID: "817a4b9f-3120-4e4b-9863-e3c57358e135"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:27:26.187628 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.187521 2584 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-tokenizer-cache\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:27:26.187628 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.187533 2584 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-tokenizer-uds\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:27:26.187944 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.187920 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "817a4b9f-3120-4e4b-9863-e3c57358e135" (UID: "817a4b9f-3120-4e4b-9863-e3c57358e135"). 
InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:27:26.189178 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.189163 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/817a4b9f-3120-4e4b-9863-e3c57358e135-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "817a4b9f-3120-4e4b-9863-e3c57358e135" (UID: "817a4b9f-3120-4e4b-9863-e3c57358e135"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:27:26.189467 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.189452 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/817a4b9f-3120-4e4b-9863-e3c57358e135-kube-api-access-xrrvb" (OuterVolumeSpecName: "kube-api-access-xrrvb") pod "817a4b9f-3120-4e4b-9863-e3c57358e135" (UID: "817a4b9f-3120-4e4b-9863-e3c57358e135"). InnerVolumeSpecName "kube-api-access-xrrvb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:27:26.288549 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.288522 2584 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xrrvb\" (UniqueName: \"kubernetes.io/projected/817a4b9f-3120-4e4b-9863-e3c57358e135-kube-api-access-xrrvb\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:27:26.288549 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.288546 2584 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-kserve-provision-location\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:27:26.288676 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.288556 2584 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/817a4b9f-3120-4e4b-9863-e3c57358e135-tls-certs\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 
16:27:26.288676 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.288568 2584 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/817a4b9f-3120-4e4b-9863-e3c57358e135-tokenizer-tmp\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:27:26.306495 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.306455 2584 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" podUID="817a4b9f-3120-4e4b-9863-e3c57358e135" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.133.0.32:9003\" within 1s: context deadline exceeded" Apr 16 16:27:26.615725 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.615649 2584 generic.go:358] "Generic (PLEG): container finished" podID="817a4b9f-3120-4e4b-9863-e3c57358e135" containerID="b6a3fff16ef9d8f00dc246afbd4b56a148584c39684ccd2375350a1d7f2a112a" exitCode=0 Apr 16 16:27:26.615860 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.615759 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" event={"ID":"817a4b9f-3120-4e4b-9863-e3c57358e135","Type":"ContainerDied","Data":"b6a3fff16ef9d8f00dc246afbd4b56a148584c39684ccd2375350a1d7f2a112a"} Apr 16 16:27:26.615860 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.615778 2584 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" Apr 16 16:27:26.615860 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.615798 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6" event={"ID":"817a4b9f-3120-4e4b-9863-e3c57358e135","Type":"ContainerDied","Data":"0e802df240358c95d960735c7fad995f2eda940cfbb1b8638259d19cd9b4b172"} Apr 16 16:27:26.615860 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.615820 2584 scope.go:117] "RemoveContainer" containerID="b6a3fff16ef9d8f00dc246afbd4b56a148584c39684ccd2375350a1d7f2a112a" Apr 16 16:27:26.624756 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.624737 2584 scope.go:117] "RemoveContainer" containerID="b675e9f50de03ddc7a926c1921f57403e5a748c286cae57a40972177e89986a3" Apr 16 16:27:26.637606 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.637585 2584 scope.go:117] "RemoveContainer" containerID="c15a5b771d8644287f90a88bc7410682ac6fe6e932542ac3cdd9a02823b10826" Apr 16 16:27:26.645153 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.645138 2584 scope.go:117] "RemoveContainer" containerID="b6a3fff16ef9d8f00dc246afbd4b56a148584c39684ccd2375350a1d7f2a112a" Apr 16 16:27:26.645443 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:27:26.645415 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6a3fff16ef9d8f00dc246afbd4b56a148584c39684ccd2375350a1d7f2a112a\": container with ID starting with b6a3fff16ef9d8f00dc246afbd4b56a148584c39684ccd2375350a1d7f2a112a not found: ID does not exist" containerID="b6a3fff16ef9d8f00dc246afbd4b56a148584c39684ccd2375350a1d7f2a112a" Apr 16 16:27:26.645526 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.645453 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b6a3fff16ef9d8f00dc246afbd4b56a148584c39684ccd2375350a1d7f2a112a"} 
err="failed to get container status \"b6a3fff16ef9d8f00dc246afbd4b56a148584c39684ccd2375350a1d7f2a112a\": rpc error: code = NotFound desc = could not find container \"b6a3fff16ef9d8f00dc246afbd4b56a148584c39684ccd2375350a1d7f2a112a\": container with ID starting with b6a3fff16ef9d8f00dc246afbd4b56a148584c39684ccd2375350a1d7f2a112a not found: ID does not exist" Apr 16 16:27:26.645526 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.645478 2584 scope.go:117] "RemoveContainer" containerID="b675e9f50de03ddc7a926c1921f57403e5a748c286cae57a40972177e89986a3" Apr 16 16:27:26.645831 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:27:26.645805 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b675e9f50de03ddc7a926c1921f57403e5a748c286cae57a40972177e89986a3\": container with ID starting with b675e9f50de03ddc7a926c1921f57403e5a748c286cae57a40972177e89986a3 not found: ID does not exist" containerID="b675e9f50de03ddc7a926c1921f57403e5a748c286cae57a40972177e89986a3" Apr 16 16:27:26.645926 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.645841 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b675e9f50de03ddc7a926c1921f57403e5a748c286cae57a40972177e89986a3"} err="failed to get container status \"b675e9f50de03ddc7a926c1921f57403e5a748c286cae57a40972177e89986a3\": rpc error: code = NotFound desc = could not find container \"b675e9f50de03ddc7a926c1921f57403e5a748c286cae57a40972177e89986a3\": container with ID starting with b675e9f50de03ddc7a926c1921f57403e5a748c286cae57a40972177e89986a3 not found: ID does not exist" Apr 16 16:27:26.645926 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.645863 2584 scope.go:117] "RemoveContainer" containerID="c15a5b771d8644287f90a88bc7410682ac6fe6e932542ac3cdd9a02823b10826" Apr 16 16:27:26.646261 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:27:26.646235 2584 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"c15a5b771d8644287f90a88bc7410682ac6fe6e932542ac3cdd9a02823b10826\": container with ID starting with c15a5b771d8644287f90a88bc7410682ac6fe6e932542ac3cdd9a02823b10826 not found: ID does not exist" containerID="c15a5b771d8644287f90a88bc7410682ac6fe6e932542ac3cdd9a02823b10826" Apr 16 16:27:26.646369 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.646345 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c15a5b771d8644287f90a88bc7410682ac6fe6e932542ac3cdd9a02823b10826"} err="failed to get container status \"c15a5b771d8644287f90a88bc7410682ac6fe6e932542ac3cdd9a02823b10826\": rpc error: code = NotFound desc = could not find container \"c15a5b771d8644287f90a88bc7410682ac6fe6e932542ac3cdd9a02823b10826\": container with ID starting with c15a5b771d8644287f90a88bc7410682ac6fe6e932542ac3cdd9a02823b10826 not found: ID does not exist" Apr 16 16:27:26.647878 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.647856 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6"] Apr 16 16:27:26.651755 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.651732 2584 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-76dc856bd7-m2cz6"] Apr 16 16:27:26.697714 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:26.697690 2584 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="817a4b9f-3120-4e4b-9863-e3c57358e135" path="/var/lib/kubelet/pods/817a4b9f-3120-4e4b-9863-e3c57358e135/volumes" Apr 16 16:27:27.201849 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.201818 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-65b6c49669-qdpbh"] Apr 16 16:27:27.202237 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.202162 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing 
container" podUID="817a4b9f-3120-4e4b-9863-e3c57358e135" containerName="storage-initializer" Apr 16 16:27:27.202237 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.202176 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="817a4b9f-3120-4e4b-9863-e3c57358e135" containerName="storage-initializer" Apr 16 16:27:27.202237 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.202191 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="817a4b9f-3120-4e4b-9863-e3c57358e135" containerName="tokenizer" Apr 16 16:27:27.202237 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.202199 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="817a4b9f-3120-4e4b-9863-e3c57358e135" containerName="tokenizer" Apr 16 16:27:27.202237 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.202208 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="817a4b9f-3120-4e4b-9863-e3c57358e135" containerName="main" Apr 16 16:27:27.202237 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.202217 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="817a4b9f-3120-4e4b-9863-e3c57358e135" containerName="main" Apr 16 16:27:27.202475 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.202283 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="817a4b9f-3120-4e4b-9863-e3c57358e135" containerName="main" Apr 16 16:27:27.202475 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.202296 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="817a4b9f-3120-4e4b-9863-e3c57358e135" containerName="tokenizer" Apr 16 16:27:27.207263 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.207245 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-65b6c49669-qdpbh" Apr 16 16:27:27.209873 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.209847 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 16:27:27.210028 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.210009 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-kj68m\"" Apr 16 16:27:27.210107 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.210088 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 16:27:27.210176 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.210159 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 16:27:27.220976 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.220935 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-65b6c49669-qdpbh"] Apr 16 16:27:27.295455 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.295425 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jcbx\" (UniqueName: \"kubernetes.io/projected/bd0b8538-fd89-44e9-b50f-3f8e1b84e2b5-kube-api-access-9jcbx\") pod \"llmisvc-controller-manager-65b6c49669-qdpbh\" (UID: \"bd0b8538-fd89-44e9-b50f-3f8e1b84e2b5\") " pod="kserve/llmisvc-controller-manager-65b6c49669-qdpbh" Apr 16 16:27:27.295590 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.295478 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd0b8538-fd89-44e9-b50f-3f8e1b84e2b5-cert\") pod \"llmisvc-controller-manager-65b6c49669-qdpbh\" (UID: \"bd0b8538-fd89-44e9-b50f-3f8e1b84e2b5\") " pod="kserve/llmisvc-controller-manager-65b6c49669-qdpbh" Apr 16 
16:27:27.396363 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.396337 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jcbx\" (UniqueName: \"kubernetes.io/projected/bd0b8538-fd89-44e9-b50f-3f8e1b84e2b5-kube-api-access-9jcbx\") pod \"llmisvc-controller-manager-65b6c49669-qdpbh\" (UID: \"bd0b8538-fd89-44e9-b50f-3f8e1b84e2b5\") " pod="kserve/llmisvc-controller-manager-65b6c49669-qdpbh"
Apr 16 16:27:27.396487 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.396383 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd0b8538-fd89-44e9-b50f-3f8e1b84e2b5-cert\") pod \"llmisvc-controller-manager-65b6c49669-qdpbh\" (UID: \"bd0b8538-fd89-44e9-b50f-3f8e1b84e2b5\") " pod="kserve/llmisvc-controller-manager-65b6c49669-qdpbh"
Apr 16 16:27:27.398795 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.398773 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bd0b8538-fd89-44e9-b50f-3f8e1b84e2b5-cert\") pod \"llmisvc-controller-manager-65b6c49669-qdpbh\" (UID: \"bd0b8538-fd89-44e9-b50f-3f8e1b84e2b5\") " pod="kserve/llmisvc-controller-manager-65b6c49669-qdpbh"
Apr 16 16:27:27.405111 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.405093 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jcbx\" (UniqueName: \"kubernetes.io/projected/bd0b8538-fd89-44e9-b50f-3f8e1b84e2b5-kube-api-access-9jcbx\") pod \"llmisvc-controller-manager-65b6c49669-qdpbh\" (UID: \"bd0b8538-fd89-44e9-b50f-3f8e1b84e2b5\") " pod="kserve/llmisvc-controller-manager-65b6c49669-qdpbh"
Apr 16 16:27:27.524587 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.524532 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-65b6c49669-qdpbh"
Apr 16 16:27:27.645134 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:27.645113 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-65b6c49669-qdpbh"]
Apr 16 16:27:27.647149 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:27:27.647119 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbd0b8538_fd89_44e9_b50f_3f8e1b84e2b5.slice/crio-b2e5710c309b4af3b1afc17a294f474f19f24805f3084b0d4cf639827326ec88 WatchSource:0}: Error finding container b2e5710c309b4af3b1afc17a294f474f19f24805f3084b0d4cf639827326ec88: Status 404 returned error can't find the container with id b2e5710c309b4af3b1afc17a294f474f19f24805f3084b0d4cf639827326ec88
Apr 16 16:27:28.625722 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:28.625688 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-65b6c49669-qdpbh" event={"ID":"bd0b8538-fd89-44e9-b50f-3f8e1b84e2b5","Type":"ContainerStarted","Data":"b2e5710c309b4af3b1afc17a294f474f19f24805f3084b0d4cf639827326ec88"}
Apr 16 16:27:31.638476 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:31.638423 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-65b6c49669-qdpbh" event={"ID":"bd0b8538-fd89-44e9-b50f-3f8e1b84e2b5","Type":"ContainerStarted","Data":"6b930991be18af2fc8ac60772d1ea1b28c6626494ba79c7e30450eb033015652"}
Apr 16 16:27:31.638922 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:31.638491 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-65b6c49669-qdpbh"
Apr 16 16:27:31.654577 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:27:31.654522 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-65b6c49669-qdpbh" podStartSLOduration=1.290662009 podStartE2EDuration="4.65450891s" podCreationTimestamp="2026-04-16 16:27:27 +0000 UTC" firstStartedPulling="2026-04-16 16:27:27.648500768 +0000 UTC m=+1575.608692486" lastFinishedPulling="2026-04-16 16:27:31.012347685 +0000 UTC m=+1578.972539387" observedRunningTime="2026-04-16 16:27:31.654051302 +0000 UTC m=+1579.614243029" watchObservedRunningTime="2026-04-16 16:27:31.65450891 +0000 UTC m=+1579.614700633"
Apr 16 16:28:02.643782 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:02.643699 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-65b6c49669-qdpbh"
Apr 16 16:28:43.323834 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.323797 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"]
Apr 16 16:28:43.327805 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.327783 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.330402 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.330380 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-2-openshift-default-dockercfg-7sppf\""
Apr 16 16:28:43.340084 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.340063 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"]
Apr 16 16:28:43.396314 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.396276 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.396487 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.396323 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.396487 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.396375 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.396487 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.396426 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.396487 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.396456 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65c5q\" (UniqueName: \"kubernetes.io/projected/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-kube-api-access-65c5q\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.396639 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.396488 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.396639 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.396523 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.396639 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.396551 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.396639 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.396618 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.497262 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.497231 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.497451 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.497278 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.497451 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.497402 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.497577 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.497461 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.497577 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.497504 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.497577 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.497548 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65c5q\" (UniqueName: \"kubernetes.io/projected/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-kube-api-access-65c5q\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.497727 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.497588 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.497783 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.497766 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-workload-socket\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.497840 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.497815 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-credential-socket\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.497840 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.497820 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.498027 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.497870 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.498027 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.497906 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-workload-certs\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.498183 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.498161 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-istio-data\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.498520 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.498504 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-istiod-ca-cert\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.499615 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.499594 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-istio-envoy\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.500151 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.500129 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-istio-podinfo\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.505934 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.505911 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-istio-token\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.506061 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.505976 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65c5q\" (UniqueName: \"kubernetes.io/projected/c68f54b4-fbf0-4f39-aa9d-980a4ba029f3-kube-api-access-65c5q\") pod \"router-gateway-2-openshift-default-6866b85949-hwts9\" (UID: \"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3\") " pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.641743 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.641655 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:43.770052 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.770025 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"]
Apr 16 16:28:43.772209 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:28:43.772182 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc68f54b4_fbf0_4f39_aa9d_980a4ba029f3.slice/crio-fb290a8508c136314fca69c5591a1ac91afd675e2c02d9265e598ef36cd6656e WatchSource:0}: Error finding container fb290a8508c136314fca69c5591a1ac91afd675e2c02d9265e598ef36cd6656e: Status 404 returned error can't find the container with id fb290a8508c136314fca69c5591a1ac91afd675e2c02d9265e598ef36cd6656e
Apr 16 16:28:43.774345 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.774312 2584 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"}
Apr 16 16:28:43.774435 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.774384 2584 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"}
Apr 16 16:28:43.774435 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.774416 2584 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892160Ki","pods":"250"}
Apr 16 16:28:43.871850 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.871803 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9" event={"ID":"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3","Type":"ContainerStarted","Data":"c4d24f0fc6bd94b0f4ee5d8f33b40c9e8e1ae3bf42286998c35d6a88c3d8dc93"}
Apr 16 16:28:43.871850 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.871844 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9" event={"ID":"c68f54b4-fbf0-4f39-aa9d-980a4ba029f3","Type":"ContainerStarted","Data":"fb290a8508c136314fca69c5591a1ac91afd675e2c02d9265e598ef36cd6656e"}
Apr 16 16:28:43.892074 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:43.891968 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9" podStartSLOduration=0.891935316 podStartE2EDuration="891.935316ms" podCreationTimestamp="2026-04-16 16:28:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:28:43.889883961 +0000 UTC m=+1651.850075681" watchObservedRunningTime="2026-04-16 16:28:43.891935316 +0000 UTC m=+1651.852127040"
Apr 16 16:28:44.642752 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:44.642716 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:44.647575 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:44.647555 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:44.875449 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:44.875414 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:28:44.876588 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:28:44.876572 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-2-openshift-default-6866b85949-hwts9"
Apr 16 16:29:01.030853 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.030816 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"]
Apr 16 16:29:01.034562 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.034543 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:01.038078 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.038053 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-5j7wz\""
Apr 16 16:29:01.038188 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.038053 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\""
Apr 16 16:29:01.038188 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.038059 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-2fdlg\""
Apr 16 16:29:01.047236 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.047209 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"]
Apr 16 16:29:01.067173 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.067136 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:01.067364 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.067187 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:01.067364 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.067253 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b886e25f-5306-437f-979e-e7491ad0f09c-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:01.067477 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.067379 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrbj7\" (UniqueName: \"kubernetes.io/projected/b886e25f-5306-437f-979e-e7491ad0f09c-kube-api-access-lrbj7\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:01.067477 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.067415 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:01.067477 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.067471 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:01.168112 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.168079 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:01.168112 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.168115 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:01.168309 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.168142 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b886e25f-5306-437f-979e-e7491ad0f09c-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:01.168309 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.168182 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrbj7\" (UniqueName: \"kubernetes.io/projected/b886e25f-5306-437f-979e-e7491ad0f09c-kube-api-access-lrbj7\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:01.168309 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.168212 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:01.168309 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.168260 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:01.168610 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.168571 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:01.168610 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.168602 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:01.168752 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.168636 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:01.168752 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.168720 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:01.170771 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.170753 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b886e25f-5306-437f-979e-e7491ad0f09c-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:01.176702 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.176677 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrbj7\" (UniqueName: \"kubernetes.io/projected/b886e25f-5306-437f-979e-e7491ad0f09c-kube-api-access-lrbj7\") pod \"router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:01.344735 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.344673 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:01.468079 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.468056 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"]
Apr 16 16:29:01.469883 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:29:01.469850 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb886e25f_5306_437f_979e_e7491ad0f09c.slice/crio-c230b564a8c011defd2a0a81dd85b093f127292ed7aa290baca399d759c24a61 WatchSource:0}: Error finding container c230b564a8c011defd2a0a81dd85b093f127292ed7aa290baca399d759c24a61: Status 404 returned error can't find the container with id c230b564a8c011defd2a0a81dd85b093f127292ed7aa290baca399d759c24a61
Apr 16 16:29:01.934475 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.934432 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx" event={"ID":"b886e25f-5306-437f-979e-e7491ad0f09c","Type":"ContainerStarted","Data":"e603f96a26cf136b323c4ce7bc4054ca7776e1fb4846ba1f98407a262a06a61a"}
Apr 16 16:29:01.934653 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:01.934482 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx" event={"ID":"b886e25f-5306-437f-979e-e7491ad0f09c","Type":"ContainerStarted","Data":"c230b564a8c011defd2a0a81dd85b093f127292ed7aa290baca399d759c24a61"}
Apr 16 16:29:02.938063 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:02.938009 2584 generic.go:358] "Generic (PLEG): container finished" podID="b886e25f-5306-437f-979e-e7491ad0f09c" containerID="e603f96a26cf136b323c4ce7bc4054ca7776e1fb4846ba1f98407a262a06a61a" exitCode=0
Apr 16 16:29:02.938063 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:02.938057 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx" event={"ID":"b886e25f-5306-437f-979e-e7491ad0f09c","Type":"ContainerDied","Data":"e603f96a26cf136b323c4ce7bc4054ca7776e1fb4846ba1f98407a262a06a61a"}
Apr 16 16:29:03.943888 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:03.943854 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx" event={"ID":"b886e25f-5306-437f-979e-e7491ad0f09c","Type":"ContainerStarted","Data":"80e1bcd042a070c742b24fc2d9f91cf5a0838f993ab0e199862c802aec1db7cf"}
Apr 16 16:29:03.943888 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:03.943892 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx" event={"ID":"b886e25f-5306-437f-979e-e7491ad0f09c","Type":"ContainerStarted","Data":"1787c36e83b94b9a15d39fc171c0598613715561d816a56557a030524ebea7ea"}
Apr 16 16:29:03.944339 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:03.943976 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:03.964598 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:03.964559 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx" podStartSLOduration=2.964547622 podStartE2EDuration="2.964547622s" podCreationTimestamp="2026-04-16 16:29:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:29:03.963119964 +0000 UTC m=+1671.923311689" watchObservedRunningTime="2026-04-16 16:29:03.964547622 +0000 UTC m=+1671.924739387"
Apr 16 16:29:11.345066 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:11.345026 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:11.345066 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:11.345071 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:11.347658 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:11.347633 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:11.980172 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:11.980146 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:29:32.983495 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:29:32.983424 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:31:12.695455 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:12.695398 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/ovn-acl-logging/0.log"
Apr 16 16:31:12.698918 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:12.698894 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/ovn-acl-logging/0.log"
Apr 16 16:31:33.143528 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:33.143494 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"]
Apr 16 16:31:33.143997 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:33.143801 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx" podUID="b886e25f-5306-437f-979e-e7491ad0f09c" containerName="main" containerID="cri-o://1787c36e83b94b9a15d39fc171c0598613715561d816a56557a030524ebea7ea" gracePeriod=30
Apr 16 16:31:33.143997 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:33.143846 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx" podUID="b886e25f-5306-437f-979e-e7491ad0f09c" containerName="tokenizer" containerID="cri-o://80e1bcd042a070c742b24fc2d9f91cf5a0838f993ab0e199862c802aec1db7cf" gracePeriod=30
Apr 16 16:31:33.445690 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:33.445658 2584 generic.go:358] "Generic (PLEG): container finished" podID="b886e25f-5306-437f-979e-e7491ad0f09c" containerID="1787c36e83b94b9a15d39fc171c0598613715561d816a56557a030524ebea7ea" exitCode=0
Apr 16 16:31:33.445844 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:33.445729 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx" event={"ID":"b886e25f-5306-437f-979e-e7491ad0f09c","Type":"ContainerDied","Data":"1787c36e83b94b9a15d39fc171c0598613715561d816a56557a030524ebea7ea"}
Apr 16 16:31:34.286298 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.286276 2584 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"
Apr 16 16:31:34.356831 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.356775 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-tokenizer-tmp\") pod \"b886e25f-5306-437f-979e-e7491ad0f09c\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") "
Apr 16 16:31:34.356831 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.356826 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-tokenizer-cache\") pod \"b886e25f-5306-437f-979e-e7491ad0f09c\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") "
Apr 16 16:31:34.357038 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.356847 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrbj7\" (UniqueName: \"kubernetes.io/projected/b886e25f-5306-437f-979e-e7491ad0f09c-kube-api-access-lrbj7\") pod \"b886e25f-5306-437f-979e-e7491ad0f09c\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") "
Apr 16 16:31:34.357038 ip-10-0-131-24 kubenswrapper[2584]: I0416
16:31:34.356876 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-tokenizer-uds\") pod \"b886e25f-5306-437f-979e-e7491ad0f09c\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") " Apr 16 16:31:34.357038 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.356891 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b886e25f-5306-437f-979e-e7491ad0f09c-tls-certs\") pod \"b886e25f-5306-437f-979e-e7491ad0f09c\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") " Apr 16 16:31:34.357038 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.356920 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-kserve-provision-location\") pod \"b886e25f-5306-437f-979e-e7491ad0f09c\" (UID: \"b886e25f-5306-437f-979e-e7491ad0f09c\") " Apr 16 16:31:34.357254 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.357092 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "b886e25f-5306-437f-979e-e7491ad0f09c" (UID: "b886e25f-5306-437f-979e-e7491ad0f09c"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:31:34.357254 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.357178 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "b886e25f-5306-437f-979e-e7491ad0f09c" (UID: "b886e25f-5306-437f-979e-e7491ad0f09c"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:31:34.357331 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.357203 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "b886e25f-5306-437f-979e-e7491ad0f09c" (UID: "b886e25f-5306-437f-979e-e7491ad0f09c"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:31:34.357621 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.357601 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b886e25f-5306-437f-979e-e7491ad0f09c" (UID: "b886e25f-5306-437f-979e-e7491ad0f09c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:31:34.359054 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.359029 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b886e25f-5306-437f-979e-e7491ad0f09c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b886e25f-5306-437f-979e-e7491ad0f09c" (UID: "b886e25f-5306-437f-979e-e7491ad0f09c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:31:34.359143 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.359121 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b886e25f-5306-437f-979e-e7491ad0f09c-kube-api-access-lrbj7" (OuterVolumeSpecName: "kube-api-access-lrbj7") pod "b886e25f-5306-437f-979e-e7491ad0f09c" (UID: "b886e25f-5306-437f-979e-e7491ad0f09c"). InnerVolumeSpecName "kube-api-access-lrbj7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:31:34.449879 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.449857 2584 generic.go:358] "Generic (PLEG): container finished" podID="b886e25f-5306-437f-979e-e7491ad0f09c" containerID="80e1bcd042a070c742b24fc2d9f91cf5a0838f993ab0e199862c802aec1db7cf" exitCode=0 Apr 16 16:31:34.450013 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.449926 2584 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx" Apr 16 16:31:34.450013 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.449941 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx" event={"ID":"b886e25f-5306-437f-979e-e7491ad0f09c","Type":"ContainerDied","Data":"80e1bcd042a070c742b24fc2d9f91cf5a0838f993ab0e199862c802aec1db7cf"} Apr 16 16:31:34.450013 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.450005 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx" event={"ID":"b886e25f-5306-437f-979e-e7491ad0f09c","Type":"ContainerDied","Data":"c230b564a8c011defd2a0a81dd85b093f127292ed7aa290baca399d759c24a61"} Apr 16 16:31:34.450114 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.450021 2584 scope.go:117] "RemoveContainer" containerID="80e1bcd042a070c742b24fc2d9f91cf5a0838f993ab0e199862c802aec1db7cf" Apr 16 16:31:34.457803 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.457780 2584 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-tokenizer-uds\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:31:34.457893 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.457811 2584 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" 
(UniqueName: \"kubernetes.io/secret/b886e25f-5306-437f-979e-e7491ad0f09c-tls-certs\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:31:34.457893 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.457826 2584 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-kserve-provision-location\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:31:34.457893 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.457841 2584 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-tokenizer-tmp\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:31:34.457893 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.457855 2584 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b886e25f-5306-437f-979e-e7491ad0f09c-tokenizer-cache\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:31:34.457893 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.457868 2584 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lrbj7\" (UniqueName: \"kubernetes.io/projected/b886e25f-5306-437f-979e-e7491ad0f09c-kube-api-access-lrbj7\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:31:34.458853 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.458835 2584 scope.go:117] "RemoveContainer" containerID="1787c36e83b94b9a15d39fc171c0598613715561d816a56557a030524ebea7ea" Apr 16 16:31:34.465796 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.465780 2584 scope.go:117] "RemoveContainer" containerID="e603f96a26cf136b323c4ce7bc4054ca7776e1fb4846ba1f98407a262a06a61a" Apr 16 16:31:34.471769 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.471748 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"] Apr 16 16:31:34.473044 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.473029 2584 scope.go:117] "RemoveContainer" containerID="80e1bcd042a070c742b24fc2d9f91cf5a0838f993ab0e199862c802aec1db7cf" Apr 16 16:31:34.473387 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:31:34.473327 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80e1bcd042a070c742b24fc2d9f91cf5a0838f993ab0e199862c802aec1db7cf\": container with ID starting with 80e1bcd042a070c742b24fc2d9f91cf5a0838f993ab0e199862c802aec1db7cf not found: ID does not exist" containerID="80e1bcd042a070c742b24fc2d9f91cf5a0838f993ab0e199862c802aec1db7cf" Apr 16 16:31:34.473474 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.473399 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80e1bcd042a070c742b24fc2d9f91cf5a0838f993ab0e199862c802aec1db7cf"} err="failed to get container status \"80e1bcd042a070c742b24fc2d9f91cf5a0838f993ab0e199862c802aec1db7cf\": rpc error: code = NotFound desc = could not find container \"80e1bcd042a070c742b24fc2d9f91cf5a0838f993ab0e199862c802aec1db7cf\": container with ID starting with 80e1bcd042a070c742b24fc2d9f91cf5a0838f993ab0e199862c802aec1db7cf not found: ID does not exist" Apr 16 16:31:34.473474 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.473432 2584 scope.go:117] "RemoveContainer" containerID="1787c36e83b94b9a15d39fc171c0598613715561d816a56557a030524ebea7ea" Apr 16 16:31:34.473760 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:31:34.473738 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1787c36e83b94b9a15d39fc171c0598613715561d816a56557a030524ebea7ea\": container with ID starting with 1787c36e83b94b9a15d39fc171c0598613715561d816a56557a030524ebea7ea not found: ID does not exist" 
containerID="1787c36e83b94b9a15d39fc171c0598613715561d816a56557a030524ebea7ea" Apr 16 16:31:34.473869 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.473849 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1787c36e83b94b9a15d39fc171c0598613715561d816a56557a030524ebea7ea"} err="failed to get container status \"1787c36e83b94b9a15d39fc171c0598613715561d816a56557a030524ebea7ea\": rpc error: code = NotFound desc = could not find container \"1787c36e83b94b9a15d39fc171c0598613715561d816a56557a030524ebea7ea\": container with ID starting with 1787c36e83b94b9a15d39fc171c0598613715561d816a56557a030524ebea7ea not found: ID does not exist" Apr 16 16:31:34.474021 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.474000 2584 scope.go:117] "RemoveContainer" containerID="e603f96a26cf136b323c4ce7bc4054ca7776e1fb4846ba1f98407a262a06a61a" Apr 16 16:31:34.474285 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:31:34.474267 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e603f96a26cf136b323c4ce7bc4054ca7776e1fb4846ba1f98407a262a06a61a\": container with ID starting with e603f96a26cf136b323c4ce7bc4054ca7776e1fb4846ba1f98407a262a06a61a not found: ID does not exist" containerID="e603f96a26cf136b323c4ce7bc4054ca7776e1fb4846ba1f98407a262a06a61a" Apr 16 16:31:34.474375 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.474289 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e603f96a26cf136b323c4ce7bc4054ca7776e1fb4846ba1f98407a262a06a61a"} err="failed to get container status \"e603f96a26cf136b323c4ce7bc4054ca7776e1fb4846ba1f98407a262a06a61a\": rpc error: code = NotFound desc = could not find container \"e603f96a26cf136b323c4ce7bc4054ca7776e1fb4846ba1f98407a262a06a61a\": container with ID starting with e603f96a26cf136b323c4ce7bc4054ca7776e1fb4846ba1f98407a262a06a61a not found: ID does not exist" Apr 16 
16:31:34.475511 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.475494 2584 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-fd6b4b5445jqvx"] Apr 16 16:31:34.698200 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:31:34.698179 2584 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b886e25f-5306-437f-979e-e7491ad0f09c" path="/var/lib/kubelet/pods/b886e25f-5306-437f-979e-e7491ad0f09c/volumes" Apr 16 16:33:21.836977 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:21.836932 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6"] Apr 16 16:33:21.837390 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:21.837252 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b886e25f-5306-437f-979e-e7491ad0f09c" containerName="tokenizer" Apr 16 16:33:21.837390 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:21.837263 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="b886e25f-5306-437f-979e-e7491ad0f09c" containerName="tokenizer" Apr 16 16:33:21.837390 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:21.837278 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b886e25f-5306-437f-979e-e7491ad0f09c" containerName="main" Apr 16 16:33:21.837390 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:21.837284 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="b886e25f-5306-437f-979e-e7491ad0f09c" containerName="main" Apr 16 16:33:21.837390 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:21.837294 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b886e25f-5306-437f-979e-e7491ad0f09c" containerName="storage-initializer" Apr 16 16:33:21.837390 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:21.837300 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="b886e25f-5306-437f-979e-e7491ad0f09c" 
containerName="storage-initializer" Apr 16 16:33:21.837390 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:21.837386 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="b886e25f-5306-437f-979e-e7491ad0f09c" containerName="main" Apr 16 16:33:21.837674 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:21.837405 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="b886e25f-5306-437f-979e-e7491ad0f09c" containerName="tokenizer" Apr 16 16:33:21.840790 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:21.840613 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:21.844229 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:21.844208 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-bcpdv\"" Apr 16 16:33:21.844334 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:21.844207 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 16 16:33:21.844334 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:21.844212 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-2fdlg\"" Apr 16 16:33:21.850824 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:21.850801 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6"] Apr 16 16:33:21.937200 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:21.937171 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6\" (UID: 
\"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:21.937321 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:21.937208 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6\" (UID: \"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:21.937321 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:21.937235 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6\" (UID: \"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:21.937321 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:21.937284 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dnspr\" (UniqueName: \"kubernetes.io/projected/f35faaee-6719-4939-8ca7-f9cb1275f4db-kube-api-access-dnspr\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6\" (UID: \"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:21.937321 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:21.937309 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-tokenizer-tmp\") pod 
\"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6\" (UID: \"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:21.937459 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:21.937362 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f35faaee-6719-4939-8ca7-f9cb1275f4db-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6\" (UID: \"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:22.038412 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:22.038379 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6\" (UID: \"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:22.038581 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:22.038418 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6\" (UID: \"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:22.038623 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:22.038593 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dnspr\" (UniqueName: \"kubernetes.io/projected/f35faaee-6719-4939-8ca7-f9cb1275f4db-kube-api-access-dnspr\") pod 
\"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6\" (UID: \"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:22.038662 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:22.038625 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6\" (UID: \"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:22.038662 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:22.038655 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f35faaee-6719-4939-8ca7-f9cb1275f4db-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6\" (UID: \"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:22.038733 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:22.038718 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6\" (UID: \"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:22.038827 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:22.038804 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-kserve-provision-location\") pod 
\"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6\" (UID: \"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:22.038884 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:22.038834 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6\" (UID: \"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:22.039097 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:22.039033 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6\" (UID: \"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:22.039097 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:22.039047 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6\" (UID: \"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:22.041336 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:22.041317 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f35faaee-6719-4939-8ca7-f9cb1275f4db-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6\" (UID: 
\"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:22.047316 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:22.047295 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dnspr\" (UniqueName: \"kubernetes.io/projected/f35faaee-6719-4939-8ca7-f9cb1275f4db-kube-api-access-dnspr\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6\" (UID: \"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:22.151012 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:22.150913 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:22.281647 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:22.281608 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6"] Apr 16 16:33:22.284476 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:33:22.284449 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf35faaee_6719_4939_8ca7_f9cb1275f4db.slice/crio-800072d9c8948a0d3f8cac15cadcde98832dc66ca3898892f19c7f47b9248d16 WatchSource:0}: Error finding container 800072d9c8948a0d3f8cac15cadcde98832dc66ca3898892f19c7f47b9248d16: Status 404 returned error can't find the container with id 800072d9c8948a0d3f8cac15cadcde98832dc66ca3898892f19c7f47b9248d16 Apr 16 16:33:22.286683 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:22.286668 2584 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 16:33:22.792894 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:22.792853 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" event={"ID":"f35faaee-6719-4939-8ca7-f9cb1275f4db","Type":"ContainerStarted","Data":"84bc2f6069c456047f82998afe8298bba4333e18651b3ee0f869e70f9ee26d65"} Apr 16 16:33:22.792894 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:22.792888 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" event={"ID":"f35faaee-6719-4939-8ca7-f9cb1275f4db","Type":"ContainerStarted","Data":"800072d9c8948a0d3f8cac15cadcde98832dc66ca3898892f19c7f47b9248d16"} Apr 16 16:33:23.797415 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:23.797376 2584 generic.go:358] "Generic (PLEG): container finished" podID="f35faaee-6719-4939-8ca7-f9cb1275f4db" containerID="84bc2f6069c456047f82998afe8298bba4333e18651b3ee0f869e70f9ee26d65" exitCode=0 Apr 16 16:33:23.797812 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:23.797460 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" event={"ID":"f35faaee-6719-4939-8ca7-f9cb1275f4db","Type":"ContainerDied","Data":"84bc2f6069c456047f82998afe8298bba4333e18651b3ee0f869e70f9ee26d65"} Apr 16 16:33:24.802875 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:24.802837 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" event={"ID":"f35faaee-6719-4939-8ca7-f9cb1275f4db","Type":"ContainerStarted","Data":"ac295ceda2df2f8dea7a250df49e9b5cde2720866aaa6721ea77b1d48dad02f8"} Apr 16 16:33:24.802875 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:24.802876 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" 
event={"ID":"f35faaee-6719-4939-8ca7-f9cb1275f4db","Type":"ContainerStarted","Data":"2e5288b5f36cfaf3f183928a4234278bf6b83f83a74c4ed8238cf3bc1fbc00b4"} Apr 16 16:33:24.803345 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:24.803108 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:32.151538 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:32.151499 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:32.151538 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:32.151543 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:32.153923 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:32.153900 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:32.174741 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:32.174702 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" podStartSLOduration=11.174690619 podStartE2EDuration="11.174690619s" podCreationTimestamp="2026-04-16 16:33:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:33:24.824771793 +0000 UTC m=+1932.784963519" watchObservedRunningTime="2026-04-16 16:33:32.174690619 +0000 UTC m=+1940.134882342" Apr 16 16:33:32.836901 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:32.836872 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:33:53.836655 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:33:53.836630 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:35:19.189245 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:19.189214 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6"] Apr 16 16:35:19.189752 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:19.189555 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" podUID="f35faaee-6719-4939-8ca7-f9cb1275f4db" containerName="main" containerID="cri-o://2e5288b5f36cfaf3f183928a4234278bf6b83f83a74c4ed8238cf3bc1fbc00b4" gracePeriod=30 Apr 16 16:35:19.189752 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:19.189604 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" podUID="f35faaee-6719-4939-8ca7-f9cb1275f4db" containerName="tokenizer" containerID="cri-o://ac295ceda2df2f8dea7a250df49e9b5cde2720866aaa6721ea77b1d48dad02f8" gracePeriod=30 Apr 16 16:35:20.180112 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:20.180069 2584 generic.go:358] "Generic (PLEG): container finished" podID="f35faaee-6719-4939-8ca7-f9cb1275f4db" containerID="2e5288b5f36cfaf3f183928a4234278bf6b83f83a74c4ed8238cf3bc1fbc00b4" exitCode=0 Apr 16 16:35:20.180241 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:20.180136 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" 
event={"ID":"f35faaee-6719-4939-8ca7-f9cb1275f4db","Type":"ContainerDied","Data":"2e5288b5f36cfaf3f183928a4234278bf6b83f83a74c4ed8238cf3bc1fbc00b4"} Apr 16 16:35:20.328001 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:20.327979 2584 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:35:20.449653 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:20.449628 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-tokenizer-uds\") pod \"f35faaee-6719-4939-8ca7-f9cb1275f4db\" (UID: \"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " Apr 16 16:35:20.452194 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:20.449690 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-kserve-provision-location\") pod \"f35faaee-6719-4939-8ca7-f9cb1275f4db\" (UID: \"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " Apr 16 16:35:20.452194 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:20.449710 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f35faaee-6719-4939-8ca7-f9cb1275f4db-tls-certs\") pod \"f35faaee-6719-4939-8ca7-f9cb1275f4db\" (UID: \"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " Apr 16 16:35:20.452194 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:20.449733 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-tokenizer-cache\") pod \"f35faaee-6719-4939-8ca7-f9cb1275f4db\" (UID: \"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " Apr 16 16:35:20.452194 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:20.449751 2584 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-tokenizer-tmp\") pod \"f35faaee-6719-4939-8ca7-f9cb1275f4db\" (UID: \"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " Apr 16 16:35:20.452194 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:20.449783 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dnspr\" (UniqueName: \"kubernetes.io/projected/f35faaee-6719-4939-8ca7-f9cb1275f4db-kube-api-access-dnspr\") pod \"f35faaee-6719-4939-8ca7-f9cb1275f4db\" (UID: \"f35faaee-6719-4939-8ca7-f9cb1275f4db\") " Apr 16 16:35:20.452194 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:20.449854 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "f35faaee-6719-4939-8ca7-f9cb1275f4db" (UID: "f35faaee-6719-4939-8ca7-f9cb1275f4db"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:35:20.452194 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:20.450015 2584 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-tokenizer-uds\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:35:20.452194 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:20.450016 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "f35faaee-6719-4939-8ca7-f9cb1275f4db" (UID: "f35faaee-6719-4939-8ca7-f9cb1275f4db"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:35:20.452194 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:20.450109 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "f35faaee-6719-4939-8ca7-f9cb1275f4db" (UID: "f35faaee-6719-4939-8ca7-f9cb1275f4db"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:35:20.452194 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:20.450382 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f35faaee-6719-4939-8ca7-f9cb1275f4db" (UID: "f35faaee-6719-4939-8ca7-f9cb1275f4db"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:35:20.452194 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:20.451893 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f35faaee-6719-4939-8ca7-f9cb1275f4db-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f35faaee-6719-4939-8ca7-f9cb1275f4db" (UID: "f35faaee-6719-4939-8ca7-f9cb1275f4db"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:35:20.452194 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:20.451906 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f35faaee-6719-4939-8ca7-f9cb1275f4db-kube-api-access-dnspr" (OuterVolumeSpecName: "kube-api-access-dnspr") pod "f35faaee-6719-4939-8ca7-f9cb1275f4db" (UID: "f35faaee-6719-4939-8ca7-f9cb1275f4db"). InnerVolumeSpecName "kube-api-access-dnspr". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:35:20.550404 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:20.550382 2584 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-kserve-provision-location\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:35:20.550404 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:20.550403 2584 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f35faaee-6719-4939-8ca7-f9cb1275f4db-tls-certs\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:35:20.550529 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:20.550414 2584 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-tokenizer-cache\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:35:20.550529 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:20.550422 2584 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f35faaee-6719-4939-8ca7-f9cb1275f4db-tokenizer-tmp\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:35:20.550529 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:20.550430 2584 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dnspr\" (UniqueName: \"kubernetes.io/projected/f35faaee-6719-4939-8ca7-f9cb1275f4db-kube-api-access-dnspr\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:35:21.185211 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:21.185178 2584 generic.go:358] "Generic (PLEG): container finished" podID="f35faaee-6719-4939-8ca7-f9cb1275f4db" containerID="ac295ceda2df2f8dea7a250df49e9b5cde2720866aaa6721ea77b1d48dad02f8" exitCode=0 Apr 16 16:35:21.185371 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:21.185212 2584 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" event={"ID":"f35faaee-6719-4939-8ca7-f9cb1275f4db","Type":"ContainerDied","Data":"ac295ceda2df2f8dea7a250df49e9b5cde2720866aaa6721ea77b1d48dad02f8"} Apr 16 16:35:21.185371 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:21.185257 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" event={"ID":"f35faaee-6719-4939-8ca7-f9cb1275f4db","Type":"ContainerDied","Data":"800072d9c8948a0d3f8cac15cadcde98832dc66ca3898892f19c7f47b9248d16"} Apr 16 16:35:21.185371 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:21.185259 2584 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6" Apr 16 16:35:21.185371 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:21.185273 2584 scope.go:117] "RemoveContainer" containerID="ac295ceda2df2f8dea7a250df49e9b5cde2720866aaa6721ea77b1d48dad02f8" Apr 16 16:35:21.193400 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:21.193382 2584 scope.go:117] "RemoveContainer" containerID="2e5288b5f36cfaf3f183928a4234278bf6b83f83a74c4ed8238cf3bc1fbc00b4" Apr 16 16:35:21.200412 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:21.200393 2584 scope.go:117] "RemoveContainer" containerID="84bc2f6069c456047f82998afe8298bba4333e18651b3ee0f869e70f9ee26d65" Apr 16 16:35:21.202910 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:21.202889 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6"] Apr 16 16:35:21.208129 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:21.208106 2584 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-d5bb5r7zh6"] Apr 16 16:35:21.210869 ip-10-0-131-24 
kubenswrapper[2584]: I0416 16:35:21.208990 2584 scope.go:117] "RemoveContainer" containerID="ac295ceda2df2f8dea7a250df49e9b5cde2720866aaa6721ea77b1d48dad02f8" Apr 16 16:35:21.211084 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:35:21.211064 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac295ceda2df2f8dea7a250df49e9b5cde2720866aaa6721ea77b1d48dad02f8\": container with ID starting with ac295ceda2df2f8dea7a250df49e9b5cde2720866aaa6721ea77b1d48dad02f8 not found: ID does not exist" containerID="ac295ceda2df2f8dea7a250df49e9b5cde2720866aaa6721ea77b1d48dad02f8" Apr 16 16:35:21.211148 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:21.211097 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac295ceda2df2f8dea7a250df49e9b5cde2720866aaa6721ea77b1d48dad02f8"} err="failed to get container status \"ac295ceda2df2f8dea7a250df49e9b5cde2720866aaa6721ea77b1d48dad02f8\": rpc error: code = NotFound desc = could not find container \"ac295ceda2df2f8dea7a250df49e9b5cde2720866aaa6721ea77b1d48dad02f8\": container with ID starting with ac295ceda2df2f8dea7a250df49e9b5cde2720866aaa6721ea77b1d48dad02f8 not found: ID does not exist" Apr 16 16:35:21.211148 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:21.211115 2584 scope.go:117] "RemoveContainer" containerID="2e5288b5f36cfaf3f183928a4234278bf6b83f83a74c4ed8238cf3bc1fbc00b4" Apr 16 16:35:21.211348 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:35:21.211330 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e5288b5f36cfaf3f183928a4234278bf6b83f83a74c4ed8238cf3bc1fbc00b4\": container with ID starting with 2e5288b5f36cfaf3f183928a4234278bf6b83f83a74c4ed8238cf3bc1fbc00b4 not found: ID does not exist" containerID="2e5288b5f36cfaf3f183928a4234278bf6b83f83a74c4ed8238cf3bc1fbc00b4" Apr 16 16:35:21.211430 ip-10-0-131-24 kubenswrapper[2584]: I0416 
16:35:21.211356 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e5288b5f36cfaf3f183928a4234278bf6b83f83a74c4ed8238cf3bc1fbc00b4"} err="failed to get container status \"2e5288b5f36cfaf3f183928a4234278bf6b83f83a74c4ed8238cf3bc1fbc00b4\": rpc error: code = NotFound desc = could not find container \"2e5288b5f36cfaf3f183928a4234278bf6b83f83a74c4ed8238cf3bc1fbc00b4\": container with ID starting with 2e5288b5f36cfaf3f183928a4234278bf6b83f83a74c4ed8238cf3bc1fbc00b4 not found: ID does not exist" Apr 16 16:35:21.211430 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:21.211374 2584 scope.go:117] "RemoveContainer" containerID="84bc2f6069c456047f82998afe8298bba4333e18651b3ee0f869e70f9ee26d65" Apr 16 16:35:21.211612 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:35:21.211595 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84bc2f6069c456047f82998afe8298bba4333e18651b3ee0f869e70f9ee26d65\": container with ID starting with 84bc2f6069c456047f82998afe8298bba4333e18651b3ee0f869e70f9ee26d65 not found: ID does not exist" containerID="84bc2f6069c456047f82998afe8298bba4333e18651b3ee0f869e70f9ee26d65" Apr 16 16:35:21.211676 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:21.211616 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84bc2f6069c456047f82998afe8298bba4333e18651b3ee0f869e70f9ee26d65"} err="failed to get container status \"84bc2f6069c456047f82998afe8298bba4333e18651b3ee0f869e70f9ee26d65\": rpc error: code = NotFound desc = could not find container \"84bc2f6069c456047f82998afe8298bba4333e18651b3ee0f869e70f9ee26d65\": container with ID starting with 84bc2f6069c456047f82998afe8298bba4333e18651b3ee0f869e70f9ee26d65 not found: ID does not exist" Apr 16 16:35:22.699849 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:22.699817 2584 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="f35faaee-6719-4939-8ca7-f9cb1275f4db" path="/var/lib/kubelet/pods/f35faaee-6719-4939-8ca7-f9cb1275f4db/volumes" Apr 16 16:35:34.623845 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.623428 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5"] Apr 16 16:35:34.624387 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.623929 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f35faaee-6719-4939-8ca7-f9cb1275f4db" containerName="tokenizer" Apr 16 16:35:34.624387 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.623947 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="f35faaee-6719-4939-8ca7-f9cb1275f4db" containerName="tokenizer" Apr 16 16:35:34.624387 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.624008 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f35faaee-6719-4939-8ca7-f9cb1275f4db" containerName="main" Apr 16 16:35:34.624387 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.624017 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="f35faaee-6719-4939-8ca7-f9cb1275f4db" containerName="main" Apr 16 16:35:34.624387 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.624037 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f35faaee-6719-4939-8ca7-f9cb1275f4db" containerName="storage-initializer" Apr 16 16:35:34.624387 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.624046 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="f35faaee-6719-4939-8ca7-f9cb1275f4db" containerName="storage-initializer" Apr 16 16:35:34.624387 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.624124 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="f35faaee-6719-4939-8ca7-f9cb1275f4db" containerName="main" Apr 16 16:35:34.624387 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.624138 2584 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="f35faaee-6719-4939-8ca7-f9cb1275f4db" containerName="tokenizer" Apr 16 16:35:34.629212 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.629193 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:34.632654 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.632635 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-2fdlg\"" Apr 16 16:35:34.632766 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.632703 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 16 16:35:34.636839 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.636730 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5"] Apr 16 16:35:34.755476 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.755433 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-dshm\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5\" (UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:34.755644 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.755480 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqcwq\" (UniqueName: \"kubernetes.io/projected/924a69ca-4621-44a4-9521-2c3060cc36f9-kube-api-access-gqcwq\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5\" (UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:34.755644 ip-10-0-131-24 kubenswrapper[2584]: I0416 
16:35:34.755599 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5\" (UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:34.755781 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.755658 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-model-cache\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5\" (UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:34.755781 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.755735 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/924a69ca-4621-44a4-9521-2c3060cc36f9-tls-certs\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5\" (UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:34.755781 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.755770 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-home\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5\" (UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:34.827293 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.827264 2584 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd"] Apr 16 16:35:34.831181 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.831161 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:34.833531 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.833509 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-n9rd8\"" Apr 16 16:35:34.843561 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.843540 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd"] Apr 16 16:35:34.856603 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.856581 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5\" (UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:34.856702 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.856629 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-model-cache\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5\" (UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:34.856769 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.856712 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/924a69ca-4621-44a4-9521-2c3060cc36f9-tls-certs\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5\" (UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:34.856769 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.856760 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-home\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5\" (UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:34.856872 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.856789 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-dshm\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5\" (UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:34.856872 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.856823 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gqcwq\" (UniqueName: \"kubernetes.io/projected/924a69ca-4621-44a4-9521-2c3060cc36f9-kube-api-access-gqcwq\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5\" (UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:34.857005 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.856940 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5\" (UID: 
\"924a69ca-4621-44a4-9521-2c3060cc36f9\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:34.857069 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.857048 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-model-cache\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5\" (UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:34.857118 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.857098 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-home\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5\" (UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:34.859088 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.859059 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-dshm\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5\" (UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:34.859354 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.859336 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/924a69ca-4621-44a4-9521-2c3060cc36f9-tls-certs\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5\" (UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:34.868211 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.868188 2584 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqcwq\" (UniqueName: \"kubernetes.io/projected/924a69ca-4621-44a4-9521-2c3060cc36f9-kube-api-access-gqcwq\") pod \"scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5\" (UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:34.941159 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.941136 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:34.958038 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.958016 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:34.958139 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.958065 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:34.958139 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.958106 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-kserve-provision-location\") pod 
\"scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:34.958351 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.958153 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5092080e-c8c1-42b1-84ac-478f8684af0a-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:34.958351 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.958180 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:34.958351 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:34.958218 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jp5tq\" (UniqueName: \"kubernetes.io/projected/5092080e-c8c1-42b1-84ac-478f8684af0a-kube-api-access-jp5tq\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:35.059697 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:35.059666 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:35.059851 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:35.059715 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5092080e-c8c1-42b1-84ac-478f8684af0a-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:35.059851 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:35.059738 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:35.059851 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:35.059771 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jp5tq\" (UniqueName: \"kubernetes.io/projected/5092080e-c8c1-42b1-84ac-478f8684af0a-kube-api-access-jp5tq\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:35.059851 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:35.059826 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:35.060049 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:35.059855 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:35.060176 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:35.060152 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:35.060238 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:35.060196 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:35.060294 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:35.060263 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:35.060342 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:35.060291 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:35.062277 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:35.062259 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5092080e-c8c1-42b1-84ac-478f8684af0a-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:35.072065 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:35.072048 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jp5tq\" (UniqueName: \"kubernetes.io/projected/5092080e-c8c1-42b1-84ac-478f8684af0a-kube-api-access-jp5tq\") pod \"scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:35.072262 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:35.072245 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5"] Apr 16 16:35:35.073365 ip-10-0-131-24 
kubenswrapper[2584]: W0416 16:35:35.073335 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod924a69ca_4621_44a4_9521_2c3060cc36f9.slice/crio-a43242bba3b9b70a09963c992da79cc245686a12528fe30d19b0858d6a99bec8 WatchSource:0}: Error finding container a43242bba3b9b70a09963c992da79cc245686a12528fe30d19b0858d6a99bec8: Status 404 returned error can't find the container with id a43242bba3b9b70a09963c992da79cc245686a12528fe30d19b0858d6a99bec8 Apr 16 16:35:35.141495 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:35.141476 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:35.233157 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:35.233120 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" event={"ID":"924a69ca-4621-44a4-9521-2c3060cc36f9","Type":"ContainerStarted","Data":"429e5c448d29f020b59be7a3d44ab2321930f5a01cbe63eb44b30741ab09344d"} Apr 16 16:35:35.233303 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:35.233171 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" event={"ID":"924a69ca-4621-44a4-9521-2c3060cc36f9","Type":"ContainerStarted","Data":"a43242bba3b9b70a09963c992da79cc245686a12528fe30d19b0858d6a99bec8"} Apr 16 16:35:35.268516 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:35.268491 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd"] Apr 16 16:35:35.271087 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:35:35.271063 2584 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5092080e_c8c1_42b1_84ac_478f8684af0a.slice/crio-3bbe2903a3483fbc19112191daf7fb4d90a49f87971d35b4cba69af63da42357 WatchSource:0}: Error finding container 3bbe2903a3483fbc19112191daf7fb4d90a49f87971d35b4cba69af63da42357: Status 404 returned error can't find the container with id 3bbe2903a3483fbc19112191daf7fb4d90a49f87971d35b4cba69af63da42357 Apr 16 16:35:36.238874 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:36.238835 2584 generic.go:358] "Generic (PLEG): container finished" podID="5092080e-c8c1-42b1-84ac-478f8684af0a" containerID="04fd8d3f09c918c5e69654be16ab0887c6f1c3889857950084e16466a86aab8d" exitCode=0 Apr 16 16:35:36.239400 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:36.238930 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" event={"ID":"5092080e-c8c1-42b1-84ac-478f8684af0a","Type":"ContainerDied","Data":"04fd8d3f09c918c5e69654be16ab0887c6f1c3889857950084e16466a86aab8d"} Apr 16 16:35:36.239400 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:36.238994 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" event={"ID":"5092080e-c8c1-42b1-84ac-478f8684af0a","Type":"ContainerStarted","Data":"3bbe2903a3483fbc19112191daf7fb4d90a49f87971d35b4cba69af63da42357"} Apr 16 16:35:37.244497 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:37.244459 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" event={"ID":"5092080e-c8c1-42b1-84ac-478f8684af0a","Type":"ContainerStarted","Data":"022d856de7a80b86efd7481e73dba40199e2d5ad0c5e586d7cfb30fd39f05604"} Apr 16 16:35:37.244497 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:37.244498 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" event={"ID":"5092080e-c8c1-42b1-84ac-478f8684af0a","Type":"ContainerStarted","Data":"a700a582b091cc6a566e16b4ae5acc168ac776e8ba90e32d3a374edf1890af64"} Apr 16 16:35:37.244908 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:37.244689 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:37.270881 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:37.270825 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" podStartSLOduration=3.270809225 podStartE2EDuration="3.270809225s" podCreationTimestamp="2026-04-16 16:35:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:35:37.268147593 +0000 UTC m=+2065.228339317" watchObservedRunningTime="2026-04-16 16:35:37.270809225 +0000 UTC m=+2065.231000947" Apr 16 16:35:40.259098 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:40.259060 2584 generic.go:358] "Generic (PLEG): container finished" podID="924a69ca-4621-44a4-9521-2c3060cc36f9" containerID="429e5c448d29f020b59be7a3d44ab2321930f5a01cbe63eb44b30741ab09344d" exitCode=0 Apr 16 16:35:40.259461 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:40.259134 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" event={"ID":"924a69ca-4621-44a4-9521-2c3060cc36f9","Type":"ContainerDied","Data":"429e5c448d29f020b59be7a3d44ab2321930f5a01cbe63eb44b30741ab09344d"} Apr 16 16:35:41.264889 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:41.264828 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" 
event={"ID":"924a69ca-4621-44a4-9521-2c3060cc36f9","Type":"ContainerStarted","Data":"140fe239f17661a8c3cce1d433da0611b924b36f7843ee193226cdf99cf19aaf"} Apr 16 16:35:41.284760 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:41.284710 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" podStartSLOduration=7.284696227 podStartE2EDuration="7.284696227s" podCreationTimestamp="2026-04-16 16:35:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:35:41.283799868 +0000 UTC m=+2069.243991593" watchObservedRunningTime="2026-04-16 16:35:41.284696227 +0000 UTC m=+2069.244887950" Apr 16 16:35:44.941878 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:44.941850 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:44.942294 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:44.941908 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:44.953836 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:44.953817 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:35:45.142644 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:45.142620 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:45.142791 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:45.142759 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:45.144913 
ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:45.144896 2584 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:45.279813 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:45.279747 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:35:45.289885 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:35:45.289866 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:36:07.288380 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:07.288348 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:36:08.253147 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.253114 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5"] Apr 16 16:36:08.253412 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.253376 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" podUID="924a69ca-4621-44a4-9521-2c3060cc36f9" containerName="main" containerID="cri-o://140fe239f17661a8c3cce1d433da0611b924b36f7843ee193226cdf99cf19aaf" gracePeriod=30 Apr 16 16:36:08.262512 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.262487 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd"] Apr 16 16:36:08.262770 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.262752 2584 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" podUID="5092080e-c8c1-42b1-84ac-478f8684af0a" containerName="main" containerID="cri-o://a700a582b091cc6a566e16b4ae5acc168ac776e8ba90e32d3a374edf1890af64" gracePeriod=30 Apr 16 16:36:08.262933 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.262862 2584 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" podUID="5092080e-c8c1-42b1-84ac-478f8684af0a" containerName="tokenizer" containerID="cri-o://022d856de7a80b86efd7481e73dba40199e2d5ad0c5e586d7cfb30fd39f05604" gracePeriod=30 Apr 16 16:36:08.505222 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.505169 2584 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:36:08.643108 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.643061 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-kserve-provision-location\") pod \"924a69ca-4621-44a4-9521-2c3060cc36f9\" (UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " Apr 16 16:36:08.643301 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.643160 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-dshm\") pod \"924a69ca-4621-44a4-9521-2c3060cc36f9\" (UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " Apr 16 16:36:08.643301 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.643193 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqcwq\" (UniqueName: \"kubernetes.io/projected/924a69ca-4621-44a4-9521-2c3060cc36f9-kube-api-access-gqcwq\") pod \"924a69ca-4621-44a4-9521-2c3060cc36f9\" 
(UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " Apr 16 16:36:08.643301 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.643228 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-home\") pod \"924a69ca-4621-44a4-9521-2c3060cc36f9\" (UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " Apr 16 16:36:08.643484 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.643359 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/924a69ca-4621-44a4-9521-2c3060cc36f9-tls-certs\") pod \"924a69ca-4621-44a4-9521-2c3060cc36f9\" (UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " Apr 16 16:36:08.643484 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.643410 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-model-cache\") pod \"924a69ca-4621-44a4-9521-2c3060cc36f9\" (UID: \"924a69ca-4621-44a4-9521-2c3060cc36f9\") " Apr 16 16:36:08.643484 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.643448 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-home" (OuterVolumeSpecName: "home") pod "924a69ca-4621-44a4-9521-2c3060cc36f9" (UID: "924a69ca-4621-44a4-9521-2c3060cc36f9"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:36:08.643710 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.643681 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-model-cache" (OuterVolumeSpecName: "model-cache") pod "924a69ca-4621-44a4-9521-2c3060cc36f9" (UID: "924a69ca-4621-44a4-9521-2c3060cc36f9"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:36:08.643803 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.643784 2584 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-home\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:36:08.643870 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.643809 2584 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-model-cache\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:36:08.645555 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.645522 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-dshm" (OuterVolumeSpecName: "dshm") pod "924a69ca-4621-44a4-9521-2c3060cc36f9" (UID: "924a69ca-4621-44a4-9521-2c3060cc36f9"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:36:08.645676 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.645610 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/924a69ca-4621-44a4-9521-2c3060cc36f9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "924a69ca-4621-44a4-9521-2c3060cc36f9" (UID: "924a69ca-4621-44a4-9521-2c3060cc36f9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:36:08.645676 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.645610 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/924a69ca-4621-44a4-9521-2c3060cc36f9-kube-api-access-gqcwq" (OuterVolumeSpecName: "kube-api-access-gqcwq") pod "924a69ca-4621-44a4-9521-2c3060cc36f9" (UID: "924a69ca-4621-44a4-9521-2c3060cc36f9"). InnerVolumeSpecName "kube-api-access-gqcwq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:36:08.706617 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.706573 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "924a69ca-4621-44a4-9521-2c3060cc36f9" (UID: "924a69ca-4621-44a4-9521-2c3060cc36f9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:36:08.744560 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.744538 2584 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-dshm\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:36:08.744560 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.744562 2584 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gqcwq\" (UniqueName: \"kubernetes.io/projected/924a69ca-4621-44a4-9521-2c3060cc36f9-kube-api-access-gqcwq\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:36:08.744698 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.744572 2584 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/924a69ca-4621-44a4-9521-2c3060cc36f9-tls-certs\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:36:08.744698 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:08.744584 2584 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/924a69ca-4621-44a4-9521-2c3060cc36f9-kserve-provision-location\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:36:09.361056 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.361026 2584 generic.go:358] "Generic (PLEG): container finished" podID="5092080e-c8c1-42b1-84ac-478f8684af0a" 
containerID="022d856de7a80b86efd7481e73dba40199e2d5ad0c5e586d7cfb30fd39f05604" exitCode=0 Apr 16 16:36:09.361056 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.361049 2584 generic.go:358] "Generic (PLEG): container finished" podID="5092080e-c8c1-42b1-84ac-478f8684af0a" containerID="a700a582b091cc6a566e16b4ae5acc168ac776e8ba90e32d3a374edf1890af64" exitCode=0 Apr 16 16:36:09.361262 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.361101 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" event={"ID":"5092080e-c8c1-42b1-84ac-478f8684af0a","Type":"ContainerDied","Data":"022d856de7a80b86efd7481e73dba40199e2d5ad0c5e586d7cfb30fd39f05604"} Apr 16 16:36:09.361262 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.361146 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" event={"ID":"5092080e-c8c1-42b1-84ac-478f8684af0a","Type":"ContainerDied","Data":"a700a582b091cc6a566e16b4ae5acc168ac776e8ba90e32d3a374edf1890af64"} Apr 16 16:36:09.362401 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.362375 2584 generic.go:358] "Generic (PLEG): container finished" podID="924a69ca-4621-44a4-9521-2c3060cc36f9" containerID="140fe239f17661a8c3cce1d433da0611b924b36f7843ee193226cdf99cf19aaf" exitCode=0 Apr 16 16:36:09.362528 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.362413 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" event={"ID":"924a69ca-4621-44a4-9521-2c3060cc36f9","Type":"ContainerDied","Data":"140fe239f17661a8c3cce1d433da0611b924b36f7843ee193226cdf99cf19aaf"} Apr 16 16:36:09.362528 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.362437 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" 
event={"ID":"924a69ca-4621-44a4-9521-2c3060cc36f9","Type":"ContainerDied","Data":"a43242bba3b9b70a09963c992da79cc245686a12528fe30d19b0858d6a99bec8"} Apr 16 16:36:09.362528 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.362450 2584 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5" Apr 16 16:36:09.362528 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.362457 2584 scope.go:117] "RemoveContainer" containerID="140fe239f17661a8c3cce1d433da0611b924b36f7843ee193226cdf99cf19aaf" Apr 16 16:36:09.371808 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.371793 2584 scope.go:117] "RemoveContainer" containerID="429e5c448d29f020b59be7a3d44ab2321930f5a01cbe63eb44b30741ab09344d" Apr 16 16:36:09.385696 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.385670 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5"] Apr 16 16:36:09.388493 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.388473 2584 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-7585bb6cdb-ttjr5"] Apr 16 16:36:09.396420 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.396388 2584 scope.go:117] "RemoveContainer" containerID="140fe239f17661a8c3cce1d433da0611b924b36f7843ee193226cdf99cf19aaf" Apr 16 16:36:09.396686 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:36:09.396663 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"140fe239f17661a8c3cce1d433da0611b924b36f7843ee193226cdf99cf19aaf\": container with ID starting with 140fe239f17661a8c3cce1d433da0611b924b36f7843ee193226cdf99cf19aaf not found: ID does not exist" containerID="140fe239f17661a8c3cce1d433da0611b924b36f7843ee193226cdf99cf19aaf" Apr 16 16:36:09.396758 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.396696 2584 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"140fe239f17661a8c3cce1d433da0611b924b36f7843ee193226cdf99cf19aaf"} err="failed to get container status \"140fe239f17661a8c3cce1d433da0611b924b36f7843ee193226cdf99cf19aaf\": rpc error: code = NotFound desc = could not find container \"140fe239f17661a8c3cce1d433da0611b924b36f7843ee193226cdf99cf19aaf\": container with ID starting with 140fe239f17661a8c3cce1d433da0611b924b36f7843ee193226cdf99cf19aaf not found: ID does not exist" Apr 16 16:36:09.396758 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.396713 2584 scope.go:117] "RemoveContainer" containerID="429e5c448d29f020b59be7a3d44ab2321930f5a01cbe63eb44b30741ab09344d" Apr 16 16:36:09.396919 ip-10-0-131-24 kubenswrapper[2584]: E0416 16:36:09.396904 2584 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"429e5c448d29f020b59be7a3d44ab2321930f5a01cbe63eb44b30741ab09344d\": container with ID starting with 429e5c448d29f020b59be7a3d44ab2321930f5a01cbe63eb44b30741ab09344d not found: ID does not exist" containerID="429e5c448d29f020b59be7a3d44ab2321930f5a01cbe63eb44b30741ab09344d" Apr 16 16:36:09.397015 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.396921 2584 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"429e5c448d29f020b59be7a3d44ab2321930f5a01cbe63eb44b30741ab09344d"} err="failed to get container status \"429e5c448d29f020b59be7a3d44ab2321930f5a01cbe63eb44b30741ab09344d\": rpc error: code = NotFound desc = could not find container \"429e5c448d29f020b59be7a3d44ab2321930f5a01cbe63eb44b30741ab09344d\": container with ID starting with 429e5c448d29f020b59be7a3d44ab2321930f5a01cbe63eb44b30741ab09344d not found: ID does not exist" Apr 16 16:36:09.448341 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.448323 2584 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:36:09.552505 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.552452 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-tokenizer-tmp\") pod \"5092080e-c8c1-42b1-84ac-478f8684af0a\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " Apr 16 16:36:09.552813 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.552520 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-tokenizer-uds\") pod \"5092080e-c8c1-42b1-84ac-478f8684af0a\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " Apr 16 16:36:09.552813 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.552545 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jp5tq\" (UniqueName: \"kubernetes.io/projected/5092080e-c8c1-42b1-84ac-478f8684af0a-kube-api-access-jp5tq\") pod \"5092080e-c8c1-42b1-84ac-478f8684af0a\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " Apr 16 16:36:09.552813 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.552579 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5092080e-c8c1-42b1-84ac-478f8684af0a-tls-certs\") pod \"5092080e-c8c1-42b1-84ac-478f8684af0a\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " Apr 16 16:36:09.552813 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.552601 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-tokenizer-cache\") pod \"5092080e-c8c1-42b1-84ac-478f8684af0a\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " Apr 16 16:36:09.552813 
ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.552631 2584 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-kserve-provision-location\") pod \"5092080e-c8c1-42b1-84ac-478f8684af0a\" (UID: \"5092080e-c8c1-42b1-84ac-478f8684af0a\") " Apr 16 16:36:09.552813 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.552772 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "5092080e-c8c1-42b1-84ac-478f8684af0a" (UID: "5092080e-c8c1-42b1-84ac-478f8684af0a"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:36:09.552813 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.552810 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "5092080e-c8c1-42b1-84ac-478f8684af0a" (UID: "5092080e-c8c1-42b1-84ac-478f8684af0a"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:36:09.553187 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.552922 2584 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-tokenizer-tmp\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:36:09.553187 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.552939 2584 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-tokenizer-uds\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:36:09.553187 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.552979 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "5092080e-c8c1-42b1-84ac-478f8684af0a" (UID: "5092080e-c8c1-42b1-84ac-478f8684af0a"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:36:09.553495 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.553472 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5092080e-c8c1-42b1-84ac-478f8684af0a" (UID: "5092080e-c8c1-42b1-84ac-478f8684af0a"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 16:36:09.554846 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.554827 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5092080e-c8c1-42b1-84ac-478f8684af0a-kube-api-access-jp5tq" (OuterVolumeSpecName: "kube-api-access-jp5tq") pod "5092080e-c8c1-42b1-84ac-478f8684af0a" (UID: "5092080e-c8c1-42b1-84ac-478f8684af0a"). InnerVolumeSpecName "kube-api-access-jp5tq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 16:36:09.554895 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.554876 2584 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5092080e-c8c1-42b1-84ac-478f8684af0a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5092080e-c8c1-42b1-84ac-478f8684af0a" (UID: "5092080e-c8c1-42b1-84ac-478f8684af0a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 16:36:09.653660 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.653637 2584 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jp5tq\" (UniqueName: \"kubernetes.io/projected/5092080e-c8c1-42b1-84ac-478f8684af0a-kube-api-access-jp5tq\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:36:09.653660 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.653661 2584 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5092080e-c8c1-42b1-84ac-478f8684af0a-tls-certs\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:36:09.653802 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.653676 2584 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-tokenizer-cache\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:36:09.653802 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:09.653688 2584 
reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5092080e-c8c1-42b1-84ac-478f8684af0a-kserve-provision-location\") on node \"ip-10-0-131-24.ec2.internal\" DevicePath \"\"" Apr 16 16:36:10.371298 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:10.371262 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" event={"ID":"5092080e-c8c1-42b1-84ac-478f8684af0a","Type":"ContainerDied","Data":"3bbe2903a3483fbc19112191daf7fb4d90a49f87971d35b4cba69af63da42357"} Apr 16 16:36:10.371544 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:10.371526 2584 scope.go:117] "RemoveContainer" containerID="022d856de7a80b86efd7481e73dba40199e2d5ad0c5e586d7cfb30fd39f05604" Apr 16 16:36:10.372567 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:10.372515 2584 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd" Apr 16 16:36:10.380791 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:10.380774 2584 scope.go:117] "RemoveContainer" containerID="a700a582b091cc6a566e16b4ae5acc168ac776e8ba90e32d3a374edf1890af64" Apr 16 16:36:10.387831 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:10.387816 2584 scope.go:117] "RemoveContainer" containerID="04fd8d3f09c918c5e69654be16ab0887c6f1c3889857950084e16466a86aab8d" Apr 16 16:36:10.394056 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:10.394029 2584 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd"] Apr 16 16:36:10.397849 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:10.397828 2584 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-677c8z6dwd"] Apr 16 16:36:10.698792 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:10.698760 
2584 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5092080e-c8c1-42b1-84ac-478f8684af0a" path="/var/lib/kubelet/pods/5092080e-c8c1-42b1-84ac-478f8684af0a/volumes" Apr 16 16:36:10.699223 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:10.699208 2584 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="924a69ca-4621-44a4-9521-2c3060cc36f9" path="/var/lib/kubelet/pods/924a69ca-4621-44a4-9521-2c3060cc36f9/volumes" Apr 16 16:36:12.717830 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:12.717797 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/ovn-acl-logging/0.log" Apr 16 16:36:12.721015 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:12.720991 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/ovn-acl-logging/0.log" Apr 16 16:36:23.803534 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:23.803506 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4lthw_74131d38-6ea6-4653-9bc8-d1804ed203fc/istio-proxy/0.log" Apr 16 16:36:23.830609 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:23.830590 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-hwts9_c68f54b4-fbf0-4f39-aa9d-980a4ba029f3/istio-proxy/0.log" Apr 16 16:36:24.747984 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:24.747940 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4lthw_74131d38-6ea6-4653-9bc8-d1804ed203fc/istio-proxy/0.log" Apr 16 16:36:24.764499 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:24.764478 2584 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-hwts9_c68f54b4-fbf0-4f39-aa9d-980a4ba029f3/istio-proxy/0.log" Apr 16 16:36:25.686114 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:25.686091 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4lthw_74131d38-6ea6-4653-9bc8-d1804ed203fc/istio-proxy/0.log" Apr 16 16:36:25.701266 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:25.701242 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-hwts9_c68f54b4-fbf0-4f39-aa9d-980a4ba029f3/istio-proxy/0.log" Apr 16 16:36:26.625155 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:26.625126 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4lthw_74131d38-6ea6-4653-9bc8-d1804ed203fc/istio-proxy/0.log" Apr 16 16:36:26.642160 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:26.642133 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-hwts9_c68f54b4-fbf0-4f39-aa9d-980a4ba029f3/istio-proxy/0.log" Apr 16 16:36:27.519522 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:27.519494 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4lthw_74131d38-6ea6-4653-9bc8-d1804ed203fc/istio-proxy/0.log" Apr 16 16:36:27.534397 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:27.534376 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-hwts9_c68f54b4-fbf0-4f39-aa9d-980a4ba029f3/istio-proxy/0.log" Apr 16 16:36:28.428204 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:28.428179 2584 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4lthw_74131d38-6ea6-4653-9bc8-d1804ed203fc/istio-proxy/0.log" Apr 16 16:36:28.442394 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:28.442374 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-hwts9_c68f54b4-fbf0-4f39-aa9d-980a4ba029f3/istio-proxy/0.log" Apr 16 16:36:29.361946 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:29.361917 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4lthw_74131d38-6ea6-4653-9bc8-d1804ed203fc/istio-proxy/0.log" Apr 16 16:36:29.377013 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:29.376990 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-hwts9_c68f54b4-fbf0-4f39-aa9d-980a4ba029f3/istio-proxy/0.log" Apr 16 16:36:30.294742 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:30.294710 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4lthw_74131d38-6ea6-4653-9bc8-d1804ed203fc/istio-proxy/0.log" Apr 16 16:36:30.308918 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:30.308894 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-hwts9_c68f54b4-fbf0-4f39-aa9d-980a4ba029f3/istio-proxy/0.log" Apr 16 16:36:31.212035 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:31.212010 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4lthw_74131d38-6ea6-4653-9bc8-d1804ed203fc/istio-proxy/0.log" Apr 16 16:36:31.228230 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:31.228206 2584 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-hwts9_c68f54b4-fbf0-4f39-aa9d-980a4ba029f3/istio-proxy/0.log" Apr 16 16:36:32.207152 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:32.207126 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4lthw_74131d38-6ea6-4653-9bc8-d1804ed203fc/istio-proxy/0.log" Apr 16 16:36:32.222417 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:32.222402 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-hwts9_c68f54b4-fbf0-4f39-aa9d-980a4ba029f3/istio-proxy/0.log" Apr 16 16:36:33.142233 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:33.142206 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4lthw_74131d38-6ea6-4653-9bc8-d1804ed203fc/istio-proxy/0.log" Apr 16 16:36:33.156111 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:33.156092 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-hwts9_c68f54b4-fbf0-4f39-aa9d-980a4ba029f3/istio-proxy/0.log" Apr 16 16:36:34.041835 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:34.041806 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4lthw_74131d38-6ea6-4653-9bc8-d1804ed203fc/istio-proxy/0.log" Apr 16 16:36:34.059540 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:34.059520 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-hwts9_c68f54b4-fbf0-4f39-aa9d-980a4ba029f3/istio-proxy/0.log" Apr 16 16:36:34.993267 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:34.993241 2584 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4lthw_74131d38-6ea6-4653-9bc8-d1804ed203fc/istio-proxy/0.log" Apr 16 16:36:35.022988 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:35.022947 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-hwts9_c68f54b4-fbf0-4f39-aa9d-980a4ba029f3/istio-proxy/0.log" Apr 16 16:36:35.939097 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:35.939069 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-4lthw_74131d38-6ea6-4653-9bc8-d1804ed203fc/istio-proxy/0.log" Apr 16 16:36:35.964251 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:35.964229 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-2-openshift-default-6866b85949-hwts9_c68f54b4-fbf0-4f39-aa9d-980a4ba029f3/istio-proxy/0.log" Apr 16 16:36:36.926837 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:36.926809 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-wv6xc_632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f/istio-proxy/0.log" Apr 16 16:36:37.743899 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:37.743868 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-wv6xc_632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f/istio-proxy/0.log" Apr 16 16:36:38.520944 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:38.520917 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-bkrvd_d741c8d3-f544-45b6-95fe-2fab2f9d8bf6/authorino/0.log" Apr 16 16:36:38.536185 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:38.536161 2584 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-skggk_cfb990b1-22b4-4fc6-be44-09cd1c9aeadb/manager/0.log" Apr 16 16:36:44.367258 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:44.367224 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-d48fn_01732c3c-7221-4102-8699-6e097f947672/global-pull-secret-syncer/0.log" Apr 16 16:36:44.489760 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:44.489726 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-mx7rd_a7e8c4ea-3157-441e-873c-cf283ecb2c2a/konnectivity-agent/0.log" Apr 16 16:36:44.538418 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:44.538396 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-131-24.ec2.internal_bb825418a7522e6122bfaed620c21321/haproxy/0.log" Apr 16 16:36:48.430895 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:48.430867 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-bkrvd_d741c8d3-f544-45b6-95fe-2fab2f9d8bf6/authorino/0.log" Apr 16 16:36:48.463462 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:48.463440 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-skggk_cfb990b1-22b4-4fc6-be44-09cd1c9aeadb/manager/0.log" Apr 16 16:36:49.796970 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:49.796939 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-qdg8s_90876a88-2791-4518-a822-b1b69a071e6f/kube-state-metrics/0.log" Apr 16 16:36:49.816996 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:49.816974 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-qdg8s_90876a88-2791-4518-a822-b1b69a071e6f/kube-rbac-proxy-main/0.log" Apr 16 16:36:49.839548 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:49.839527 2584 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-qdg8s_90876a88-2791-4518-a822-b1b69a071e6f/kube-rbac-proxy-self/0.log" Apr 16 16:36:49.895499 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:49.895477 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-5876b4bbc7-wdmfj_09ea5b31-549a-496a-9830-4728c4f2ca13/monitoring-plugin/0.log" Apr 16 16:36:50.007067 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:50.007052 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jnqnv_86c14ee5-7e88-4559-a96d-9147a4e36c13/node-exporter/0.log" Apr 16 16:36:50.031235 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:50.031213 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jnqnv_86c14ee5-7e88-4559-a96d-9147a4e36c13/kube-rbac-proxy/0.log" Apr 16 16:36:50.062180 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:50.062166 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-jnqnv_86c14ee5-7e88-4559-a96d-9147a4e36c13/init-textfile/0.log" Apr 16 16:36:50.267635 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:50.267616 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_231bc417-6470-4556-9dbe-67ed2c3f2063/prometheus/0.log" Apr 16 16:36:50.290527 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:50.290500 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_231bc417-6470-4556-9dbe-67ed2c3f2063/config-reloader/0.log" Apr 16 16:36:50.316098 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:50.316075 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_231bc417-6470-4556-9dbe-67ed2c3f2063/thanos-sidecar/0.log" Apr 16 16:36:50.346461 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:50.346436 2584 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_231bc417-6470-4556-9dbe-67ed2c3f2063/kube-rbac-proxy-web/0.log" Apr 16 16:36:50.373546 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:50.373529 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_231bc417-6470-4556-9dbe-67ed2c3f2063/kube-rbac-proxy/0.log" Apr 16 16:36:50.399262 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:50.399244 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_231bc417-6470-4556-9dbe-67ed2c3f2063/kube-rbac-proxy-thanos/0.log" Apr 16 16:36:50.425497 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:50.425477 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_231bc417-6470-4556-9dbe-67ed2c3f2063/init-config-reloader/0.log" Apr 16 16:36:50.676341 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:50.676315 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-679899d9dd-hkjsf_63f686c4-c679-48b9-bd65-7a861737f564/thanos-query/0.log" Apr 16 16:36:50.703913 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:50.703892 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-679899d9dd-hkjsf_63f686c4-c679-48b9-bd65-7a861737f564/kube-rbac-proxy-web/0.log" Apr 16 16:36:50.725002 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:50.724984 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-679899d9dd-hkjsf_63f686c4-c679-48b9-bd65-7a861737f564/kube-rbac-proxy/0.log" Apr 16 16:36:50.744946 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:50.744925 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-679899d9dd-hkjsf_63f686c4-c679-48b9-bd65-7a861737f564/prom-label-proxy/0.log" Apr 16 16:36:50.767704 ip-10-0-131-24 kubenswrapper[2584]: I0416 
16:36:50.767683 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-679899d9dd-hkjsf_63f686c4-c679-48b9-bd65-7a861737f564/kube-rbac-proxy-rules/0.log" Apr 16 16:36:50.786931 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:50.786914 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-679899d9dd-hkjsf_63f686c4-c679-48b9-bd65-7a861737f564/kube-rbac-proxy-metrics/0.log" Apr 16 16:36:52.885984 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.885942 2584 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk"] Apr 16 16:36:52.886339 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.886250 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5092080e-c8c1-42b1-84ac-478f8684af0a" containerName="storage-initializer" Apr 16 16:36:52.886339 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.886261 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="5092080e-c8c1-42b1-84ac-478f8684af0a" containerName="storage-initializer" Apr 16 16:36:52.886339 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.886271 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5092080e-c8c1-42b1-84ac-478f8684af0a" containerName="tokenizer" Apr 16 16:36:52.886339 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.886277 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="5092080e-c8c1-42b1-84ac-478f8684af0a" containerName="tokenizer" Apr 16 16:36:52.886339 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.886294 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5092080e-c8c1-42b1-84ac-478f8684af0a" containerName="main" Apr 16 16:36:52.886339 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.886299 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="5092080e-c8c1-42b1-84ac-478f8684af0a" containerName="main" Apr 16 
16:36:52.886339 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.886307 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="924a69ca-4621-44a4-9521-2c3060cc36f9" containerName="storage-initializer" Apr 16 16:36:52.886339 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.886311 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="924a69ca-4621-44a4-9521-2c3060cc36f9" containerName="storage-initializer" Apr 16 16:36:52.886339 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.886317 2584 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="924a69ca-4621-44a4-9521-2c3060cc36f9" containerName="main" Apr 16 16:36:52.886339 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.886322 2584 state_mem.go:107] "Deleted CPUSet assignment" podUID="924a69ca-4621-44a4-9521-2c3060cc36f9" containerName="main" Apr 16 16:36:52.886627 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.886367 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="5092080e-c8c1-42b1-84ac-478f8684af0a" containerName="main" Apr 16 16:36:52.886627 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.886377 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="924a69ca-4621-44a4-9521-2c3060cc36f9" containerName="main" Apr 16 16:36:52.886627 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.886386 2584 memory_manager.go:356] "RemoveStaleState removing state" podUID="5092080e-c8c1-42b1-84ac-478f8684af0a" containerName="tokenizer" Apr 16 16:36:52.890738 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.890686 2584 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk" Apr 16 16:36:52.893720 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.893616 2584 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xr7j8\"/\"default-dockercfg-dtjgr\"" Apr 16 16:36:52.893872 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.893793 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xr7j8\"/\"openshift-service-ca.crt\"" Apr 16 16:36:52.894840 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.894566 2584 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xr7j8\"/\"kube-root-ca.crt\"" Apr 16 16:36:52.899918 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.899890 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk"] Apr 16 16:36:52.902557 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.902533 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-gk2jm_22ab41d5-25ed-43e4-bf00-489eafd172e1/download-server/0.log" Apr 16 16:36:52.972344 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.972320 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4df588ac-98d4-46d4-91d9-431705634d14-lib-modules\") pod \"perf-node-gather-daemonset-484xk\" (UID: \"4df588ac-98d4-46d4-91d9-431705634d14\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk" Apr 16 16:36:52.972456 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.972362 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4df588ac-98d4-46d4-91d9-431705634d14-sys\") pod \"perf-node-gather-daemonset-484xk\" (UID: 
\"4df588ac-98d4-46d4-91d9-431705634d14\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk" Apr 16 16:36:52.972456 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.972397 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4df588ac-98d4-46d4-91d9-431705634d14-proc\") pod \"perf-node-gather-daemonset-484xk\" (UID: \"4df588ac-98d4-46d4-91d9-431705634d14\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk" Apr 16 16:36:52.972456 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.972435 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4df588ac-98d4-46d4-91d9-431705634d14-podres\") pod \"perf-node-gather-daemonset-484xk\" (UID: \"4df588ac-98d4-46d4-91d9-431705634d14\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk" Apr 16 16:36:52.972586 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:52.972503 2584 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnccg\" (UniqueName: \"kubernetes.io/projected/4df588ac-98d4-46d4-91d9-431705634d14-kube-api-access-gnccg\") pod \"perf-node-gather-daemonset-484xk\" (UID: \"4df588ac-98d4-46d4-91d9-431705634d14\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk" Apr 16 16:36:53.073269 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:53.073245 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnccg\" (UniqueName: \"kubernetes.io/projected/4df588ac-98d4-46d4-91d9-431705634d14-kube-api-access-gnccg\") pod \"perf-node-gather-daemonset-484xk\" (UID: \"4df588ac-98d4-46d4-91d9-431705634d14\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk" Apr 16 16:36:53.073376 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:53.073275 2584 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4df588ac-98d4-46d4-91d9-431705634d14-lib-modules\") pod \"perf-node-gather-daemonset-484xk\" (UID: \"4df588ac-98d4-46d4-91d9-431705634d14\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk" Apr 16 16:36:53.073376 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:53.073310 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4df588ac-98d4-46d4-91d9-431705634d14-sys\") pod \"perf-node-gather-daemonset-484xk\" (UID: \"4df588ac-98d4-46d4-91d9-431705634d14\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk" Apr 16 16:36:53.073376 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:53.073335 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4df588ac-98d4-46d4-91d9-431705634d14-proc\") pod \"perf-node-gather-daemonset-484xk\" (UID: \"4df588ac-98d4-46d4-91d9-431705634d14\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk" Apr 16 16:36:53.073376 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:53.073368 2584 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4df588ac-98d4-46d4-91d9-431705634d14-podres\") pod \"perf-node-gather-daemonset-484xk\" (UID: \"4df588ac-98d4-46d4-91d9-431705634d14\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk" Apr 16 16:36:53.073535 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:53.073386 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4df588ac-98d4-46d4-91d9-431705634d14-sys\") pod \"perf-node-gather-daemonset-484xk\" (UID: \"4df588ac-98d4-46d4-91d9-431705634d14\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk" Apr 16 
16:36:53.073535 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:53.073439 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4df588ac-98d4-46d4-91d9-431705634d14-lib-modules\") pod \"perf-node-gather-daemonset-484xk\" (UID: \"4df588ac-98d4-46d4-91d9-431705634d14\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk"
Apr 16 16:36:53.073535 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:53.073445 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/4df588ac-98d4-46d4-91d9-431705634d14-proc\") pod \"perf-node-gather-daemonset-484xk\" (UID: \"4df588ac-98d4-46d4-91d9-431705634d14\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk"
Apr 16 16:36:53.073535 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:53.073471 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/4df588ac-98d4-46d4-91d9-431705634d14-podres\") pod \"perf-node-gather-daemonset-484xk\" (UID: \"4df588ac-98d4-46d4-91d9-431705634d14\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk"
Apr 16 16:36:53.080569 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:53.080549 2584 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnccg\" (UniqueName: \"kubernetes.io/projected/4df588ac-98d4-46d4-91d9-431705634d14-kube-api-access-gnccg\") pod \"perf-node-gather-daemonset-484xk\" (UID: \"4df588ac-98d4-46d4-91d9-431705634d14\") " pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk"
Apr 16 16:36:53.204210 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:53.204188 2584 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk"
Apr 16 16:36:53.323259 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:53.323232 2584 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk"]
Apr 16 16:36:53.325274 ip-10-0-131-24 kubenswrapper[2584]: W0416 16:36:53.325246 2584 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod4df588ac_98d4_46d4_91d9_431705634d14.slice/crio-02e5fe5fcffe7f569b41d457d80f229f030d2218d91fda1614b67c100ca93305 WatchSource:0}: Error finding container 02e5fe5fcffe7f569b41d457d80f229f030d2218d91fda1614b67c100ca93305: Status 404 returned error can't find the container with id 02e5fe5fcffe7f569b41d457d80f229f030d2218d91fda1614b67c100ca93305
Apr 16 16:36:53.505617 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:53.505550 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk" event={"ID":"4df588ac-98d4-46d4-91d9-431705634d14","Type":"ContainerStarted","Data":"14cd8ebb2f8c0487997e34c46cdfbfdd01ac082946b4c48ef6fba2091e5b3345"}
Apr 16 16:36:53.505617 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:53.505584 2584 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk" event={"ID":"4df588ac-98d4-46d4-91d9-431705634d14","Type":"ContainerStarted","Data":"02e5fe5fcffe7f569b41d457d80f229f030d2218d91fda1614b67c100ca93305"}
Apr 16 16:36:53.505766 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:53.505702 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk"
Apr 16 16:36:53.521502 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:53.521458 2584 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk" podStartSLOduration=1.52144464 podStartE2EDuration="1.52144464s" podCreationTimestamp="2026-04-16 16:36:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 16:36:53.519989215 +0000 UTC m=+2141.480180938" watchObservedRunningTime="2026-04-16 16:36:53.52144464 +0000 UTC m=+2141.481636362"
Apr 16 16:36:54.116302 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:54.116268 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hsj24_c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787/dns/0.log"
Apr 16 16:36:54.135878 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:54.135859 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-hsj24_c7c2949f-5c5f-49f8-8fa9-bb2e36cd9787/kube-rbac-proxy/0.log"
Apr 16 16:36:54.217236 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:54.217219 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-lrjg9_80b825a5-15e5-402b-ae57-d2e283b0e8f8/dns-node-resolver/0.log"
Apr 16 16:36:54.665995 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:54.665971 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-9f8c8b94d-kt6mf_126c769e-8fc5-445a-a80a-2576bf17ce18/registry/0.log"
Apr 16 16:36:54.707597 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:54.707575 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-nffwb_07434d52-3b5d-4eaf-ba37-9f1b957e938a/node-ca/0.log"
Apr 16 16:36:55.568114 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:55.568094 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-wv6xc_632fe0a6-d8a9-4cf2-b1bd-e086ee0e738f/istio-proxy/0.log"
Apr 16 16:36:56.075031 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:56.075004 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-ndvgr_60c25e0c-4dbd-4686-8ab2-4bd45e9f960b/serve-healthcheck-canary/0.log"
Apr 16 16:36:56.682216 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:56.682197 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-x7lk6_efe37373-e8a6-4016-bbaf-58d22d4d4fcf/kube-rbac-proxy/0.log"
Apr 16 16:36:56.703369 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:56.703344 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-x7lk6_efe37373-e8a6-4016-bbaf-58d22d4d4fcf/exporter/0.log"
Apr 16 16:36:56.735175 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:56.735123 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-x7lk6_efe37373-e8a6-4016-bbaf-58d22d4d4fcf/extractor/0.log"
Apr 16 16:36:59.166822 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:59.166752 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5879d548d6-72fnv_320a3360-ff52-4838-9486-9d08a32bda77/manager/0.log"
Apr 16 16:36:59.519565 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:59.519503 2584 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-xr7j8/perf-node-gather-daemonset-484xk"
Apr 16 16:36:59.774240 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:36:59.774164 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_llmisvc-controller-manager-65b6c49669-qdpbh_bd0b8538-fd89-44e9-b50f-3f8e1b84e2b5/manager/0.log"
Apr 16 16:37:00.033575 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:37:00.033502 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-tx69r_990a703c-862e-42b0-a6c3-16cf98da639f/s3-init/0.log"
Apr 16 16:37:06.528395 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:37:06.528368 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zhztn_97141e56-55f0-4d10-ba67-fabe3d76d95d/kube-multus-additional-cni-plugins/0.log"
Apr 16 16:37:06.570312 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:37:06.570290 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zhztn_97141e56-55f0-4d10-ba67-fabe3d76d95d/egress-router-binary-copy/0.log"
Apr 16 16:37:06.609009 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:37:06.608987 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zhztn_97141e56-55f0-4d10-ba67-fabe3d76d95d/cni-plugins/0.log"
Apr 16 16:37:06.643431 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:37:06.643411 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zhztn_97141e56-55f0-4d10-ba67-fabe3d76d95d/bond-cni-plugin/0.log"
Apr 16 16:37:06.683261 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:37:06.683241 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zhztn_97141e56-55f0-4d10-ba67-fabe3d76d95d/routeoverride-cni/0.log"
Apr 16 16:37:06.728517 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:37:06.728494 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zhztn_97141e56-55f0-4d10-ba67-fabe3d76d95d/whereabouts-cni-bincopy/0.log"
Apr 16 16:37:06.757191 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:37:06.757171 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-zhztn_97141e56-55f0-4d10-ba67-fabe3d76d95d/whereabouts-cni/0.log"
Apr 16 16:37:06.799476 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:37:06.799422 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-trctq_8e6b59ea-f783-49ae-902d-b33f1ca6c234/kube-multus/0.log"
Apr 16 16:37:07.009446 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:37:07.009425 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-snrt9_d280ea9a-de22-4d14-8870-0fbcbb459f8f/network-metrics-daemon/0.log"
Apr 16 16:37:07.054139 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:37:07.054082 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-snrt9_d280ea9a-de22-4d14-8870-0fbcbb459f8f/kube-rbac-proxy/0.log"
Apr 16 16:37:07.826099 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:37:07.826075 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/ovn-controller/0.log"
Apr 16 16:37:07.863592 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:37:07.863569 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/ovn-acl-logging/0.log"
Apr 16 16:37:07.871593 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:37:07.871573 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/ovn-acl-logging/1.log"
Apr 16 16:37:07.902144 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:37:07.902121 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/kube-rbac-proxy-node/0.log"
Apr 16 16:37:07.938704 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:37:07.938687 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 16:37:07.979699 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:37:07.979681 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/northd/0.log"
Apr 16 16:37:08.007610 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:37:08.007591 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/nbdb/0.log"
Apr 16 16:37:08.031034 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:37:08.031016 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/sbdb/0.log"
Apr 16 16:37:08.146208 ip-10-0-131-24 kubenswrapper[2584]: I0416 16:37:08.146143 2584 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-65x5s_82145526-4c6f-43c3-8850-84adf5e445e9/ovnkube-controller/0.log"