Apr 23 16:34:54.196046 ip-10-0-137-14 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 16:34:54.602759 ip-10-0-137-14 kubenswrapper[2573]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:34:54.602759 ip-10-0-137-14 kubenswrapper[2573]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 16:34:54.602759 ip-10-0-137-14 kubenswrapper[2573]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 16:34:54.602759 ip-10-0-137-14 kubenswrapper[2573]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 16:34:54.602759 ip-10-0-137-14 kubenswrapper[2573]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
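The deprecation warnings above ask for these settings to move into the file passed via the kubelet's --config flag. A minimal KubeletConfiguration sketch using the upstream field names; the runtime endpoint matches this node's flag value, but the paths and resource amounts below are illustrative assumptions, not taken from this node:

```yaml
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: /var/run/crio/crio.sock
# replaces --volume-plugin-dir (path illustrative)
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# replaces --system-reserved (amounts illustrative)
systemReserved:
  cpu: 500m
  memory: 1Gi
# the --minimum-container-ttl-duration warning points at eviction
# settings instead; threshold illustrative
evictionHard:
  memory.available: 100Mi
```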
Apr 23 16:34:54.605090 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.604992 2573 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 16:34:54.607337 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607321 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:34:54.607337 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607338 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:34:54.607407 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607342 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:34:54.607407 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607345 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:34:54.607407 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607349 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:34:54.607407 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607354 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:34:54.607407 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607357 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:34:54.607407 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607360 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:34:54.607407 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607364 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:34:54.607407 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607368 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:34:54.607407 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607371 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:34:54.607407 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607374 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:34:54.607407 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607377 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:34:54.607407 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607382 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:34:54.607407 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607386 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:34:54.607407 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607389 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:34:54.607407 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607392 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:34:54.607407 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607395 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:34:54.607407 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607398 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:34:54.607407 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607400 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:34:54.607407 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607404 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:34:54.608001 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607407 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:34:54.608001 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607410 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:34:54.608001 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607413 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:34:54.608001 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607416 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:34:54.608001 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607419 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:34:54.608001 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607422 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:34:54.608001 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607425 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:34:54.608001 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607428 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:34:54.608001 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607430 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:34:54.608001 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607433 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:34:54.608001 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607436 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:34:54.608001 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607438 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:34:54.608001 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607441 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:34:54.608001 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607444 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:34:54.608001 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607446 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:34:54.608001 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607449 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:34:54.608001 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607451 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:34:54.608001 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607454 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:34:54.608001 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607456 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:34:54.608001 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607459 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:34:54.608531 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607462 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:34:54.608531 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607465 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:34:54.608531 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607467 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:34:54.608531 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607470 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:34:54.608531 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607472 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:34:54.608531 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607475 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:34:54.608531 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607477 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:34:54.608531 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607480 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:34:54.608531 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607482 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:34:54.608531 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607486 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:34:54.608531 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607488 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:34:54.608531 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607491 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:34:54.608531 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607494 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:34:54.608531 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607497 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:34:54.608531 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607500 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:34:54.608531 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607511 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:34:54.608531 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607515 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:34:54.608531 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607519 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:34:54.608531 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607521 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:34:54.609027 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607526 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:34:54.609027 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607529 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:34:54.609027 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607532 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:34:54.609027 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607534 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:34:54.609027 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607537 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:34:54.609027 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607540 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:34:54.609027 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607543 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:34:54.609027 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607545 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:34:54.609027 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607548 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:34:54.609027 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607550 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:34:54.609027 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607553 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:34:54.609027 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607556 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:34:54.609027 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607558 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:34:54.609027 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607561 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:34:54.609027 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607564 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:34:54.609027 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607567 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:34:54.609027 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607569 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:34:54.609027 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607572 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:34:54.609027 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607575 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:34:54.609027 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607577 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:34:54.609544 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607580 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:34:54.609544 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607582 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:34:54.609544 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607585 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:34:54.609544 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607587 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:34:54.609544 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607590 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:34:54.609544 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607592 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:34:54.609544 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608027 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:34:54.609544 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608033 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:34:54.609544 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608036 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:34:54.609544 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608039 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:34:54.609544 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608041 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:34:54.609544 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608045 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:34:54.609544 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608047 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:34:54.609544 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608050 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:34:54.609544 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608053 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:34:54.609544 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608056 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:34:54.609544 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608058 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:34:54.609544 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608061 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:34:54.609544 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608064 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:34:54.610315 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608066 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:34:54.610315 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608069 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:34:54.610315 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608071 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:34:54.610315 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608074 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:34:54.610315 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608077 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:34:54.610315 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608079 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:34:54.610315 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608082 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:34:54.610315 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608085 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:34:54.610315 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608088 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:34:54.610315 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608091 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:34:54.610315 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608093 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:34:54.610315 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608096 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:34:54.610315 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608099 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:34:54.610315 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608101 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:34:54.610315 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608104 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:34:54.610315 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608106 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:34:54.610315 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608109 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:34:54.610315 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608112 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:34:54.610315 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608114 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:34:54.610315 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608118 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:34:54.610994 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608120 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:34:54.610994 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608123 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:34:54.610994 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608125 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:34:54.610994 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608128 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:34:54.610994 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608130 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:34:54.610994 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608133 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:34:54.610994 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608135 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:34:54.610994 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608138 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:34:54.610994 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608140 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:34:54.610994 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608143 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:34:54.610994 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608145 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:34:54.610994 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608148 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:34:54.610994 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608150 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:34:54.610994 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608153 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:34:54.610994 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608155 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:34:54.610994 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608158 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:34:54.610994 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608160 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:34:54.610994 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608163 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:34:54.610994 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608167 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:34:54.610994 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608170 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:34:54.611504 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608173 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:34:54.611504 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608176 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:34:54.611504 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608179 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:34:54.611504 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608181 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:34:54.611504 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608185 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
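The two "unrecognized feature gate" dumps above repeat the same gate names with different klog timestamps. When triaging such a log, it helps to reduce the dump to the distinct names; a minimal Python sketch over a few sample lines copied from the log (the log itself does not say why the set is emitted twice):

```python
import re

# Sample journal lines, abridged from the dump above.
lines = [
    'Apr 23 16:34:54.607407 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607342 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk',
    'Apr 23 16:34:54.610994 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608130 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk',
    'Apr 23 16:34:54.608531 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.607511 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.',
]

# Keep only the gate name after the fixed warning text, then dedupe.
gate_re = re.compile(r'unrecognized feature gate: (\w+)')
unique = sorted({m.group(1) for line in lines if (m := gate_re.search(line))})
print(unique)  # prints ['VSphereMultiDisk']
```

The same expression applied to the full journal output collapses the two dumps into one list of distinct gates.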
Apr 23 16:34:54.611504 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608189 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:34:54.611504 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608193 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:34:54.611504 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608195 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:34:54.611504 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608198 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:34:54.611504 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608201 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:34:54.611504 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608204 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:34:54.611504 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608207 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:34:54.611504 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608210 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:34:54.611504 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608213 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:34:54.611504 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608215 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:34:54.611504 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608218 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:34:54.611504 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608221 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:34:54.611504 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608224 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:34:54.611504 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608227 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:34:54.611998 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608230 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:34:54.611998 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608233 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:34:54.611998 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608235 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:34:54.611998 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608238 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:34:54.611998 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608241 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:34:54.611998 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608243 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:34:54.611998 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608247 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:34:54.611998 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608249 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:34:54.611998 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608252 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:34:54.611998 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608254 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:34:54.611998 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608257 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:34:54.611998 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608259 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:34:54.611998 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608262 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:34:54.611998 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.608264 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:34:54.611998 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608828 2573 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 16:34:54.611998 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608838 2573 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 16:34:54.611998 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608845 2573 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 16:34:54.611998 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608850 2573 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 16:34:54.611998 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608854 2573 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 16:34:54.611998 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608858 2573 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 16:34:54.611998 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608862 2573 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608867 2573 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608871 2573 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608874 2573 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608879 2573 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608883 2573 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608886 2573 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608889 2573 flags.go:64] FLAG: --cgroup-root=""
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608893 2573 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608896 2573 flags.go:64] FLAG: --client-ca-file=""
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608898 2573 flags.go:64] FLAG: --cloud-config=""
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608911 2573 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608916 2573 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608920 2573 flags.go:64] FLAG: --cluster-domain=""
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608923 2573 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608926 2573 flags.go:64] FLAG: --config-dir=""
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608929 2573 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608933 2573 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608937 2573 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608940 2573 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608943 2573 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608947 2573 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608950 2573 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608953 2573 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 16:34:54.612506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608956 2573 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608959 2573 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608967 2573 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608972 2573 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608976 2573 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608979 2573 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608984 2573 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608987 2573 flags.go:64] FLAG: --enable-server="true"
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608990 2573 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608995 2573 flags.go:64] FLAG: --event-burst="100"
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.608998 2573 flags.go:64] FLAG: --event-qps="50"
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609001 2573 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609005 2573 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609008 2573 flags.go:64] FLAG: --eviction-hard=""
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609012 2573 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609015 2573 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609018 2573 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609021 2573 flags.go:64] FLAG: --eviction-soft=""
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609024 2573 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609027 2573 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609030 2573 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609033 2573 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609037 2573 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609040 2573 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609043 2573 flags.go:64]
FLAG: --feature-gates="" Apr 23 16:34:54.613111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609047 2573 flags.go:64] FLAG: --file-check-frequency="20s" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609050 2573 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609054 2573 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609058 2573 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609061 2573 flags.go:64] FLAG: --healthz-port="10248" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609064 2573 flags.go:64] FLAG: --help="false" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609067 2573 flags.go:64] FLAG: --hostname-override="ip-10-0-137-14.ec2.internal" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609071 2573 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609074 2573 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609078 2573 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609081 2573 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609085 2573 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609088 2573 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 
16:34:54.609092 2573 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609094 2573 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609097 2573 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609101 2573 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609104 2573 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609108 2573 flags.go:64] FLAG: --kube-reserved="" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609111 2573 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609114 2573 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609117 2573 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609121 2573 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609124 2573 flags.go:64] FLAG: --lock-file="" Apr 23 16:34:54.613745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609127 2573 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609130 2573 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609133 2573 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609138 2573 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 16:34:54.614330 
ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609141 2573 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609144 2573 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609148 2573 flags.go:64] FLAG: --logging-format="text" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609150 2573 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609154 2573 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609157 2573 flags.go:64] FLAG: --manifest-url="" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609160 2573 flags.go:64] FLAG: --manifest-url-header="" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609165 2573 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609168 2573 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609172 2573 flags.go:64] FLAG: --max-pods="110" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609175 2573 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609178 2573 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609181 2573 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609186 2573 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609189 2573 flags.go:64] 
FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609192 2573 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609195 2573 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609204 2573 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609207 2573 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609211 2573 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 16:34:54.614330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609214 2573 flags.go:64] FLAG: --pod-cidr="" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609217 2573 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609223 2573 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609226 2573 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609230 2573 flags.go:64] FLAG: --pods-per-core="0" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609233 2573 flags.go:64] FLAG: --port="10250" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609236 2573 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609239 2573 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0e1f14d8697961ee0" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 
16:34:54.609242 2573 flags.go:64] FLAG: --qos-reserved="" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609245 2573 flags.go:64] FLAG: --read-only-port="10255" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609249 2573 flags.go:64] FLAG: --register-node="true" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609252 2573 flags.go:64] FLAG: --register-schedulable="true" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609255 2573 flags.go:64] FLAG: --register-with-taints="" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609258 2573 flags.go:64] FLAG: --registry-burst="10" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609261 2573 flags.go:64] FLAG: --registry-qps="5" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609265 2573 flags.go:64] FLAG: --reserved-cpus="" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609267 2573 flags.go:64] FLAG: --reserved-memory="" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609271 2573 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609275 2573 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609278 2573 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609281 2573 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609284 2573 flags.go:64] FLAG: --runonce="false" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609287 2573 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609290 2573 
flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609293 2573 flags.go:64] FLAG: --seccomp-default="false" Apr 23 16:34:54.614937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609296 2573 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609300 2573 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609304 2573 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609307 2573 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609311 2573 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609314 2573 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609317 2573 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609320 2573 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609323 2573 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609326 2573 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609329 2573 flags.go:64] FLAG: --system-cgroups="" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609335 2573 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609340 2573 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 
16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609343 2573 flags.go:64] FLAG: --tls-cert-file="" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609346 2573 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609349 2573 flags.go:64] FLAG: --tls-min-version="" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609352 2573 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609355 2573 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609358 2573 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609361 2573 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609364 2573 flags.go:64] FLAG: --v="2" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609369 2573 flags.go:64] FLAG: --version="false" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609373 2573 flags.go:64] FLAG: --vmodule="" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609378 2573 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.609381 2573 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 16:34:54.615529 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611486 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 16:34:54.616161 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611502 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 16:34:54.616161 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611505 2573 feature_gate.go:328] 
unrecognized feature gate: VSphereMultiDisk Apr 23 16:34:54.616161 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611509 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 16:34:54.616161 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611512 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 16:34:54.616161 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611516 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 16:34:54.616161 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611520 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 16:34:54.616161 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611523 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 16:34:54.616161 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611525 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 16:34:54.616161 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611529 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 16:34:54.616161 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611531 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 16:34:54.616161 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611534 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 16:34:54.616161 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611537 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 16:34:54.616161 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611540 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 16:34:54.616161 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611543 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 16:34:54.616161 ip-10-0-137-14 
kubenswrapper[2573]: W0423 16:34:54.611546 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 16:34:54.616161 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611548 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 16:34:54.616161 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611551 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 16:34:54.616161 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611554 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 16:34:54.616161 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611556 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 16:34:54.616161 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611559 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 16:34:54.616713 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611562 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 16:34:54.616713 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611564 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 16:34:54.616713 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611567 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 16:34:54.616713 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611570 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 16:34:54.616713 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611572 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 16:34:54.616713 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611576 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 16:34:54.616713 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611578 2573 feature_gate.go:328] unrecognized feature gate: 
MachineAPIOperatorDisableMachineHealthCheckController Apr 23 16:34:54.616713 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611581 2573 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 16:34:54.616713 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611584 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 16:34:54.616713 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611587 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 16:34:54.616713 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611590 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 16:34:54.616713 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611595 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 16:34:54.616713 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611599 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 16:34:54.616713 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611602 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 16:34:54.616713 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611606 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 16:34:54.616713 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611608 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 16:34:54.616713 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611611 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 16:34:54.616713 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611614 2573 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 16:34:54.616713 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611616 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 16:34:54.617189 ip-10-0-137-14 
kubenswrapper[2573]: W0423 16:34:54.611619 2573 feature_gate.go:328] unrecognized feature gate: Example Apr 23 16:34:54.617189 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611622 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 16:34:54.617189 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611624 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 16:34:54.617189 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611627 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 16:34:54.617189 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611629 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 16:34:54.617189 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611632 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 16:34:54.617189 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611635 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 16:34:54.617189 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611639 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 16:34:54.617189 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611643 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 16:34:54.617189 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611646 2573 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 16:34:54.617189 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611649 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 16:34:54.617189 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611652 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 16:34:54.617189 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611655 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 16:34:54.617189 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611658 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 16:34:54.617189 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611660 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 16:34:54.617189 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611663 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 16:34:54.617189 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611666 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 16:34:54.617189 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611668 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 16:34:54.617189 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611671 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 16:34:54.617189 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611674 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 16:34:54.617674 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611677 2573 feature_gate.go:328] unrecognized feature gate: 
IrreconcilableMachineConfig Apr 23 16:34:54.617674 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611680 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 16:34:54.617674 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611682 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 16:34:54.617674 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611685 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 16:34:54.617674 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611688 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 16:34:54.617674 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611705 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 16:34:54.617674 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611708 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 16:34:54.617674 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611711 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 16:34:54.617674 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611714 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 16:34:54.617674 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611716 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 16:34:54.617674 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611719 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 16:34:54.617674 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611722 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 16:34:54.617674 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611725 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 16:34:54.617674 ip-10-0-137-14 
kubenswrapper[2573]: W0423 16:34:54.611727 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:34:54.617674 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611730 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:34:54.617674 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611733 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:34:54.617674 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611735 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:34:54.617674 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611738 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:34:54.617674 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611742 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:34:54.618167 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611744 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:34:54.618167 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611747 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:34:54.618167 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611749 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:34:54.618167 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611752 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:34:54.618167 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611755 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:34:54.618167 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611758 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:34:54.618167 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.611761 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:34:54.618167 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.612571 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 16:34:54.619798 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.619651 2573 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 23 16:34:54.619842 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.619800 2573 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 23 16:34:54.619872 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619853 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:34:54.619872 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619858 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:34:54.619872 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619862 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:34:54.619872 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619865 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:34:54.619872 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619868 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:34:54.619872 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619871 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:34:54.619872 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619875 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:34:54.620052 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619879 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:34:54.620052 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619882 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:34:54.620052 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619886 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:34:54.620052 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619891 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:34:54.620052 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619894 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:34:54.620052 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619897 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:34:54.620052 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619900 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:34:54.620052 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619903 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:34:54.620052 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619906 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:34:54.620052 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619908 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:34:54.620052 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619911 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:34:54.620052 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619914 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:34:54.620052 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619916 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:34:54.620052 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619919 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:34:54.620052 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619922 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:34:54.620052 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619925 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:34:54.620052 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619928 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:34:54.620052 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619930 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:34:54.620052 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619933 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:34:54.620518 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619936 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:34:54.620518 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619939 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:34:54.620518 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619941 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:34:54.620518 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619944 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:34:54.620518 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619947 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:34:54.620518 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619950 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:34:54.620518 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619953 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:34:54.620518 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619956 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:34:54.620518 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619958 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:34:54.620518 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619961 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:34:54.620518 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619963 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:34:54.620518 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619966 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:34:54.620518 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619968 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:34:54.620518 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619971 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:34:54.620518 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619975 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:34:54.620518 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619978 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:34:54.620518 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619980 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:34:54.620518 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619983 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:34:54.620518 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619985 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:34:54.621012 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619988 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:34:54.621012 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619991 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:34:54.621012 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619994 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:34:54.621012 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619996 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:34:54.621012 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.619999 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:34:54.621012 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620001 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:34:54.621012 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620004 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:34:54.621012 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620007 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:34:54.621012 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620009 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:34:54.621012 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620012 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:34:54.621012 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620015 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:34:54.621012 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620018 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:34:54.621012 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620020 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:34:54.621012 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620023 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:34:54.621012 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620026 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:34:54.621012 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620028 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:34:54.621012 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620031 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:34:54.621012 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620033 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:34:54.621012 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620036 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:34:54.621012 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620039 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:34:54.621538 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620042 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:34:54.621538 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620045 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:34:54.621538 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620048 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:34:54.621538 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620050 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:34:54.621538 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620053 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:34:54.621538 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620056 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:34:54.621538 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620059 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:34:54.621538 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620062 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:34:54.621538 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620066 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:34:54.621538 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620068 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:34:54.621538 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620071 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:34:54.621538 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620073 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:34:54.621538 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620076 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:34:54.621538 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620079 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:34:54.621538 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620082 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:34:54.621538 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620084 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:34:54.621538 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620087 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:34:54.621538 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620090 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:34:54.621538 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620093 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:34:54.621538 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620095 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:34:54.622042 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620099 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:34:54.622042 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.620105 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 16:34:54.622042 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620209 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 16:34:54.622042 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620215 2573 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 16:34:54.622042 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620218 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 16:34:54.622042 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620222 2573 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 16:34:54.622042 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620225 2573 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 16:34:54.622042 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620228 2573 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 16:34:54.622042 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620231 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 16:34:54.622042 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620234 2573 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 16:34:54.622042 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620237 2573 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 16:34:54.622042 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620240 2573 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 16:34:54.622042 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620243 2573 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 16:34:54.622042 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620246 2573 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 16:34:54.622042 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620249 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 16:34:54.622406 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620251 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 16:34:54.622406 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620254 2573 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 16:34:54.622406 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620257 2573 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 16:34:54.622406 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620259 2573 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 16:34:54.622406 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620262 2573 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 16:34:54.622406 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620265 2573 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 16:34:54.622406 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620267 2573 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 16:34:54.622406 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620270 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 16:34:54.622406 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620273 2573 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 16:34:54.622406 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620275 2573 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 16:34:54.622406 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620278 2573 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 16:34:54.622406 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620281 2573 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 16:34:54.622406 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620283 2573 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 16:34:54.622406 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620286 2573 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 16:34:54.622406 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620289 2573 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 16:34:54.622406 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620292 2573 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 16:34:54.622406 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620294 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 16:34:54.622406 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620297 2573 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 16:34:54.622406 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620300 2573 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 16:34:54.622406 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620304 2573 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 16:34:54.622910 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620307 2573 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 16:34:54.622910 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620310 2573 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 16:34:54.622910 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620313 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 16:34:54.622910 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620315 2573 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 16:34:54.622910 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620318 2573 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 16:34:54.622910 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620320 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 16:34:54.622910 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620324 2573 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 16:34:54.622910 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620326 2573 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 16:34:54.622910 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620329 2573 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 16:34:54.622910 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620331 2573 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 16:34:54.622910 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620335 2573 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 16:34:54.622910 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620338 2573 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 16:34:54.622910 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620340 2573 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 16:34:54.622910 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620343 2573 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 16:34:54.622910 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620346 2573 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 16:34:54.622910 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620348 2573 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 16:34:54.622910 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620351 2573 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 16:34:54.622910 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620354 2573 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 16:34:54.622910 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620357 2573 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 16:34:54.622910 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620359 2573 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 16:34:54.623396 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620362 2573 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 16:34:54.623396 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620364 2573 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 16:34:54.623396 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620367 2573 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 16:34:54.623396 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620369 2573 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 16:34:54.623396 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620372 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 16:34:54.623396 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620374 2573 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 16:34:54.623396 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620377 2573 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 16:34:54.623396 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620380 2573 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 16:34:54.623396 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620382 2573 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 16:34:54.623396 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620385 2573 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 16:34:54.623396 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620387 2573 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 16:34:54.623396 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620391 2573 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 16:34:54.623396 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620395 2573 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 16:34:54.623396 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620398 2573 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 16:34:54.623396 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620401 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 16:34:54.623396 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620404 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 16:34:54.623396 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620407 2573 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 16:34:54.623396 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620409 2573 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 16:34:54.623396 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620412 2573 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 16:34:54.623875 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620414 2573 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 16:34:54.623875 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620417 2573 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 16:34:54.623875 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620420 2573 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 16:34:54.623875 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620422 2573 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 16:34:54.623875 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620425 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 16:34:54.623875 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620428 2573 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 16:34:54.623875 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620430 2573 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 16:34:54.623875 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620433 2573 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 16:34:54.623875 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620436 2573 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 16:34:54.623875 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620439 2573 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 16:34:54.623875 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620442 2573 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 16:34:54.623875 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620444 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 16:34:54.623875 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620447 2573 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 16:34:54.623875 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:54.620450 2573 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 16:34:54.623875 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.620455 2573 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 16:34:54.624256 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.621103 2573 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 16:34:54.624256 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.623884 2573 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 16:34:54.624664 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.624651 2573 server.go:1019] "Starting client certificate rotation"
Apr 23 16:34:54.624768 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.624749 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 16:34:54.624801 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.624796 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 16:34:54.648067 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.648042 2573 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 16:34:54.650653 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.650633 2573 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 16:34:54.667311 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.667287 2573 log.go:25] "Validated CRI v1 runtime API"
Apr 23 16:34:54.673275 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.673258 2573 log.go:25] "Validated CRI v1 image API"
Apr 23 16:34:54.674465 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.674447 2573 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 16:34:54.678154 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.678125 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 16:34:54.678416 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.678396 2573 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 ae0a35ef-9ee9-4463-b771-01d6ef77a058:/dev/nvme0n1p4 e9736d3c-ef31-43c5-a654-54b7615b08f1:/dev/nvme0n1p3]
Apr 23 16:34:54.678468 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.678416 2573 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 16:34:54.684348 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.684234 2573 manager.go:217] Machine: {Timestamp:2026-04-23 16:34:54.682251232 +0000 UTC m=+0.372037570 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3098623 MemoryCapacity:33164484608 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2bf20fe104cdb1ae8ac3f67a97a763 SystemUUID:ec2bf20f-e104-cdb1-ae8a-c3f67a97a763 BootID:a997124b-61fc-4077-b560-e5c6ea27d8fd Filesystems:[{Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582242304 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:39:87:c2:d0:39 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:39:87:c2:d0:39 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:3a:59:01:0d:13:13 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164484608 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 16:34:54.684348 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.684342 2573 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 16:34:54.684447 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.684428 2573 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 23 16:34:54.685496 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.685469 2573 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 23 16:34:54.685678 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.685498 2573 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-137-14.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 23 16:34:54.685744 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.685715 2573 topology_manager.go:138] "Creating topology manager with none policy" Apr 23 16:34:54.685744 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.685725 2573 container_manager_linux.go:306] "Creating device plugin manager" Apr 23 16:34:54.685744 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.685742 2573 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 16:34:54.685827 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.685761 2573 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 23 16:34:54.687354 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.687343 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 23 16:34:54.687460 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.687450 2573 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 23 16:34:54.689490 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.689480 2573 kubelet.go:491] "Attempting to sync node with API server" Apr 23 16:34:54.689525 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.689494 2573 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 23 16:34:54.689525 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.689507 2573 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 23 16:34:54.689525 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.689517 2573 kubelet.go:397] "Adding apiserver pod source" Apr 23 16:34:54.689615 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.689526 2573 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 23 16:34:54.690618 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.690606 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 16:34:54.690674 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.690624 2573 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 16:34:54.693424 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.693409 2573 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 16:34:54.694773 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.694760 2573 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 16:34:54.696408 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.696394 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 16:34:54.696492 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.696422 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 16:34:54.696492 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.696429 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 16:34:54.696492 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.696435 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 16:34:54.696492 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.696441 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 16:34:54.696492 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.696449 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 16:34:54.696492 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.696456 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 
16:34:54.696492 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.696461 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 16:34:54.696492 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.696468 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 16:34:54.696492 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.696474 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 16:34:54.696492 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.696491 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 16:34:54.696920 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.696501 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 16:34:54.698349 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.698337 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 16:34:54.698385 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.698351 2573 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 16:34:54.701276 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.701255 2573 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-137-14.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 16:34:54.702031 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:54.702009 2573 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-137-14.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 16:34:54.702104 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:54.702009 2573 reflector.go:200] "Failed to watch" err="failed to list 
*v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 16:34:54.702448 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.702436 2573 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 16:34:54.702479 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.702472 2573 server.go:1295] "Started kubelet" Apr 23 16:34:54.702571 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.702548 2573 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 16:34:54.702617 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.702563 2573 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 16:34:54.702660 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.702618 2573 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 16:34:54.703443 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.703421 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rqf7m" Apr 23 16:34:54.703495 ip-10-0-137-14 systemd[1]: Started Kubernetes Kubelet. 
Apr 23 16:34:54.703687 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.703663 2573 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 16:34:54.709213 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.708484 2573 server.go:317] "Adding debug handlers to kubelet server" Apr 23 16:34:54.711686 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:54.710658 2573 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-137-14.ec2.internal.18a9099c4212cfdf default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-137-14.ec2.internal,UID:ip-10-0-137-14.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-137-14.ec2.internal,},FirstTimestamp:2026-04-23 16:34:54.702448607 +0000 UTC m=+0.392234944,LastTimestamp:2026-04-23 16:34:54.702448607 +0000 UTC m=+0.392234944,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-137-14.ec2.internal,}" Apr 23 16:34:54.712463 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.712434 2573 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 16:34:54.712891 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.712865 2573 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 16:34:54.712998 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:54.712935 2573 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 23 16:34:54.713568 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:54.713548 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-14.ec2.internal\" not found" Apr 23 16:34:54.713568 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.713553 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-rqf7m" Apr 23 16:34:54.713747 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.713644 2573 factory.go:55] Registering systemd factory Apr 23 16:34:54.713747 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.713715 2573 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 16:34:54.713747 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.713727 2573 factory.go:223] Registration of the systemd container factory successfully Apr 23 16:34:54.713870 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.713730 2573 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 16:34:54.713870 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.713836 2573 reconstruct.go:97] "Volume reconstruction finished" Apr 23 16:34:54.713870 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.713846 2573 reconciler.go:26] "Reconciler: start to sync state" Apr 23 16:34:54.713972 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.713924 2573 factory.go:153] Registering CRI-O factory Apr 23 16:34:54.713972 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.713934 2573 factory.go:223] Registration of the crio container factory successfully Apr 23 16:34:54.714037 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.713987 2573 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such 
file or directory Apr 23 16:34:54.714037 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.714012 2573 factory.go:103] Registering Raw factory Apr 23 16:34:54.714037 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.714029 2573 manager.go:1196] Started watching for new ooms in manager Apr 23 16:34:54.714135 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.714120 2573 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 16:34:54.714486 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.714474 2573 manager.go:319] Starting recovery of all containers Apr 23 16:34:54.720945 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.720891 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 23 16:34:54.723991 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.723968 2573 manager.go:324] Recovery completed Apr 23 16:34:54.725222 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.725204 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:34:54.728289 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:54.728270 2573 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-137-14.ec2.internal\" not found" node="ip-10-0-137-14.ec2.internal" Apr 23 16:34:54.729369 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.728974 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:34:54.731912 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.731896 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-14.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:34:54.731974 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.731924 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-14.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:34:54.731974 ip-10-0-137-14 
kubenswrapper[2573]: I0423 16:34:54.731934 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-14.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:34:54.732479 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.732465 2573 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 23 16:34:54.732479 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.732478 2573 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 16:34:54.732585 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.732495 2573 state_mem.go:36] "Initialized new in-memory state store" Apr 23 16:34:54.734552 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.734541 2573 policy_none.go:49] "None policy: Start" Apr 23 16:34:54.734590 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.734556 2573 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 16:34:54.734590 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.734566 2573 state_mem.go:35] "Initializing new in-memory state store" Apr 23 16:34:54.776393 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.776377 2573 manager.go:341] "Starting Device Plugin manager" Apr 23 16:34:54.788953 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:54.776475 2573 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 16:34:54.788953 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.776489 2573 server.go:85] "Starting device plugin registration server" Apr 23 16:34:54.788953 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.777623 2573 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 16:34:54.788953 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.777659 2573 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 23 16:34:54.788953 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.778128 2573 plugin_watcher.go:51] "Plugin Watcher Start" 
path="/var/lib/kubelet/plugins_registry" Apr 23 16:34:54.788953 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.778274 2573 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 16:34:54.788953 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.778296 2573 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 23 16:34:54.788953 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:54.779216 2573 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 23 16:34:54.788953 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:54.779290 2573 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-137-14.ec2.internal\" not found" Apr 23 16:34:54.875117 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.875040 2573 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 23 16:34:54.875117 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.875076 2573 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 16:34:54.875117 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.875102 2573 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 23 16:34:54.875117 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.875109 2573 kubelet.go:2451] "Starting kubelet main sync loop" Apr 23 16:34:54.875360 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:54.875150 2573 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 23 16:34:54.877869 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.877846 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:34:54.878920 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.878899 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-14.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:34:54.879086 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.878935 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-14.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:34:54.879086 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.878949 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-14.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:34:54.879086 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.878979 2573 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-137-14.ec2.internal" Apr 23 16:34:54.879239 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.879186 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:34:54.885413 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.885397 2573 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-137-14.ec2.internal" Apr 23 16:34:54.885479 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:54.885419 2573 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-137-14.ec2.internal\": node \"ip-10-0-137-14.ec2.internal\" not found" Apr 23 16:34:54.908546 
ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:54.908512 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-14.ec2.internal\" not found" Apr 23 16:34:54.976058 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.976024 2573 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-14.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-137-14.ec2.internal"] Apr 23 16:34:54.976189 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.976135 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:34:54.977909 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.977884 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-14.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:34:54.977995 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.977916 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-14.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:34:54.977995 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.977929 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-14.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:34:54.980400 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.980386 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:34:54.980556 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.980542 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-14.ec2.internal" Apr 23 16:34:54.980593 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.980574 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:34:54.981213 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.981195 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-14.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:34:54.981306 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.981217 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-14.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:34:54.981306 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.981223 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-14.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:34:54.981306 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.981233 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-14.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:34:54.981306 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.981235 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-14.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:34:54.981306 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.981243 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-14.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:34:54.983514 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.983499 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-14.ec2.internal" Apr 23 16:34:54.983597 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.983525 2573 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 16:34:54.984231 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.984217 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-14.ec2.internal" event="NodeHasSufficientMemory" Apr 23 16:34:54.984298 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.984244 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-14.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 16:34:54.984298 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:54.984256 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-14.ec2.internal" event="NodeHasSufficientPID" Apr 23 16:34:55.000315 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:55.000294 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-14.ec2.internal\" not found" node="ip-10-0-137-14.ec2.internal" Apr 23 16:34:55.004932 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:55.004913 2573 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-137-14.ec2.internal\" not found" node="ip-10-0-137-14.ec2.internal" Apr 23 16:34:55.008966 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:55.008945 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-14.ec2.internal\" not found" Apr 23 16:34:55.110053 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:55.110019 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-14.ec2.internal\" not found" Apr 23 16:34:55.115334 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:55.115314 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0085a20aac38133e753ce010973cc630-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-14.ec2.internal\" (UID: \"0085a20aac38133e753ce010973cc630\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-14.ec2.internal" Apr 23 16:34:55.115487 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:55.115361 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0085a20aac38133e753ce010973cc630-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-14.ec2.internal\" (UID: \"0085a20aac38133e753ce010973cc630\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-14.ec2.internal" Apr 23 16:34:55.115487 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:55.115387 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c67cb434c3c0499bd3e70b96d76e7361-config\") pod \"kube-apiserver-proxy-ip-10-0-137-14.ec2.internal\" (UID: \"c67cb434c3c0499bd3e70b96d76e7361\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-14.ec2.internal" Apr 23 16:34:55.210429 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:55.210379 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-14.ec2.internal\" not found" Apr 23 16:34:55.215738 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:55.215714 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0085a20aac38133e753ce010973cc630-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-14.ec2.internal\" (UID: \"0085a20aac38133e753ce010973cc630\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-14.ec2.internal" Apr 23 16:34:55.215798 ip-10-0-137-14 
kubenswrapper[2573]: I0423 16:34:55.215746 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0085a20aac38133e753ce010973cc630-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-14.ec2.internal\" (UID: \"0085a20aac38133e753ce010973cc630\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-14.ec2.internal" Apr 23 16:34:55.215798 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:55.215764 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c67cb434c3c0499bd3e70b96d76e7361-config\") pod \"kube-apiserver-proxy-ip-10-0-137-14.ec2.internal\" (UID: \"c67cb434c3c0499bd3e70b96d76e7361\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-14.ec2.internal" Apr 23 16:34:55.215866 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:55.215815 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/0085a20aac38133e753ce010973cc630-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-137-14.ec2.internal\" (UID: \"0085a20aac38133e753ce010973cc630\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-14.ec2.internal" Apr 23 16:34:55.215866 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:55.215828 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0085a20aac38133e753ce010973cc630-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-137-14.ec2.internal\" (UID: \"0085a20aac38133e753ce010973cc630\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-14.ec2.internal" Apr 23 16:34:55.215926 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:55.215818 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/c67cb434c3c0499bd3e70b96d76e7361-config\") 
pod \"kube-apiserver-proxy-ip-10-0-137-14.ec2.internal\" (UID: \"c67cb434c3c0499bd3e70b96d76e7361\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-137-14.ec2.internal" Apr 23 16:34:55.302943 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:55.302886 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-14.ec2.internal" Apr 23 16:34:55.307665 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:55.307640 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-14.ec2.internal" Apr 23 16:34:55.311385 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:55.311354 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-14.ec2.internal\" not found" Apr 23 16:34:55.412005 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:55.411948 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-14.ec2.internal\" not found" Apr 23 16:34:55.512567 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:55.512489 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-14.ec2.internal\" not found" Apr 23 16:34:55.613112 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:55.613087 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-14.ec2.internal\" not found" Apr 23 16:34:55.624471 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:55.624449 2573 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 23 16:34:55.624599 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:55.624582 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - 
watch lasted less than a second and no items received" Apr 23 16:34:55.624653 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:55.624607 2573 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 23 16:34:55.712760 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:55.712733 2573 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 23 16:34:55.713173 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:55.713154 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-14.ec2.internal\" not found" Apr 23 16:34:55.715585 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:55.715557 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 16:29:54 +0000 UTC" deadline="2027-10-02 13:40:15.0819459 +0000 UTC" Apr 23 16:34:55.715631 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:55.715585 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12645h5m19.366363493s" Apr 23 16:34:55.720444 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:55.720427 2573 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 23 16:34:55.753185 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:55.753162 2573 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-j7vhg" Apr 23 16:34:55.761204 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:55.761180 2573 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" 
csr="csr-j7vhg" Apr 23 16:34:55.814106 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:55.814067 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-14.ec2.internal\" not found" Apr 23 16:34:55.913805 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:55.913762 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0085a20aac38133e753ce010973cc630.slice/crio-052282b76e122f8d571a14fc5ca10452658d2e1415d9776ade44825c61a64ab5 WatchSource:0}: Error finding container 052282b76e122f8d571a14fc5ca10452658d2e1415d9776ade44825c61a64ab5: Status 404 returned error can't find the container with id 052282b76e122f8d571a14fc5ca10452658d2e1415d9776ade44825c61a64ab5 Apr 23 16:34:55.914168 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:55.914142 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-14.ec2.internal\" not found" Apr 23 16:34:55.914305 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:55.914293 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc67cb434c3c0499bd3e70b96d76e7361.slice/crio-b1b9626543be34f7fcbc854cf223807f02688740d8b567fa17d455f7416d1c14 WatchSource:0}: Error finding container b1b9626543be34f7fcbc854cf223807f02688740d8b567fa17d455f7416d1c14: Status 404 returned error can't find the container with id b1b9626543be34f7fcbc854cf223807f02688740d8b567fa17d455f7416d1c14 Apr 23 16:34:55.918893 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:55.918880 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:34:56.014920 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:56.014887 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-14.ec2.internal\" not found" Apr 23 16:34:56.115405 ip-10-0-137-14 kubenswrapper[2573]: 
E0423 16:34:56.115341 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-14.ec2.internal\" not found" Apr 23 16:34:56.177489 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.177457 2573 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:34:56.216120 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:56.216091 2573 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-137-14.ec2.internal\" not found" Apr 23 16:34:56.223145 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.223111 2573 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:34:56.313677 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.313637 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-14.ec2.internal" Apr 23 16:34:56.323511 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.323482 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 16:34:56.324427 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.324400 2573 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-14.ec2.internal" Apr 23 16:34:56.333413 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.333386 2573 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 16:34:56.624377 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.624339 2573 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:34:56.691471 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.691435 2573 
apiserver.go:52] "Watching apiserver" Apr 23 16:34:56.698808 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.698784 2573 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 16:34:56.700417 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.700386 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-ch99b","openshift-network-operator/iptables-alerter-xrqlx","openshift-ovn-kubernetes/ovnkube-node-wd2cz","kube-system/konnectivity-agent-rq49t","openshift-cluster-node-tuning-operator/tuned-xjm86","openshift-dns/node-resolver-b8tpj","openshift-image-registry/node-ca-tk6kx","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-14.ec2.internal","openshift-multus/multus-additional-cni-plugins-8zwmw","openshift-multus/multus-b7t9t","openshift-multus/network-metrics-daemon-jpzq7","kube-system/kube-apiserver-proxy-ip-10-0-137-14.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr"] Apr 23 16:34:56.703032 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.703007 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.705189 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.705056 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-tx5dc\"" Apr 23 16:34:56.706993 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.705642 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 16:34:56.706993 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.706047 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 16:34:56.706993 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.706458 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 16:34:56.706993 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.706831 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 16:34:56.710165 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.710101 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.712243 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.712124 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 16:34:56.712351 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.712257 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 16:34:56.712351 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.712124 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 16:34:56.712463 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.712431 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xrqlx" Apr 23 16:34:56.712463 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.712447 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 16:34:56.712560 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.712476 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 16:34:56.712723 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.712658 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 16:34:56.713025 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.713009 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-nhpz7\"" Apr 23 16:34:56.714206 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.713958 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 16:34:56.714273 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.714237 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 16:34:56.714579 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.714438 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 16:34:56.714579 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.714516 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-9zqk6\"" Apr 23 16:34:56.715028 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.715010 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rq49t" Apr 23 16:34:56.715126 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.715105 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-b8tpj" Apr 23 16:34:56.716584 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.716567 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 16:34:56.716676 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.716589 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 16:34:56.716676 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.716630 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-7b98n\"" Apr 23 16:34:56.717332 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.717019 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-tnk44\"" Apr 23 16:34:56.717332 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.717025 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 16:34:56.717332 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.717029 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 16:34:56.717479 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.717423 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:34:56.717534 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:56.717514 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ch99b" podUID="493d9466-44b1-4315-9f1b-a60f6bb428c1" Apr 23 16:34:56.719906 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.719804 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:34:56.719997 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:56.719934 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpzq7" podUID="8306d95a-dbae-4dd7-bf93-637a12f98c59" Apr 23 16:34:56.722209 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.722189 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:56.723094 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.723072 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-host-slash\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.723237 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.723222 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-os-release\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.723380 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.723362 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-multus-socket-dir-parent\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.723498 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.723484 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-host-run-netns\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.723601 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.723587 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-host-var-lib-cni-bin\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.723714 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.723682 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0056f1d2-57d7-40d1-9290-31c514f0d40e-multus-daemon-config\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.723840 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.723826 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-etc-kubernetes\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.724917 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.724889 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a4c2c1ca-68ee-40d1-8110-1c24a086d157-host-slash\") pod \"iptables-alerter-xrqlx\" (UID: \"a4c2c1ca-68ee-40d1-8110-1c24a086d157\") " pod="openshift-network-operator/iptables-alerter-xrqlx" Apr 23 16:34:56.725069 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.725052 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-host-run-multus-certs\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.725178 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.725162 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a4c2c1ca-68ee-40d1-8110-1c24a086d157-iptables-alerter-script\") pod \"iptables-alerter-xrqlx\" (UID: \"a4c2c1ca-68ee-40d1-8110-1c24a086d157\") " pod="openshift-network-operator/iptables-alerter-xrqlx" Apr 23 16:34:56.725245 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.725209 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zl5p8\" (UniqueName: \"kubernetes.io/projected/a4c2c1ca-68ee-40d1-8110-1c24a086d157-kube-api-access-zl5p8\") pod \"iptables-alerter-xrqlx\" (UID: \"a4c2c1ca-68ee-40d1-8110-1c24a086d157\") " pod="openshift-network-operator/iptables-alerter-xrqlx" Apr 23 16:34:56.725245 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.724209 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-94hm9\"" Apr 23 16:34:56.725350 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.723845 2573 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 16:34:56.725350 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.724161 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 16:34:56.725446 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.725239 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.725446 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.725392 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-host-cni-bin\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.725446 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.724055 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 16:34:56.725583 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.725419 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.725583 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.725470 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/52988e90-484a-49cd-98f6-5510a28890d6-ovnkube-script-lib\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.725583 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.725483 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbtfl\" (UniqueName: \"kubernetes.io/projected/52988e90-484a-49cd-98f6-5510a28890d6-kube-api-access-mbtfl\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.725583 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.725501 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0056f1d2-57d7-40d1-9290-31c514f0d40e-cni-binary-copy\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.725583 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.725521 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-systemd-units\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.725583 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.725569 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/52988e90-484a-49cd-98f6-5510a28890d6-env-overrides\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.725911 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.725602 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/52988e90-484a-49cd-98f6-5510a28890d6-ovn-node-metrics-cert\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.725911 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.725645 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-cnibin\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.725911 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.725788 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-host-var-lib-cni-multus\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.725911 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.725835 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-hostroot\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.725911 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.725867 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-multus-conf-dir\") pod \"multus-b7t9t\" (UID: 
\"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.726063 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.725914 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-run-ovn\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.726063 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.725938 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-log-socket\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.726063 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.725961 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/52988e90-484a-49cd-98f6-5510a28890d6-ovnkube-config\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.726063 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.726001 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/56124f3f-030d-47d4-99f9-65b3011d5573-tmp-dir\") pod \"node-resolver-b8tpj\" (UID: \"56124f3f-030d-47d4-99f9-65b3011d5573\") " pod="openshift-dns/node-resolver-b8tpj" Apr 23 16:34:56.726063 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.726024 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-multus-cni-dir\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.726063 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.726048 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-host-var-lib-kubelet\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.726270 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.726070 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4hzj\" (UniqueName: \"kubernetes.io/projected/0056f1d2-57d7-40d1-9290-31c514f0d40e-kube-api-access-v4hzj\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.726270 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.726097 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-var-lib-openvswitch\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.726270 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.726153 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-etc-openvswitch\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.726526 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.726480 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-run-openvswitch\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.726639 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.726512 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-host-cni-netd\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.726639 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.726550 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-system-cni-dir\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.726639 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.726567 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-host-run-netns\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.726639 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.726589 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5fck\" (UniqueName: \"kubernetes.io/projected/56124f3f-030d-47d4-99f9-65b3011d5573-kube-api-access-m5fck\") pod \"node-resolver-b8tpj\" (UID: \"56124f3f-030d-47d4-99f9-65b3011d5573\") " pod="openshift-dns/node-resolver-b8tpj" Apr 23 
16:34:56.726639 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.726625 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-host-run-k8s-cni-cncf-io\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.726930 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.726669 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-host-kubelet\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.726930 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.726711 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-run-systemd\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.727263 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.726757 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-node-log\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.727322 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.727291 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d28475ec-752d-4947-9e85-f35681ad68ab-agent-certs\") pod \"konnectivity-agent-rq49t\" (UID: 
\"d28475ec-752d-4947-9e85-f35681ad68ab\") " pod="kube-system/konnectivity-agent-rq49t" Apr 23 16:34:56.727322 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.727316 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d28475ec-752d-4947-9e85-f35681ad68ab-konnectivity-ca\") pod \"konnectivity-agent-rq49t\" (UID: \"d28475ec-752d-4947-9e85-f35681ad68ab\") " pod="kube-system/konnectivity-agent-rq49t" Apr 23 16:34:56.727423 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.727340 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/56124f3f-030d-47d4-99f9-65b3011d5573-hosts-file\") pod \"node-resolver-b8tpj\" (UID: \"56124f3f-030d-47d4-99f9-65b3011d5573\") " pod="openshift-dns/node-resolver-b8tpj" Apr 23 16:34:56.727609 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.727588 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.729207 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.729187 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 16:34:56.729289 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.729245 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 16:34:56.729863 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.729845 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-r7zt2\"" Apr 23 16:34:56.730480 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.730405 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.730925 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.730764 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tk6kx" Apr 23 16:34:56.732148 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.732131 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 16:34:56.732592 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.732574 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 16:34:56.732671 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.732619 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 16:34:56.732671 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.732623 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-8jkvt\"" Apr 23 16:34:56.732897 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.732626 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-qlnsr\"" Apr 23 16:34:56.732897 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.732573 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 16:34:56.732897 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.732880 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 16:34:56.761937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.761881 2573 certificate_manager.go:715] "Certificate rotation deadline determined" 
logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:29:55 +0000 UTC" deadline="2028-01-29 04:49:23.465026144 +0000 UTC" Apr 23 16:34:56.761937 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.761912 2573 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15492h14m26.703117914s" Apr 23 16:34:56.815138 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.815113 2573 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 16:34:56.827955 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.827924 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/56124f3f-030d-47d4-99f9-65b3011d5573-tmp-dir\") pod \"node-resolver-b8tpj\" (UID: \"56124f3f-030d-47d4-99f9-65b3011d5573\") " pod="openshift-dns/node-resolver-b8tpj" Apr 23 16:34:56.828070 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.827973 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-etc-sysconfig\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.828070 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.827998 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-etc-tuned\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.828070 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828027 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwxll\" (UniqueName: 
\"kubernetes.io/projected/0a0f1a6d-d0aa-4632-8dd9-0adbe8707e89-kube-api-access-xwxll\") pod \"node-ca-tk6kx\" (UID: \"0a0f1a6d-d0aa-4632-8dd9-0adbe8707e89\") " pod="openshift-image-registry/node-ca-tk6kx" Apr 23 16:34:56.828070 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828054 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-multus-cni-dir\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.828281 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828078 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-var-lib-openvswitch\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.828281 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828100 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5b2b1418-dfa9-48eb-9f15-1110e4874cd2-registration-dir\") pod \"aws-ebs-csi-driver-node-8g4wr\" (UID: \"5b2b1418-dfa9-48eb-9f15-1110e4874cd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:56.828281 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828126 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfjzn\" (UniqueName: \"kubernetes.io/projected/5b2b1418-dfa9-48eb-9f15-1110e4874cd2-kube-api-access-jfjzn\") pod \"aws-ebs-csi-driver-node-8g4wr\" (UID: \"5b2b1418-dfa9-48eb-9f15-1110e4874cd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:56.828281 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828149 2573 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-var-lib-kubelet\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.828281 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828171 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-var-lib-openvswitch\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.828281 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828175 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m5fck\" (UniqueName: \"kubernetes.io/projected/56124f3f-030d-47d4-99f9-65b3011d5573-kube-api-access-m5fck\") pod \"node-resolver-b8tpj\" (UID: \"56124f3f-030d-47d4-99f9-65b3011d5573\") " pod="openshift-dns/node-resolver-b8tpj" Apr 23 16:34:56.828281 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828210 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-multus-cni-dir\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.828281 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828216 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5b2b1418-dfa9-48eb-9f15-1110e4874cd2-socket-dir\") pod \"aws-ebs-csi-driver-node-8g4wr\" (UID: \"5b2b1418-dfa9-48eb-9f15-1110e4874cd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:56.828281 ip-10-0-137-14 
kubenswrapper[2573]: I0423 16:34:56.828258 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-host-run-k8s-cni-cncf-io\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.828281 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828275 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/56124f3f-030d-47d4-99f9-65b3011d5573-tmp-dir\") pod \"node-resolver-b8tpj\" (UID: \"56124f3f-030d-47d4-99f9-65b3011d5573\") " pod="openshift-dns/node-resolver-b8tpj" Apr 23 16:34:56.828769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828302 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-host-run-k8s-cni-cncf-io\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.828769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828321 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-host-kubelet\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.828769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828353 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-run-systemd\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.828769 ip-10-0-137-14 kubenswrapper[2573]: I0423 
16:34:56.828379 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-node-log\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.828769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828406 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d28475ec-752d-4947-9e85-f35681ad68ab-agent-certs\") pod \"konnectivity-agent-rq49t\" (UID: \"d28475ec-752d-4947-9e85-f35681ad68ab\") " pod="kube-system/konnectivity-agent-rq49t" Apr 23 16:34:56.828769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828424 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-run-systemd\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.828769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828431 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/56124f3f-030d-47d4-99f9-65b3011d5573-hosts-file\") pod \"node-resolver-b8tpj\" (UID: \"56124f3f-030d-47d4-99f9-65b3011d5573\") " pod="openshift-dns/node-resolver-b8tpj" Apr 23 16:34:56.828769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828436 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-node-log\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.828769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828419 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-host-kubelet\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.828769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828480 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-tmp\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.828769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828487 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/56124f3f-030d-47d4-99f9-65b3011d5573-hosts-file\") pod \"node-resolver-b8tpj\" (UID: \"56124f3f-030d-47d4-99f9-65b3011d5573\") " pod="openshift-dns/node-resolver-b8tpj" Apr 23 16:34:56.828769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828514 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zl5p8\" (UniqueName: \"kubernetes.io/projected/a4c2c1ca-68ee-40d1-8110-1c24a086d157-kube-api-access-zl5p8\") pod \"iptables-alerter-xrqlx\" (UID: \"a4c2c1ca-68ee-40d1-8110-1c24a086d157\") " pod="openshift-network-operator/iptables-alerter-xrqlx" Apr 23 16:34:56.828769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828563 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbtfl\" (UniqueName: \"kubernetes.io/projected/52988e90-484a-49cd-98f6-5510a28890d6-kube-api-access-mbtfl\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.828769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828666 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0056f1d2-57d7-40d1-9290-31c514f0d40e-multus-daemon-config\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.828769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828714 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-etc-kubernetes\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.828769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828744 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8306d95a-dbae-4dd7-bf93-637a12f98c59-metrics-certs\") pod \"network-metrics-daemon-jpzq7\" (UID: \"8306d95a-dbae-4dd7-bf93-637a12f98c59\") " pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:34:56.828769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828770 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a4c2c1ca-68ee-40d1-8110-1c24a086d157-iptables-alerter-script\") pod \"iptables-alerter-xrqlx\" (UID: \"a4c2c1ca-68ee-40d1-8110-1c24a086d157\") " pod="openshift-network-operator/iptables-alerter-xrqlx" Apr 23 16:34:56.829533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828767 2573 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 16:34:56.829533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828800 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.829533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828797 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-etc-kubernetes\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.829533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828827 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-host-cni-bin\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.829533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828862 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-host-cni-bin\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.829533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828956 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-sys\") pod 
\"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.829533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828959 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-host-run-ovn-kubernetes\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.829533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.828986 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7hhkv\" (UniqueName: \"kubernetes.io/projected/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-kube-api-access-7hhkv\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.829533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829011 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0a0f1a6d-d0aa-4632-8dd9-0adbe8707e89-serviceca\") pod \"node-ca-tk6kx\" (UID: \"0a0f1a6d-d0aa-4632-8dd9-0adbe8707e89\") " pod="openshift-image-registry/node-ca-tk6kx" Apr 23 16:34:56.829533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829058 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0056f1d2-57d7-40d1-9290-31c514f0d40e-cni-binary-copy\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.829533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829089 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-systemd-units\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.829533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829116 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/52988e90-484a-49cd-98f6-5510a28890d6-env-overrides\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.829533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829143 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5b2b1418-dfa9-48eb-9f15-1110e4874cd2-etc-selinux\") pod \"aws-ebs-csi-driver-node-8g4wr\" (UID: \"5b2b1418-dfa9-48eb-9f15-1110e4874cd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:56.829533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829172 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-cnibin\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.829533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829195 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-hostroot\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.829533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829221 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-run-ovn\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.829533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829215 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-systemd-units\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.829533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829267 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-log-socket\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.830405 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829292 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-cnibin\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.830405 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829295 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0f6f780e-a2ae-473d-ad75-c644275b6cdb-cnibin\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.830405 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829351 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: 
\"kubernetes.io/configmap/a4c2c1ca-68ee-40d1-8110-1c24a086d157-iptables-alerter-script\") pod \"iptables-alerter-xrqlx\" (UID: \"a4c2c1ca-68ee-40d1-8110-1c24a086d157\") " pod="openshift-network-operator/iptables-alerter-xrqlx" Apr 23 16:34:56.830405 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829368 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-run-ovn\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.830405 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829371 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0056f1d2-57d7-40d1-9290-31c514f0d40e-multus-daemon-config\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.830405 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829410 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-log-socket\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.830405 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829420 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-etc-sysctl-d\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.830405 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829426 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-hostroot\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.830405 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829450 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-run\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.830405 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829479 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-host\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.830405 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829505 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a0f1a6d-d0aa-4632-8dd9-0adbe8707e89-host\") pod \"node-ca-tk6kx\" (UID: \"0a0f1a6d-d0aa-4632-8dd9-0adbe8707e89\") " pod="openshift-image-registry/node-ca-tk6kx" Apr 23 16:34:56.830405 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829538 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-host-var-lib-kubelet\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.830405 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829566 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v4hzj\" (UniqueName: 
\"kubernetes.io/projected/0056f1d2-57d7-40d1-9290-31c514f0d40e-kube-api-access-v4hzj\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.830405 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829590 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-host-var-lib-kubelet\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.830405 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829616 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0056f1d2-57d7-40d1-9290-31c514f0d40e-cni-binary-copy\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.830405 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829636 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-etc-openvswitch\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.830405 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829596 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-etc-openvswitch\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.830405 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829674 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-run-openvswitch\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.831239 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829716 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-host-cni-netd\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.831239 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829744 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0f6f780e-a2ae-473d-ad75-c644275b6cdb-cni-binary-copy\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.831239 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829773 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-etc-systemd\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.831239 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829794 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/52988e90-484a-49cd-98f6-5510a28890d6-env-overrides\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.831239 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829797 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqgkx\" (UniqueName: \"kubernetes.io/projected/493d9466-44b1-4315-9f1b-a60f6bb428c1-kube-api-access-cqgkx\") pod \"network-check-target-ch99b\" (UID: \"493d9466-44b1-4315-9f1b-a60f6bb428c1\") " pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:34:56.831239 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829834 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-host-cni-netd\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.831239 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829881 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-run-openvswitch\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.831239 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829901 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-system-cni-dir\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.831239 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829934 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-host-run-netns\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.831239 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829973 
2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-host-run-netns\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.831239 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.829975 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-etc-kubernetes\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.831239 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830027 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-system-cni-dir\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.831239 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830049 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-lib-modules\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.831239 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830089 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d28475ec-752d-4947-9e85-f35681ad68ab-konnectivity-ca\") pod \"konnectivity-agent-rq49t\" (UID: \"d28475ec-752d-4947-9e85-f35681ad68ab\") " pod="kube-system/konnectivity-agent-rq49t" Apr 23 16:34:56.831239 ip-10-0-137-14 kubenswrapper[2573]: I0423 
16:34:56.830114 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.831239 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830138 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/52988e90-484a-49cd-98f6-5510a28890d6-ovnkube-script-lib\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.831239 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830160 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-host-slash\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.832011 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830182 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-os-release\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.832011 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830208 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-multus-socket-dir-parent\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.832011 
ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830232 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-host-run-netns\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.832011 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830233 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.832011 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830256 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-host-var-lib-cni-bin\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.832011 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830305 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a4c2c1ca-68ee-40d1-8110-1c24a086d157-host-slash\") pod \"iptables-alerter-xrqlx\" (UID: \"a4c2c1ca-68ee-40d1-8110-1c24a086d157\") " pod="openshift-network-operator/iptables-alerter-xrqlx" Apr 23 16:34:56.832011 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830333 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-host-run-multus-certs\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" 
Apr 23 16:34:56.832011 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830370 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-host-run-netns\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.832011 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830364 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0f6f780e-a2ae-473d-ad75-c644275b6cdb-system-cni-dir\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.832011 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830391 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/52988e90-484a-49cd-98f6-5510a28890d6-host-slash\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.832011 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830413 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0f6f780e-a2ae-473d-ad75-c644275b6cdb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.832011 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830431 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-multus-socket-dir-parent\") pod \"multus-b7t9t\" (UID: 
\"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.832011 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830439 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf4j5\" (UniqueName: \"kubernetes.io/projected/8306d95a-dbae-4dd7-bf93-637a12f98c59-kube-api-access-zf4j5\") pod \"network-metrics-daemon-jpzq7\" (UID: \"8306d95a-dbae-4dd7-bf93-637a12f98c59\") " pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:34:56.832011 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830446 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a4c2c1ca-68ee-40d1-8110-1c24a086d157-host-slash\") pod \"iptables-alerter-xrqlx\" (UID: \"a4c2c1ca-68ee-40d1-8110-1c24a086d157\") " pod="openshift-network-operator/iptables-alerter-xrqlx" Apr 23 16:34:56.832011 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830485 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5b2b1418-dfa9-48eb-9f15-1110e4874cd2-device-dir\") pod \"aws-ebs-csi-driver-node-8g4wr\" (UID: \"5b2b1418-dfa9-48eb-9f15-1110e4874cd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:56.832011 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830499 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-os-release\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.832011 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830517 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-etc-sysctl-conf\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.832796 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830538 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-host-run-multus-certs\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.832796 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830490 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-host-var-lib-cni-bin\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.832796 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830546 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/52988e90-484a-49cd-98f6-5510a28890d6-ovn-node-metrics-cert\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.832796 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830593 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0f6f780e-a2ae-473d-ad75-c644275b6cdb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.832796 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830621 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0f6f780e-a2ae-473d-ad75-c644275b6cdb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.832796 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830622 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/d28475ec-752d-4947-9e85-f35681ad68ab-konnectivity-ca\") pod \"konnectivity-agent-rq49t\" (UID: \"d28475ec-752d-4947-9e85-f35681ad68ab\") " pod="kube-system/konnectivity-agent-rq49t" Apr 23 16:34:56.832796 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830646 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b2b1418-dfa9-48eb-9f15-1110e4874cd2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8g4wr\" (UID: \"5b2b1418-dfa9-48eb-9f15-1110e4874cd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:56.832796 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830672 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5b2b1418-dfa9-48eb-9f15-1110e4874cd2-sys-fs\") pod \"aws-ebs-csi-driver-node-8g4wr\" (UID: \"5b2b1418-dfa9-48eb-9f15-1110e4874cd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:56.832796 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830716 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-host-var-lib-cni-multus\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " 
pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.832796 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830747 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-multus-conf-dir\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.832796 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830772 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/52988e90-484a-49cd-98f6-5510a28890d6-ovnkube-config\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.832796 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830814 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0f6f780e-a2ae-473d-ad75-c644275b6cdb-os-release\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.832796 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830817 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-host-var-lib-cni-multus\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.832796 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830840 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffmwc\" (UniqueName: \"kubernetes.io/projected/0f6f780e-a2ae-473d-ad75-c644275b6cdb-kube-api-access-ffmwc\") pod 
\"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.832796 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830869 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-etc-modprobe-d\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.832796 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.830882 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0056f1d2-57d7-40d1-9290-31c514f0d40e-multus-conf-dir\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.832796 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.831340 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/52988e90-484a-49cd-98f6-5510a28890d6-ovnkube-script-lib\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.833561 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.831800 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/52988e90-484a-49cd-98f6-5510a28890d6-ovnkube-config\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.833561 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.832292 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/d28475ec-752d-4947-9e85-f35681ad68ab-agent-certs\") 
pod \"konnectivity-agent-rq49t\" (UID: \"d28475ec-752d-4947-9e85-f35681ad68ab\") " pod="kube-system/konnectivity-agent-rq49t" Apr 23 16:34:56.834238 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.834208 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/52988e90-484a-49cd-98f6-5510a28890d6-ovn-node-metrics-cert\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.836177 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.836155 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbtfl\" (UniqueName: \"kubernetes.io/projected/52988e90-484a-49cd-98f6-5510a28890d6-kube-api-access-mbtfl\") pod \"ovnkube-node-wd2cz\" (UID: \"52988e90-484a-49cd-98f6-5510a28890d6\") " pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:56.838926 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.838904 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zl5p8\" (UniqueName: \"kubernetes.io/projected/a4c2c1ca-68ee-40d1-8110-1c24a086d157-kube-api-access-zl5p8\") pod \"iptables-alerter-xrqlx\" (UID: \"a4c2c1ca-68ee-40d1-8110-1c24a086d157\") " pod="openshift-network-operator/iptables-alerter-xrqlx" Apr 23 16:34:56.839705 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.839667 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4hzj\" (UniqueName: \"kubernetes.io/projected/0056f1d2-57d7-40d1-9290-31c514f0d40e-kube-api-access-v4hzj\") pod \"multus-b7t9t\" (UID: \"0056f1d2-57d7-40d1-9290-31c514f0d40e\") " pod="openshift-multus/multus-b7t9t" Apr 23 16:34:56.840066 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.840041 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5fck\" (UniqueName: 
\"kubernetes.io/projected/56124f3f-030d-47d4-99f9-65b3011d5573-kube-api-access-m5fck\") pod \"node-resolver-b8tpj\" (UID: \"56124f3f-030d-47d4-99f9-65b3011d5573\") " pod="openshift-dns/node-resolver-b8tpj" Apr 23 16:34:56.881439 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.881349 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-14.ec2.internal" event={"ID":"c67cb434c3c0499bd3e70b96d76e7361","Type":"ContainerStarted","Data":"b1b9626543be34f7fcbc854cf223807f02688740d8b567fa17d455f7416d1c14"} Apr 23 16:34:56.882413 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.882378 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-14.ec2.internal" event={"ID":"0085a20aac38133e753ce010973cc630","Type":"ContainerStarted","Data":"052282b76e122f8d571a14fc5ca10452658d2e1415d9776ade44825c61a64ab5"} Apr 23 16:34:56.932087 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932059 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0f6f780e-a2ae-473d-ad75-c644275b6cdb-system-cni-dir\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.932222 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932094 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0f6f780e-a2ae-473d-ad75-c644275b6cdb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.932222 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932113 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zf4j5\" (UniqueName: \"kubernetes.io/projected/8306d95a-dbae-4dd7-bf93-637a12f98c59-kube-api-access-zf4j5\") pod \"network-metrics-daemon-jpzq7\" (UID: \"8306d95a-dbae-4dd7-bf93-637a12f98c59\") " pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:34:56.932222 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932128 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5b2b1418-dfa9-48eb-9f15-1110e4874cd2-device-dir\") pod \"aws-ebs-csi-driver-node-8g4wr\" (UID: \"5b2b1418-dfa9-48eb-9f15-1110e4874cd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:56.932222 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932148 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-etc-sysctl-conf\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.932222 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932175 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0f6f780e-a2ae-473d-ad75-c644275b6cdb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.932222 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932175 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0f6f780e-a2ae-473d-ad75-c644275b6cdb-system-cni-dir\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.932222 ip-10-0-137-14 kubenswrapper[2573]: 
I0423 16:34:56.932200 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0f6f780e-a2ae-473d-ad75-c644275b6cdb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.932538 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932228 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b2b1418-dfa9-48eb-9f15-1110e4874cd2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8g4wr\" (UID: \"5b2b1418-dfa9-48eb-9f15-1110e4874cd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:56.932538 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932247 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/5b2b1418-dfa9-48eb-9f15-1110e4874cd2-device-dir\") pod \"aws-ebs-csi-driver-node-8g4wr\" (UID: \"5b2b1418-dfa9-48eb-9f15-1110e4874cd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:56.932538 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932256 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5b2b1418-dfa9-48eb-9f15-1110e4874cd2-sys-fs\") pod \"aws-ebs-csi-driver-node-8g4wr\" (UID: \"5b2b1418-dfa9-48eb-9f15-1110e4874cd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:56.932538 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932283 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0f6f780e-a2ae-473d-ad75-c644275b6cdb-os-release\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " 
pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.932538 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932307 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ffmwc\" (UniqueName: \"kubernetes.io/projected/0f6f780e-a2ae-473d-ad75-c644275b6cdb-kube-api-access-ffmwc\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.932538 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932331 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-etc-modprobe-d\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.932538 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932356 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-etc-sysconfig\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.932538 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932366 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/5b2b1418-dfa9-48eb-9f15-1110e4874cd2-sys-fs\") pod \"aws-ebs-csi-driver-node-8g4wr\" (UID: \"5b2b1418-dfa9-48eb-9f15-1110e4874cd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:56.932538 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932367 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-etc-sysctl-conf\") pod \"tuned-xjm86\" 
(UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.932538 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932377 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-etc-tuned\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.932538 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932420 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xwxll\" (UniqueName: \"kubernetes.io/projected/0a0f1a6d-d0aa-4632-8dd9-0adbe8707e89-kube-api-access-xwxll\") pod \"node-ca-tk6kx\" (UID: \"0a0f1a6d-d0aa-4632-8dd9-0adbe8707e89\") " pod="openshift-image-registry/node-ca-tk6kx" Apr 23 16:34:56.932538 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932423 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b2b1418-dfa9-48eb-9f15-1110e4874cd2-kubelet-dir\") pod \"aws-ebs-csi-driver-node-8g4wr\" (UID: \"5b2b1418-dfa9-48eb-9f15-1110e4874cd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:56.932538 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932441 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0f6f780e-a2ae-473d-ad75-c644275b6cdb-os-release\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.932538 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932451 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/5b2b1418-dfa9-48eb-9f15-1110e4874cd2-registration-dir\") pod \"aws-ebs-csi-driver-node-8g4wr\" (UID: \"5b2b1418-dfa9-48eb-9f15-1110e4874cd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:56.932538 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932505 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/5b2b1418-dfa9-48eb-9f15-1110e4874cd2-registration-dir\") pod \"aws-ebs-csi-driver-node-8g4wr\" (UID: \"5b2b1418-dfa9-48eb-9f15-1110e4874cd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:56.933235 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932546 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0f6f780e-a2ae-473d-ad75-c644275b6cdb-tuning-conf-dir\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.933235 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932553 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-etc-modprobe-d\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.933235 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932562 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-etc-sysconfig\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.933235 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932588 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-jfjzn\" (UniqueName: \"kubernetes.io/projected/5b2b1418-dfa9-48eb-9f15-1110e4874cd2-kube-api-access-jfjzn\") pod \"aws-ebs-csi-driver-node-8g4wr\" (UID: \"5b2b1418-dfa9-48eb-9f15-1110e4874cd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:56.933235 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932625 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-var-lib-kubelet\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.933235 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932653 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5b2b1418-dfa9-48eb-9f15-1110e4874cd2-socket-dir\") pod \"aws-ebs-csi-driver-node-8g4wr\" (UID: \"5b2b1418-dfa9-48eb-9f15-1110e4874cd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:56.933235 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932669 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0f6f780e-a2ae-473d-ad75-c644275b6cdb-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.933235 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932706 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-tmp\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 
16:34:56.933235 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932755 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-var-lib-kubelet\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.933235 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932768 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8306d95a-dbae-4dd7-bf93-637a12f98c59-metrics-certs\") pod \"network-metrics-daemon-jpzq7\" (UID: \"8306d95a-dbae-4dd7-bf93-637a12f98c59\") " pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:34:56.933235 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932812 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-sys\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.933235 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932833 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/5b2b1418-dfa9-48eb-9f15-1110e4874cd2-socket-dir\") pod \"aws-ebs-csi-driver-node-8g4wr\" (UID: \"5b2b1418-dfa9-48eb-9f15-1110e4874cd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:56.933235 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932839 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7hhkv\" (UniqueName: \"kubernetes.io/projected/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-kube-api-access-7hhkv\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " 
pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.933235 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:56.932865 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:56.933235 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932883 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0a0f1a6d-d0aa-4632-8dd9-0adbe8707e89-serviceca\") pod \"node-ca-tk6kx\" (UID: \"0a0f1a6d-d0aa-4632-8dd9-0adbe8707e89\") " pod="openshift-image-registry/node-ca-tk6kx" Apr 23 16:34:56.933235 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932941 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-sys\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.933235 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:56.932960 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8306d95a-dbae-4dd7-bf93-637a12f98c59-metrics-certs podName:8306d95a-dbae-4dd7-bf93-637a12f98c59 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:57.432925546 +0000 UTC m=+3.122711884 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8306d95a-dbae-4dd7-bf93-637a12f98c59-metrics-certs") pod "network-metrics-daemon-jpzq7" (UID: "8306d95a-dbae-4dd7-bf93-637a12f98c59") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:56.934024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.932995 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5b2b1418-dfa9-48eb-9f15-1110e4874cd2-etc-selinux\") pod \"aws-ebs-csi-driver-node-8g4wr\" (UID: \"5b2b1418-dfa9-48eb-9f15-1110e4874cd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:56.934024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.933025 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0f6f780e-a2ae-473d-ad75-c644275b6cdb-cnibin\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.934024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.933050 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-etc-sysctl-d\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.934024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.933073 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-run\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.934024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.933082 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/5b2b1418-dfa9-48eb-9f15-1110e4874cd2-etc-selinux\") pod \"aws-ebs-csi-driver-node-8g4wr\" (UID: \"5b2b1418-dfa9-48eb-9f15-1110e4874cd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:56.934024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.933094 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-host\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.934024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.933118 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a0f1a6d-d0aa-4632-8dd9-0adbe8707e89-host\") pod \"node-ca-tk6kx\" (UID: \"0a0f1a6d-d0aa-4632-8dd9-0adbe8707e89\") " pod="openshift-image-registry/node-ca-tk6kx" Apr 23 16:34:56.934024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.933142 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0f6f780e-a2ae-473d-ad75-c644275b6cdb-cnibin\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.934024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.933152 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-host\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.934024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.933147 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0f6f780e-a2ae-473d-ad75-c644275b6cdb-cni-binary-copy\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.934024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.933188 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0a0f1a6d-d0aa-4632-8dd9-0adbe8707e89-host\") pod \"node-ca-tk6kx\" (UID: \"0a0f1a6d-d0aa-4632-8dd9-0adbe8707e89\") " pod="openshift-image-registry/node-ca-tk6kx" Apr 23 16:34:56.934024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.933200 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-run\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.934024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.933223 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-etc-sysctl-d\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.934024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.933234 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0f6f780e-a2ae-473d-ad75-c644275b6cdb-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.934024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.933268 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-etc-systemd\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.934024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.933315 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-etc-systemd\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.934024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.933307 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqgkx\" (UniqueName: \"kubernetes.io/projected/493d9466-44b1-4315-9f1b-a60f6bb428c1-kube-api-access-cqgkx\") pod \"network-check-target-ch99b\" (UID: \"493d9466-44b1-4315-9f1b-a60f6bb428c1\") " pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:34:56.934024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.933359 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-etc-kubernetes\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.934844 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.933383 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0a0f1a6d-d0aa-4632-8dd9-0adbe8707e89-serviceca\") pod \"node-ca-tk6kx\" (UID: \"0a0f1a6d-d0aa-4632-8dd9-0adbe8707e89\") " pod="openshift-image-registry/node-ca-tk6kx" Apr 23 16:34:56.934844 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.933400 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-etc-kubernetes\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.934844 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.933433 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-lib-modules\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.934844 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.933569 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0f6f780e-a2ae-473d-ad75-c644275b6cdb-cni-binary-copy\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.934844 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.933598 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-lib-modules\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.935569 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.935547 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-etc-tuned\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.935651 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.935575 2573 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-tmp\") pod \"tuned-xjm86\" (UID: \"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.939033 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:56.939005 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:34:56.939033 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:56.939028 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:34:56.939192 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:56.939042 2573 projected.go:194] Error preparing data for projected volume kube-api-access-cqgkx for pod openshift-network-diagnostics/network-check-target-ch99b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:34:56.939192 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:56.939104 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/493d9466-44b1-4315-9f1b-a60f6bb428c1-kube-api-access-cqgkx podName:493d9466-44b1-4315-9f1b-a60f6bb428c1 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:57.439087413 +0000 UTC m=+3.128873742 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqgkx" (UniqueName: "kubernetes.io/projected/493d9466-44b1-4315-9f1b-a60f6bb428c1-kube-api-access-cqgkx") pod "network-check-target-ch99b" (UID: "493d9466-44b1-4315-9f1b-a60f6bb428c1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:34:56.941185 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.941157 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf4j5\" (UniqueName: \"kubernetes.io/projected/8306d95a-dbae-4dd7-bf93-637a12f98c59-kube-api-access-zf4j5\") pod \"network-metrics-daemon-jpzq7\" (UID: \"8306d95a-dbae-4dd7-bf93-637a12f98c59\") " pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:34:56.941379 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.941359 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffmwc\" (UniqueName: \"kubernetes.io/projected/0f6f780e-a2ae-473d-ad75-c644275b6cdb-kube-api-access-ffmwc\") pod \"multus-additional-cni-plugins-8zwmw\" (UID: \"0f6f780e-a2ae-473d-ad75-c644275b6cdb\") " pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:56.941442 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.941362 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwxll\" (UniqueName: \"kubernetes.io/projected/0a0f1a6d-d0aa-4632-8dd9-0adbe8707e89-kube-api-access-xwxll\") pod \"node-ca-tk6kx\" (UID: \"0a0f1a6d-d0aa-4632-8dd9-0adbe8707e89\") " pod="openshift-image-registry/node-ca-tk6kx" Apr 23 16:34:56.941858 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.941839 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7hhkv\" (UniqueName: \"kubernetes.io/projected/cb4c9ed2-c60a-4c94-9f67-e156f422d6a0-kube-api-access-7hhkv\") pod \"tuned-xjm86\" (UID: 
\"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0\") " pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:56.941936 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.941857 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfjzn\" (UniqueName: \"kubernetes.io/projected/5b2b1418-dfa9-48eb-9f15-1110e4874cd2-kube-api-access-jfjzn\") pod \"aws-ebs-csi-driver-node-8g4wr\" (UID: \"5b2b1418-dfa9-48eb-9f15-1110e4874cd2\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:56.998932 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:56.998901 2573 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 16:34:57.018581 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.018547 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-b7t9t" Apr 23 16:34:57.027469 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.027442 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:34:57.037202 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.037182 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-xrqlx" Apr 23 16:34:57.042790 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.042770 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rq49t" Apr 23 16:34:57.050245 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.050224 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-b8tpj" Apr 23 16:34:57.064834 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.064807 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" Apr 23 16:34:57.071481 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.071451 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-xjm86" Apr 23 16:34:57.080100 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.080082 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-8zwmw" Apr 23 16:34:57.085656 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.085639 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tk6kx" Apr 23 16:34:57.438149 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.438115 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8306d95a-dbae-4dd7-bf93-637a12f98c59-metrics-certs\") pod \"network-metrics-daemon-jpzq7\" (UID: \"8306d95a-dbae-4dd7-bf93-637a12f98c59\") " pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:34:57.438314 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:57.438266 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:57.438397 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:57.438337 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8306d95a-dbae-4dd7-bf93-637a12f98c59-metrics-certs podName:8306d95a-dbae-4dd7-bf93-637a12f98c59 nodeName:}" failed. No retries permitted until 2026-04-23 16:34:58.438315726 +0000 UTC m=+4.128102059 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8306d95a-dbae-4dd7-bf93-637a12f98c59-metrics-certs") pod "network-metrics-daemon-jpzq7" (UID: "8306d95a-dbae-4dd7-bf93-637a12f98c59") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:57.539096 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.539066 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqgkx\" (UniqueName: \"kubernetes.io/projected/493d9466-44b1-4315-9f1b-a60f6bb428c1-kube-api-access-cqgkx\") pod \"network-check-target-ch99b\" (UID: \"493d9466-44b1-4315-9f1b-a60f6bb428c1\") " pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:34:57.539270 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:57.539201 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:34:57.539270 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:57.539220 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:34:57.539270 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:57.539233 2573 projected.go:194] Error preparing data for projected volume kube-api-access-cqgkx for pod openshift-network-diagnostics/network-check-target-ch99b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:34:57.539425 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:57.539294 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/493d9466-44b1-4315-9f1b-a60f6bb428c1-kube-api-access-cqgkx podName:493d9466-44b1-4315-9f1b-a60f6bb428c1 nodeName:}" failed. 
No retries permitted until 2026-04-23 16:34:58.539273743 +0000 UTC m=+4.229060068 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqgkx" (UniqueName: "kubernetes.io/projected/493d9466-44b1-4315-9f1b-a60f6bb428c1-kube-api-access-cqgkx") pod "network-check-target-ch99b" (UID: "493d9466-44b1-4315-9f1b-a60f6bb428c1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:34:57.569400 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:57.569372 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52988e90_484a_49cd_98f6_5510a28890d6.slice/crio-c81670ce397e853cc94c5af3d4827cf3229eaaedc9faa5adad1e6e444d8b43ab WatchSource:0}: Error finding container c81670ce397e853cc94c5af3d4827cf3229eaaedc9faa5adad1e6e444d8b43ab: Status 404 returned error can't find the container with id c81670ce397e853cc94c5af3d4827cf3229eaaedc9faa5adad1e6e444d8b43ab Apr 23 16:34:57.572555 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:57.572518 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0056f1d2_57d7_40d1_9290_31c514f0d40e.slice/crio-fcd0047feec61987351100df5dc46599aaa41b4319791db08cd6c7f76b88c4bb WatchSource:0}: Error finding container fcd0047feec61987351100df5dc46599aaa41b4319791db08cd6c7f76b88c4bb: Status 404 returned error can't find the container with id fcd0047feec61987351100df5dc46599aaa41b4319791db08cd6c7f76b88c4bb Apr 23 16:34:57.578484 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:57.578455 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56124f3f_030d_47d4_99f9_65b3011d5573.slice/crio-d1b7d2c2bd68165c4b7846c6fe97fdffc3f2deb99c98ef67aae830485b3d3924 WatchSource:0}: Error finding container 
d1b7d2c2bd68165c4b7846c6fe97fdffc3f2deb99c98ef67aae830485b3d3924: Status 404 returned error can't find the container with id d1b7d2c2bd68165c4b7846c6fe97fdffc3f2deb99c98ef67aae830485b3d3924 Apr 23 16:34:57.579024 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:57.578999 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcb4c9ed2_c60a_4c94_9f67_e156f422d6a0.slice/crio-0ae26fc18251747176664d914a3f148e0e77cf9130114505c7f66d9733b2d4c4 WatchSource:0}: Error finding container 0ae26fc18251747176664d914a3f148e0e77cf9130114505c7f66d9733b2d4c4: Status 404 returned error can't find the container with id 0ae26fc18251747176664d914a3f148e0e77cf9130114505c7f66d9733b2d4c4 Apr 23 16:34:57.579834 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:57.579815 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f6f780e_a2ae_473d_ad75_c644275b6cdb.slice/crio-24ca8ebbedf7a2c72a5728b6b79150c7b985dc033ad298120f84fdc173457f4c WatchSource:0}: Error finding container 24ca8ebbedf7a2c72a5728b6b79150c7b985dc033ad298120f84fdc173457f4c: Status 404 returned error can't find the container with id 24ca8ebbedf7a2c72a5728b6b79150c7b985dc033ad298120f84fdc173457f4c Apr 23 16:34:57.601209 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:57.601012 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4c2c1ca_68ee_40d1_8110_1c24a086d157.slice/crio-7b5851bf532f7ee1ab1b2a979b0b94c338417be9b9aa1e45a3e821aaffa4f1b9 WatchSource:0}: Error finding container 7b5851bf532f7ee1ab1b2a979b0b94c338417be9b9aa1e45a3e821aaffa4f1b9: Status 404 returned error can't find the container with id 7b5851bf532f7ee1ab1b2a979b0b94c338417be9b9aa1e45a3e821aaffa4f1b9 Apr 23 16:34:57.602024 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:57.601997 2573 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a0f1a6d_d0aa_4632_8dd9_0adbe8707e89.slice/crio-d71be91b423964895eb5b999d6b33f05c0d0c62e72e76cede619b1b8f7774be2 WatchSource:0}: Error finding container d71be91b423964895eb5b999d6b33f05c0d0c62e72e76cede619b1b8f7774be2: Status 404 returned error can't find the container with id d71be91b423964895eb5b999d6b33f05c0d0c62e72e76cede619b1b8f7774be2 Apr 23 16:34:57.602683 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:57.602470 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b2b1418_dfa9_48eb_9f15_1110e4874cd2.slice/crio-1abe68ad6605aa680b075652a5159b311742ed37ce2536ee18a9a2575cd20092 WatchSource:0}: Error finding container 1abe68ad6605aa680b075652a5159b311742ed37ce2536ee18a9a2575cd20092: Status 404 returned error can't find the container with id 1abe68ad6605aa680b075652a5159b311742ed37ce2536ee18a9a2575cd20092 Apr 23 16:34:57.603294 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:34:57.603266 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd28475ec_752d_4947_9e85_f35681ad68ab.slice/crio-aced3c1415993f850183e2a0d317a64c66ec954484e1019aa6f43e2b3a4d3d9a WatchSource:0}: Error finding container aced3c1415993f850183e2a0d317a64c66ec954484e1019aa6f43e2b3a4d3d9a: Status 404 returned error can't find the container with id aced3c1415993f850183e2a0d317a64c66ec954484e1019aa6f43e2b3a4d3d9a Apr 23 16:34:57.762328 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.762234 2573 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 16:29:55 +0000 UTC" deadline="2028-01-07 02:28:26.288172028 +0000 UTC" Apr 23 16:34:57.762328 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.762265 2573 certificate_manager.go:431] "Waiting for next certificate rotation" 
logger="kubernetes.io/kubelet-serving" sleep="14961h53m28.525909606s" Apr 23 16:34:57.885978 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.885941 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8zwmw" event={"ID":"0f6f780e-a2ae-473d-ad75-c644275b6cdb","Type":"ContainerStarted","Data":"24ca8ebbedf7a2c72a5728b6b79150c7b985dc033ad298120f84fdc173457f4c"} Apr 23 16:34:57.887190 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.887147 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-b8tpj" event={"ID":"56124f3f-030d-47d4-99f9-65b3011d5573","Type":"ContainerStarted","Data":"d1b7d2c2bd68165c4b7846c6fe97fdffc3f2deb99c98ef67aae830485b3d3924"} Apr 23 16:34:57.888397 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.888367 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b7t9t" event={"ID":"0056f1d2-57d7-40d1-9290-31c514f0d40e","Type":"ContainerStarted","Data":"fcd0047feec61987351100df5dc46599aaa41b4319791db08cd6c7f76b88c4bb"} Apr 23 16:34:57.889646 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.889616 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" event={"ID":"52988e90-484a-49cd-98f6-5510a28890d6","Type":"ContainerStarted","Data":"c81670ce397e853cc94c5af3d4827cf3229eaaedc9faa5adad1e6e444d8b43ab"} Apr 23 16:34:57.893184 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.893155 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-14.ec2.internal" event={"ID":"c67cb434c3c0499bd3e70b96d76e7361","Type":"ContainerStarted","Data":"75bdb822d8882a5359e0051af72ea8450437b1dbf2b9085d111f1c4bff223e5b"} Apr 23 16:34:57.894814 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.894789 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" 
event={"ID":"5b2b1418-dfa9-48eb-9f15-1110e4874cd2","Type":"ContainerStarted","Data":"1abe68ad6605aa680b075652a5159b311742ed37ce2536ee18a9a2575cd20092"} Apr 23 16:34:57.896151 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.896127 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xjm86" event={"ID":"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0","Type":"ContainerStarted","Data":"0ae26fc18251747176664d914a3f148e0e77cf9130114505c7f66d9733b2d4c4"} Apr 23 16:34:57.897242 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.897217 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rq49t" event={"ID":"d28475ec-752d-4947-9e85-f35681ad68ab","Type":"ContainerStarted","Data":"aced3c1415993f850183e2a0d317a64c66ec954484e1019aa6f43e2b3a4d3d9a"} Apr 23 16:34:57.898289 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.898265 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tk6kx" event={"ID":"0a0f1a6d-d0aa-4632-8dd9-0adbe8707e89","Type":"ContainerStarted","Data":"d71be91b423964895eb5b999d6b33f05c0d0c62e72e76cede619b1b8f7774be2"} Apr 23 16:34:57.899238 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.899218 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xrqlx" event={"ID":"a4c2c1ca-68ee-40d1-8110-1c24a086d157","Type":"ContainerStarted","Data":"7b5851bf532f7ee1ab1b2a979b0b94c338417be9b9aa1e45a3e821aaffa4f1b9"} Apr 23 16:34:57.907711 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:57.907646 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-137-14.ec2.internal" podStartSLOduration=1.907630307 podStartE2EDuration="1.907630307s" podCreationTimestamp="2026-04-23 16:34:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 
16:34:57.906356744 +0000 UTC m=+3.596143091" watchObservedRunningTime="2026-04-23 16:34:57.907630307 +0000 UTC m=+3.597416654" Apr 23 16:34:58.447413 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:58.447354 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8306d95a-dbae-4dd7-bf93-637a12f98c59-metrics-certs\") pod \"network-metrics-daemon-jpzq7\" (UID: \"8306d95a-dbae-4dd7-bf93-637a12f98c59\") " pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:34:58.447590 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:58.447506 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:58.447590 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:58.447577 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8306d95a-dbae-4dd7-bf93-637a12f98c59-metrics-certs podName:8306d95a-dbae-4dd7-bf93-637a12f98c59 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:00.447558358 +0000 UTC m=+6.137344686 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8306d95a-dbae-4dd7-bf93-637a12f98c59-metrics-certs") pod "network-metrics-daemon-jpzq7" (UID: "8306d95a-dbae-4dd7-bf93-637a12f98c59") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:34:58.547928 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:58.547882 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqgkx\" (UniqueName: \"kubernetes.io/projected/493d9466-44b1-4315-9f1b-a60f6bb428c1-kube-api-access-cqgkx\") pod \"network-check-target-ch99b\" (UID: \"493d9466-44b1-4315-9f1b-a60f6bb428c1\") " pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:34:58.548101 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:58.548083 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:34:58.548161 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:58.548110 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:34:58.548161 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:58.548144 2573 projected.go:194] Error preparing data for projected volume kube-api-access-cqgkx for pod openshift-network-diagnostics/network-check-target-ch99b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:34:58.548253 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:58.548214 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/493d9466-44b1-4315-9f1b-a60f6bb428c1-kube-api-access-cqgkx podName:493d9466-44b1-4315-9f1b-a60f6bb428c1 nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:00.548194315 +0000 UTC m=+6.237980647 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqgkx" (UniqueName: "kubernetes.io/projected/493d9466-44b1-4315-9f1b-a60f6bb428c1-kube-api-access-cqgkx") pod "network-check-target-ch99b" (UID: "493d9466-44b1-4315-9f1b-a60f6bb428c1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:34:58.878830 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:58.878745 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:34:58.879362 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:58.878878 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ch99b" podUID="493d9466-44b1-4315-9f1b-a60f6bb428c1" Apr 23 16:34:58.879362 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:58.878977 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:34:58.879362 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:34:58.879057 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpzq7" podUID="8306d95a-dbae-4dd7-bf93-637a12f98c59" Apr 23 16:34:58.910510 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:58.910466 2573 generic.go:358] "Generic (PLEG): container finished" podID="0085a20aac38133e753ce010973cc630" containerID="6e8ef3d0ba0d2790935a718521462a1850febedd0cc9ebb61851fea89c6ebb59" exitCode=0 Apr 23 16:34:58.911159 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:58.911130 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-14.ec2.internal" event={"ID":"0085a20aac38133e753ce010973cc630","Type":"ContainerDied","Data":"6e8ef3d0ba0d2790935a718521462a1850febedd0cc9ebb61851fea89c6ebb59"} Apr 23 16:34:59.928233 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:34:59.928193 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-14.ec2.internal" event={"ID":"0085a20aac38133e753ce010973cc630","Type":"ContainerStarted","Data":"e427ce0bcbbec85c355bc334dd2d081ca50cb1014d20982b5cb80a85580cbecf"} Apr 23 16:35:00.467079 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:00.467024 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8306d95a-dbae-4dd7-bf93-637a12f98c59-metrics-certs\") pod \"network-metrics-daemon-jpzq7\" (UID: \"8306d95a-dbae-4dd7-bf93-637a12f98c59\") " pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:35:00.467255 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:00.467228 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:00.467337 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:00.467291 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8306d95a-dbae-4dd7-bf93-637a12f98c59-metrics-certs 
podName:8306d95a-dbae-4dd7-bf93-637a12f98c59 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:04.467273448 +0000 UTC m=+10.157059775 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8306d95a-dbae-4dd7-bf93-637a12f98c59-metrics-certs") pod "network-metrics-daemon-jpzq7" (UID: "8306d95a-dbae-4dd7-bf93-637a12f98c59") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:00.567861 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:00.567819 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqgkx\" (UniqueName: \"kubernetes.io/projected/493d9466-44b1-4315-9f1b-a60f6bb428c1-kube-api-access-cqgkx\") pod \"network-check-target-ch99b\" (UID: \"493d9466-44b1-4315-9f1b-a60f6bb428c1\") " pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:35:00.568069 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:00.568011 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:00.568069 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:00.568033 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:00.568069 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:00.568047 2573 projected.go:194] Error preparing data for projected volume kube-api-access-cqgkx for pod openshift-network-diagnostics/network-check-target-ch99b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:00.568264 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:00.568108 2573 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/493d9466-44b1-4315-9f1b-a60f6bb428c1-kube-api-access-cqgkx podName:493d9466-44b1-4315-9f1b-a60f6bb428c1 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:04.568088249 +0000 UTC m=+10.257874580 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqgkx" (UniqueName: "kubernetes.io/projected/493d9466-44b1-4315-9f1b-a60f6bb428c1-kube-api-access-cqgkx") pod "network-check-target-ch99b" (UID: "493d9466-44b1-4315-9f1b-a60f6bb428c1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:00.876028 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:00.875935 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:35:00.876197 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:00.876077 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpzq7" podUID="8306d95a-dbae-4dd7-bf93-637a12f98c59" Apr 23 16:35:00.876452 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:00.876433 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:35:00.876560 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:00.876539 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ch99b" podUID="493d9466-44b1-4315-9f1b-a60f6bb428c1" Apr 23 16:35:02.876386 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:02.876345 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:35:02.876867 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:02.876408 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:35:02.876867 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:02.876553 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpzq7" podUID="8306d95a-dbae-4dd7-bf93-637a12f98c59" Apr 23 16:35:02.877078 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:02.877054 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ch99b" podUID="493d9466-44b1-4315-9f1b-a60f6bb428c1" Apr 23 16:35:04.504632 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:04.504512 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8306d95a-dbae-4dd7-bf93-637a12f98c59-metrics-certs\") pod \"network-metrics-daemon-jpzq7\" (UID: \"8306d95a-dbae-4dd7-bf93-637a12f98c59\") " pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:35:04.505145 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:04.504656 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:04.505145 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:04.504741 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8306d95a-dbae-4dd7-bf93-637a12f98c59-metrics-certs podName:8306d95a-dbae-4dd7-bf93-637a12f98c59 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:12.504721712 +0000 UTC m=+18.194508040 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8306d95a-dbae-4dd7-bf93-637a12f98c59-metrics-certs") pod "network-metrics-daemon-jpzq7" (UID: "8306d95a-dbae-4dd7-bf93-637a12f98c59") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:04.605194 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:04.605091 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqgkx\" (UniqueName: \"kubernetes.io/projected/493d9466-44b1-4315-9f1b-a60f6bb428c1-kube-api-access-cqgkx\") pod \"network-check-target-ch99b\" (UID: \"493d9466-44b1-4315-9f1b-a60f6bb428c1\") " pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:35:04.605404 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:04.605284 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:04.605404 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:04.605303 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:04.605404 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:04.605318 2573 projected.go:194] Error preparing data for projected volume kube-api-access-cqgkx for pod openshift-network-diagnostics/network-check-target-ch99b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:04.605404 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:04.605386 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/493d9466-44b1-4315-9f1b-a60f6bb428c1-kube-api-access-cqgkx podName:493d9466-44b1-4315-9f1b-a60f6bb428c1 nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:12.605367774 +0000 UTC m=+18.295154112 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqgkx" (UniqueName: "kubernetes.io/projected/493d9466-44b1-4315-9f1b-a60f6bb428c1-kube-api-access-cqgkx") pod "network-check-target-ch99b" (UID: "493d9466-44b1-4315-9f1b-a60f6bb428c1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:04.877768 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:04.877480 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:35:04.877768 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:04.877611 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpzq7" podUID="8306d95a-dbae-4dd7-bf93-637a12f98c59" Apr 23 16:35:04.877768 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:04.877619 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:35:04.877768 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:04.877720 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ch99b" podUID="493d9466-44b1-4315-9f1b-a60f6bb428c1" Apr 23 16:35:06.875544 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:06.875508 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:35:06.876050 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:06.875508 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:35:06.876050 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:06.875655 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpzq7" podUID="8306d95a-dbae-4dd7-bf93-637a12f98c59" Apr 23 16:35:06.876050 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:06.875737 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ch99b" podUID="493d9466-44b1-4315-9f1b-a60f6bb428c1" Apr 23 16:35:08.875952 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:08.875912 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:35:08.876408 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:08.875967 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:35:08.876408 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:08.876068 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpzq7" podUID="8306d95a-dbae-4dd7-bf93-637a12f98c59" Apr 23 16:35:08.876408 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:08.876157 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ch99b" podUID="493d9466-44b1-4315-9f1b-a60f6bb428c1" Apr 23 16:35:10.875914 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:10.875868 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:35:10.876354 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:10.875995 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ch99b" podUID="493d9466-44b1-4315-9f1b-a60f6bb428c1" Apr 23 16:35:10.876354 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:10.876061 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:35:10.876354 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:10.876191 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpzq7" podUID="8306d95a-dbae-4dd7-bf93-637a12f98c59" Apr 23 16:35:12.560810 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:12.560771 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8306d95a-dbae-4dd7-bf93-637a12f98c59-metrics-certs\") pod \"network-metrics-daemon-jpzq7\" (UID: \"8306d95a-dbae-4dd7-bf93-637a12f98c59\") " pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:35:12.561277 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:12.560960 2573 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:12.561277 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:12.561069 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8306d95a-dbae-4dd7-bf93-637a12f98c59-metrics-certs podName:8306d95a-dbae-4dd7-bf93-637a12f98c59 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:28.56102358 +0000 UTC m=+34.250809906 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8306d95a-dbae-4dd7-bf93-637a12f98c59-metrics-certs") pod "network-metrics-daemon-jpzq7" (UID: "8306d95a-dbae-4dd7-bf93-637a12f98c59") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 16:35:12.661963 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:12.661912 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqgkx\" (UniqueName: \"kubernetes.io/projected/493d9466-44b1-4315-9f1b-a60f6bb428c1-kube-api-access-cqgkx\") pod \"network-check-target-ch99b\" (UID: \"493d9466-44b1-4315-9f1b-a60f6bb428c1\") " pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:35:12.662158 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:12.662088 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 16:35:12.662158 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:12.662113 2573 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 16:35:12.662158 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:12.662126 2573 projected.go:194] Error preparing data for projected volume kube-api-access-cqgkx for pod openshift-network-diagnostics/network-check-target-ch99b: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:12.662315 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:12.662196 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/493d9466-44b1-4315-9f1b-a60f6bb428c1-kube-api-access-cqgkx podName:493d9466-44b1-4315-9f1b-a60f6bb428c1 nodeName:}" failed. 
No retries permitted until 2026-04-23 16:35:28.662175505 +0000 UTC m=+34.351961851 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqgkx" (UniqueName: "kubernetes.io/projected/493d9466-44b1-4315-9f1b-a60f6bb428c1-kube-api-access-cqgkx") pod "network-check-target-ch99b" (UID: "493d9466-44b1-4315-9f1b-a60f6bb428c1") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 16:35:12.876207 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:12.876115 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:35:12.876366 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:12.876115 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:35:12.876366 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:12.876238 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpzq7" podUID="8306d95a-dbae-4dd7-bf93-637a12f98c59" Apr 23 16:35:12.876366 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:12.876287 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ch99b" podUID="493d9466-44b1-4315-9f1b-a60f6bb428c1" Apr 23 16:35:14.877618 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:14.877589 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:35:14.878449 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:14.877750 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpzq7" podUID="8306d95a-dbae-4dd7-bf93-637a12f98c59" Apr 23 16:35:14.878449 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:14.877816 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:35:14.878449 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:14.877897 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-ch99b" podUID="493d9466-44b1-4315-9f1b-a60f6bb428c1" Apr 23 16:35:14.955508 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:14.955479 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-b7t9t" event={"ID":"0056f1d2-57d7-40d1-9290-31c514f0d40e","Type":"ContainerStarted","Data":"fbb2745c32eafbdc1e025042915466bbb3f23ae3b32eaa9f311ac6096e15ca11"} Apr 23 16:35:14.957114 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:14.957073 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" event={"ID":"52988e90-484a-49cd-98f6-5510a28890d6","Type":"ContainerStarted","Data":"3499ae957b2c3f9543597a76442750693859c6cbd3bb505b5f4426f0c762bd0b"} Apr 23 16:35:14.958189 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:14.958101 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-xjm86" event={"ID":"cb4c9ed2-c60a-4c94-9f67-e156f422d6a0","Type":"ContainerStarted","Data":"bb9c95930e83d5aa21ea1ced0fa688fed6d89fadbe4f47d6c8942871a5af6b2d"} Apr 23 16:35:14.960268 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:14.960232 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rq49t" event={"ID":"d28475ec-752d-4947-9e85-f35681ad68ab","Type":"ContainerStarted","Data":"026b5e33c8de8501f188d258465d47e3befdce1625373f75db9a5a250b342817"} Apr 23 16:35:14.961785 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:14.961752 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tk6kx" event={"ID":"0a0f1a6d-d0aa-4632-8dd9-0adbe8707e89","Type":"ContainerStarted","Data":"331bc3864412e12d26e158ad33157c1a6985a4e7314940305a7127920f7a3ee5"} Apr 23 16:35:14.963203 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:14.963135 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8zwmw" 
event={"ID":"0f6f780e-a2ae-473d-ad75-c644275b6cdb","Type":"ContainerStarted","Data":"4eb7adf2e9eecea26dbaa1cb4522db6054a5a9d7b7ac68eb671c059d704680a3"} Apr 23 16:35:14.992286 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:14.992230 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-137-14.ec2.internal" podStartSLOduration=18.992210174 podStartE2EDuration="18.992210174s" podCreationTimestamp="2026-04-23 16:34:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:34:59.944160785 +0000 UTC m=+5.633947132" watchObservedRunningTime="2026-04-23 16:35:14.992210174 +0000 UTC m=+20.681996521" Apr 23 16:35:14.992481 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:14.992448 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-b7t9t" podStartSLOduration=3.91978389 podStartE2EDuration="20.992439277s" podCreationTimestamp="2026-04-23 16:34:54 +0000 UTC" firstStartedPulling="2026-04-23 16:34:57.575794584 +0000 UTC m=+3.265580914" lastFinishedPulling="2026-04-23 16:35:14.648449961 +0000 UTC m=+20.338236301" observedRunningTime="2026-04-23 16:35:14.990974398 +0000 UTC m=+20.680760767" watchObservedRunningTime="2026-04-23 16:35:14.992439277 +0000 UTC m=+20.682225624" Apr 23 16:35:15.091861 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:15.091648 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-rq49t" podStartSLOduration=4.069414668 podStartE2EDuration="21.09163276s" podCreationTimestamp="2026-04-23 16:34:54 +0000 UTC" firstStartedPulling="2026-04-23 16:34:57.606430864 +0000 UTC m=+3.296217192" lastFinishedPulling="2026-04-23 16:35:14.628648947 +0000 UTC m=+20.318435284" observedRunningTime="2026-04-23 16:35:15.048076073 +0000 UTC m=+20.737862419" watchObservedRunningTime="2026-04-23 
16:35:15.09163276 +0000 UTC m=+20.781419083" Apr 23 16:35:15.970153 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:15.970069 2573 generic.go:358] "Generic (PLEG): container finished" podID="0f6f780e-a2ae-473d-ad75-c644275b6cdb" containerID="4eb7adf2e9eecea26dbaa1cb4522db6054a5a9d7b7ac68eb671c059d704680a3" exitCode=0 Apr 23 16:35:15.970153 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:15.970144 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8zwmw" event={"ID":"0f6f780e-a2ae-473d-ad75-c644275b6cdb","Type":"ContainerDied","Data":"4eb7adf2e9eecea26dbaa1cb4522db6054a5a9d7b7ac68eb671c059d704680a3"} Apr 23 16:35:15.971656 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:15.971527 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-b8tpj" event={"ID":"56124f3f-030d-47d4-99f9-65b3011d5573","Type":"ContainerStarted","Data":"3162f49d16279883ee98e3a68b85c36b7d1ecda3c426bee0592220f6d7736874"} Apr 23 16:35:15.973823 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:15.973801 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wd2cz_52988e90-484a-49cd-98f6-5510a28890d6/ovn-acl-logging/0.log" Apr 23 16:35:15.974095 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:15.974074 2573 generic.go:358] "Generic (PLEG): container finished" podID="52988e90-484a-49cd-98f6-5510a28890d6" containerID="6768081a1cfd57a6e36590720e2f03e335e93dcf4cea120e12e204346b5dd415" exitCode=1 Apr 23 16:35:15.974172 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:15.974100 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" event={"ID":"52988e90-484a-49cd-98f6-5510a28890d6","Type":"ContainerStarted","Data":"007c9e1b527f789380db66ee44b043b6411900f166df38e968a850e66b83591e"} Apr 23 16:35:15.974172 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:15.974133 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" event={"ID":"52988e90-484a-49cd-98f6-5510a28890d6","Type":"ContainerStarted","Data":"f1e6964be7c5eb2b536e335807356ff89ef0595e37ddf8b3780e34217e1348e8"} Apr 23 16:35:15.974172 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:15.974149 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" event={"ID":"52988e90-484a-49cd-98f6-5510a28890d6","Type":"ContainerStarted","Data":"8c78a2c726db74dfba647c69fdac9744861ebf3d7b51df1cf9fc3f85e6858d26"} Apr 23 16:35:15.974172 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:15.974161 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" event={"ID":"52988e90-484a-49cd-98f6-5510a28890d6","Type":"ContainerStarted","Data":"c46e1558266b40391663d0ff69f5579de375829e0ea427c679d55ce5702a6561"} Apr 23 16:35:15.974329 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:15.974174 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" event={"ID":"52988e90-484a-49cd-98f6-5510a28890d6","Type":"ContainerDied","Data":"6768081a1cfd57a6e36590720e2f03e335e93dcf4cea120e12e204346b5dd415"} Apr 23 16:35:15.975388 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:15.975366 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" event={"ID":"5b2b1418-dfa9-48eb-9f15-1110e4874cd2","Type":"ContainerStarted","Data":"a45e3879ae2a8de44278db06e84dda99c8dd9a20671d77632e945787f19653dd"} Apr 23 16:35:16.006126 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:16.006067 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-xjm86" podStartSLOduration=4.04009361 podStartE2EDuration="21.006052796s" podCreationTimestamp="2026-04-23 16:34:55 +0000 UTC" firstStartedPulling="2026-04-23 16:34:57.599864673 +0000 UTC m=+3.289651000" 
lastFinishedPulling="2026-04-23 16:35:14.565823859 +0000 UTC m=+20.255610186" observedRunningTime="2026-04-23 16:35:15.092677061 +0000 UTC m=+20.782463406" watchObservedRunningTime="2026-04-23 16:35:16.006052796 +0000 UTC m=+21.695839237" Apr 23 16:35:16.041498 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:16.041441 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-b8tpj" podStartSLOduration=5.011953597 podStartE2EDuration="22.04142538s" podCreationTimestamp="2026-04-23 16:34:54 +0000 UTC" firstStartedPulling="2026-04-23 16:34:57.599720498 +0000 UTC m=+3.289506833" lastFinishedPulling="2026-04-23 16:35:14.629192287 +0000 UTC m=+20.318978616" observedRunningTime="2026-04-23 16:35:16.024804229 +0000 UTC m=+21.714590574" watchObservedRunningTime="2026-04-23 16:35:16.04142538 +0000 UTC m=+21.731211740" Apr 23 16:35:16.059024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:16.058995 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-rq49t" Apr 23 16:35:16.456010 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:16.455788 2573 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 16:35:16.484645 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:16.484612 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-rq49t" Apr 23 16:35:16.485319 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:16.485298 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-rq49t" Apr 23 16:35:16.502348 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:16.502293 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-tk6kx" podStartSLOduration=4.47933895 podStartE2EDuration="21.502274644s" 
podCreationTimestamp="2026-04-23 16:34:55 +0000 UTC" firstStartedPulling="2026-04-23 16:34:57.606502053 +0000 UTC m=+3.296288383" lastFinishedPulling="2026-04-23 16:35:14.629437739 +0000 UTC m=+20.319224077" observedRunningTime="2026-04-23 16:35:16.040997266 +0000 UTC m=+21.730783612" watchObservedRunningTime="2026-04-23 16:35:16.502274644 +0000 UTC m=+22.192060991" Apr 23 16:35:16.789333 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:16.789175 2573 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T16:35:16.45600053Z","UUID":"84ab38fe-5ade-408c-8b75-40437ecfebba","Handler":null,"Name":"","Endpoint":""} Apr 23 16:35:16.791078 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:16.791051 2573 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 16:35:16.791078 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:16.791083 2573 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 16:35:16.876368 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:16.876334 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:35:16.876520 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:16.876334 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:35:16.876520 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:16.876458 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpzq7" podUID="8306d95a-dbae-4dd7-bf93-637a12f98c59" Apr 23 16:35:16.876586 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:16.876521 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ch99b" podUID="493d9466-44b1-4315-9f1b-a60f6bb428c1" Apr 23 16:35:16.979948 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:16.979902 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" event={"ID":"5b2b1418-dfa9-48eb-9f15-1110e4874cd2","Type":"ContainerStarted","Data":"d0c71f12d57a4575bf2a448fb327e36de798df6747b43369a0e1e55856942e89"} Apr 23 16:35:16.981545 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:16.981517 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-xrqlx" event={"ID":"a4c2c1ca-68ee-40d1-8110-1c24a086d157","Type":"ContainerStarted","Data":"7b561d4811a9948cad9cc96321eb69f5a8bf8ce927fd4e2462e625a39ba12137"} Apr 23 16:35:16.982416 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:16.982385 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-rq49t" Apr 23 16:35:16.996177 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:16.996132 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-xrqlx" podStartSLOduration=5.972495516 podStartE2EDuration="22.996115358s" podCreationTimestamp="2026-04-23 16:34:54 +0000 UTC" firstStartedPulling="2026-04-23 16:34:57.606594945 +0000 UTC m=+3.296381270" lastFinishedPulling="2026-04-23 16:35:14.630214771 +0000 UTC 
m=+20.320001112" observedRunningTime="2026-04-23 16:35:16.995860416 +0000 UTC m=+22.685646763" watchObservedRunningTime="2026-04-23 16:35:16.996115358 +0000 UTC m=+22.685901705" Apr 23 16:35:17.986274 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:17.986244 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wd2cz_52988e90-484a-49cd-98f6-5510a28890d6/ovn-acl-logging/0.log" Apr 23 16:35:17.986801 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:17.986618 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" event={"ID":"52988e90-484a-49cd-98f6-5510a28890d6","Type":"ContainerStarted","Data":"e61c1d5c9adda3c60023dbc3c2e7afef1e5c308996efd334a872a7c761d75b66"} Apr 23 16:35:17.988590 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:17.988565 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" event={"ID":"5b2b1418-dfa9-48eb-9f15-1110e4874cd2","Type":"ContainerStarted","Data":"871c8abfd6562ef1c5209cbd5ced9120fcecf095651ccb6ce4679130a66fc14a"} Apr 23 16:35:18.875978 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:18.875945 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:35:18.876161 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:18.875985 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:35:18.876161 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:18.876083 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpzq7" podUID="8306d95a-dbae-4dd7-bf93-637a12f98c59" Apr 23 16:35:18.876280 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:18.876264 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ch99b" podUID="493d9466-44b1-4315-9f1b-a60f6bb428c1" Apr 23 16:35:20.875493 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:20.875295 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:35:20.876333 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:20.875295 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:35:20.876333 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:20.875564 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ch99b" podUID="493d9466-44b1-4315-9f1b-a60f6bb428c1" Apr 23 16:35:20.876333 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:20.875635 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpzq7" podUID="8306d95a-dbae-4dd7-bf93-637a12f98c59" Apr 23 16:35:20.995158 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:20.995125 2573 generic.go:358] "Generic (PLEG): container finished" podID="0f6f780e-a2ae-473d-ad75-c644275b6cdb" containerID="3189cd0b07fbc345dd9ab116a3b3aaebd15b6956792e53e3e4ed9f1f1cc807d2" exitCode=0 Apr 23 16:35:20.995313 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:20.995200 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8zwmw" event={"ID":"0f6f780e-a2ae-473d-ad75-c644275b6cdb","Type":"ContainerDied","Data":"3189cd0b07fbc345dd9ab116a3b3aaebd15b6956792e53e3e4ed9f1f1cc807d2"} Apr 23 16:35:20.998315 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:20.998295 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wd2cz_52988e90-484a-49cd-98f6-5510a28890d6/ovn-acl-logging/0.log" Apr 23 16:35:20.998678 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:20.998650 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" event={"ID":"52988e90-484a-49cd-98f6-5510a28890d6","Type":"ContainerStarted","Data":"fa784a54f8ed8153cd50d5bb18eb67f232d8b057c28584db9e169df2824acf13"} Apr 23 16:35:20.998988 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:20.998971 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:35:20.999052 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:20.998994 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:35:20.999123 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:20.999109 2573 scope.go:117] "RemoveContainer" containerID="6768081a1cfd57a6e36590720e2f03e335e93dcf4cea120e12e204346b5dd415" Apr 23 16:35:21.014625 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:21.014603 
2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:35:21.023905 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:21.023863 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-8g4wr" podStartSLOduration=7.014528721 podStartE2EDuration="27.023848772s" podCreationTimestamp="2026-04-23 16:34:54 +0000 UTC" firstStartedPulling="2026-04-23 16:34:57.606621851 +0000 UTC m=+3.296408175" lastFinishedPulling="2026-04-23 16:35:17.615941882 +0000 UTC m=+23.305728226" observedRunningTime="2026-04-23 16:35:18.007150783 +0000 UTC m=+23.696937153" watchObservedRunningTime="2026-04-23 16:35:21.023848772 +0000 UTC m=+26.713635118" Apr 23 16:35:22.002032 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:22.001989 2573 generic.go:358] "Generic (PLEG): container finished" podID="0f6f780e-a2ae-473d-ad75-c644275b6cdb" containerID="c89bcd273588c71e64283f74291e994fd05155036f9dc4f40ecc63ec5a95db02" exitCode=0 Apr 23 16:35:22.002609 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:22.002047 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8zwmw" event={"ID":"0f6f780e-a2ae-473d-ad75-c644275b6cdb","Type":"ContainerDied","Data":"c89bcd273588c71e64283f74291e994fd05155036f9dc4f40ecc63ec5a95db02"} Apr 23 16:35:22.005578 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:22.005554 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wd2cz_52988e90-484a-49cd-98f6-5510a28890d6/ovn-acl-logging/0.log" Apr 23 16:35:22.005928 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:22.005907 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" event={"ID":"52988e90-484a-49cd-98f6-5510a28890d6","Type":"ContainerStarted","Data":"3137f7c0564f379678b42690331cc6415f6eb2f93b2aed8aa3a2462a83c242f0"} Apr 23 
16:35:22.006278 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:22.006257 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:35:22.020355 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:22.020328 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:35:22.049942 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:22.049889 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" podStartSLOduration=10.947696622 podStartE2EDuration="28.04987321s" podCreationTimestamp="2026-04-23 16:34:54 +0000 UTC" firstStartedPulling="2026-04-23 16:34:57.571319806 +0000 UTC m=+3.261106139" lastFinishedPulling="2026-04-23 16:35:14.673496401 +0000 UTC m=+20.363282727" observedRunningTime="2026-04-23 16:35:22.049358948 +0000 UTC m=+27.739145335" watchObservedRunningTime="2026-04-23 16:35:22.04987321 +0000 UTC m=+27.739659556" Apr 23 16:35:22.875328 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:22.875290 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:35:22.875521 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:22.875301 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:35:22.875521 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:22.875414 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpzq7" podUID="8306d95a-dbae-4dd7-bf93-637a12f98c59" Apr 23 16:35:22.875521 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:22.875458 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ch99b" podUID="493d9466-44b1-4315-9f1b-a60f6bb428c1" Apr 23 16:35:23.009590 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:23.009501 2573 generic.go:358] "Generic (PLEG): container finished" podID="0f6f780e-a2ae-473d-ad75-c644275b6cdb" containerID="51a46cd0301e2429c0c936376405ea970e7b5079afdf51608f649e2f43538007" exitCode=0 Apr 23 16:35:23.009590 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:23.009552 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8zwmw" event={"ID":"0f6f780e-a2ae-473d-ad75-c644275b6cdb","Type":"ContainerDied","Data":"51a46cd0301e2429c0c936376405ea970e7b5079afdf51608f649e2f43538007"} Apr 23 16:35:23.919759 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:23.919497 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ch99b"] Apr 23 16:35:23.919921 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:23.919855 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:35:23.919981 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:23.919963 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ch99b" podUID="493d9466-44b1-4315-9f1b-a60f6bb428c1" Apr 23 16:35:23.921210 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:23.920975 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jpzq7"] Apr 23 16:35:23.921210 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:23.921083 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:35:23.921210 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:23.921186 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-jpzq7" podUID="8306d95a-dbae-4dd7-bf93-637a12f98c59" Apr 23 16:35:25.875390 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:25.875350 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:35:25.876028 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:25.875349 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:35:25.876028 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:25.875511 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-jpzq7" podUID="8306d95a-dbae-4dd7-bf93-637a12f98c59" Apr 23 16:35:25.876028 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:25.875575 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-ch99b" podUID="493d9466-44b1-4315-9f1b-a60f6bb428c1" Apr 23 16:35:27.637983 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.637902 2573 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-137-14.ec2.internal" event="NodeReady" Apr 23 16:35:27.638480 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.638059 2573 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 16:35:27.688748 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.688708 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6d44bcc48b-45nq9"] Apr 23 16:35:27.693005 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.692977 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.695837 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.695806 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-q6vbq\"" Apr 23 16:35:27.696024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.695947 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 16:35:27.696086 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.696028 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 16:35:27.696192 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.696123 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 16:35:27.700976 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.700813 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 16:35:27.709107 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.709077 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d44bcc48b-45nq9"] Apr 23 16:35:27.717940 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.717913 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-g4m2z"] Apr 23 16:35:27.721368 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.721296 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-c9zd9"] Apr 23 16:35:27.721576 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.721548 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-g4m2z" Apr 23 16:35:27.723435 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.723412 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 16:35:27.723603 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.723587 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 16:35:27.724558 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.724539 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-nghfn\"" Apr 23 16:35:27.724645 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.724560 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 16:35:27.725021 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.725001 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-c9zd9" Apr 23 16:35:27.726580 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.726551 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 16:35:27.726681 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.726587 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-tz9n7\"" Apr 23 16:35:27.726872 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.726857 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 16:35:27.730788 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.730768 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-g4m2z"] Apr 23 16:35:27.734301 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.734281 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-c9zd9"] Apr 23 16:35:27.755587 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.755557 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-8nvz6"] Apr 23 16:35:27.758918 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.758881 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8nvz6" Apr 23 16:35:27.761209 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.761115 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 16:35:27.761209 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.761190 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 16:35:27.761408 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.761190 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 16:35:27.761408 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.761197 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-nzf9q\"" Apr 23 16:35:27.762628 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.762600 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 16:35:27.770625 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.770600 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8nvz6"] Apr 23 16:35:27.773067 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.773042 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e39f91f1-ba00-471a-9d79-c0574e83f873-image-registry-private-configuration\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.773202 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.773086 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e39f91f1-ba00-471a-9d79-c0574e83f873-registry-tls\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.773202 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.773106 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e39f91f1-ba00-471a-9d79-c0574e83f873-bound-sa-token\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.773202 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.773165 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e39f91f1-ba00-471a-9d79-c0574e83f873-registry-certificates\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.773202 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.773201 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e39f91f1-ba00-471a-9d79-c0574e83f873-ca-trust-extracted\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.773409 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.773224 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e39f91f1-ba00-471a-9d79-c0574e83f873-trusted-ca\") pod 
\"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.773409 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.773247 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmsrb\" (UniqueName: \"kubernetes.io/projected/e39f91f1-ba00-471a-9d79-c0574e83f873-kube-api-access-cmsrb\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.773409 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.773283 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e39f91f1-ba00-471a-9d79-c0574e83f873-installation-pull-secrets\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.874668 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.874621 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e39f91f1-ba00-471a-9d79-c0574e83f873-image-registry-private-configuration\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.874875 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.874678 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e39f91f1-ba00-471a-9d79-c0574e83f873-registry-tls\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.874951 
ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.874898 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e39f91f1-ba00-471a-9d79-c0574e83f873-bound-sa-token\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.875011 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.874950 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a887758a-12b6-4011-819b-3e2204768863-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8nvz6\" (UID: \"a887758a-12b6-4011-819b-3e2204768863\") " pod="openshift-insights/insights-runtime-extractor-8nvz6" Apr 23 16:35:27.875011 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.874985 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwfvv\" (UniqueName: \"kubernetes.io/projected/a0c13d03-c009-47c2-b8ff-f968c81cb35c-kube-api-access-hwfvv\") pod \"ingress-canary-g4m2z\" (UID: \"a0c13d03-c009-47c2-b8ff-f968c81cb35c\") " pod="openshift-ingress-canary/ingress-canary-g4m2z" Apr 23 16:35:27.875111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.875020 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a887758a-12b6-4011-819b-3e2204768863-data-volume\") pod \"insights-runtime-extractor-8nvz6\" (UID: \"a887758a-12b6-4011-819b-3e2204768863\") " pod="openshift-insights/insights-runtime-extractor-8nvz6" Apr 23 16:35:27.875111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.875093 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rrmt\" (UniqueName: 
\"kubernetes.io/projected/1c58deba-82d5-4b30-a362-237c21e8311b-kube-api-access-2rrmt\") pod \"dns-default-c9zd9\" (UID: \"1c58deba-82d5-4b30-a362-237c21e8311b\") " pod="openshift-dns/dns-default-c9zd9" Apr 23 16:35:27.875207 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.875153 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e39f91f1-ba00-471a-9d79-c0574e83f873-registry-certificates\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.875207 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.875185 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c58deba-82d5-4b30-a362-237c21e8311b-config-volume\") pod \"dns-default-c9zd9\" (UID: \"1c58deba-82d5-4b30-a362-237c21e8311b\") " pod="openshift-dns/dns-default-c9zd9" Apr 23 16:35:27.875275 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.875208 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a887758a-12b6-4011-819b-3e2204768863-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8nvz6\" (UID: \"a887758a-12b6-4011-819b-3e2204768863\") " pod="openshift-insights/insights-runtime-extractor-8nvz6" Apr 23 16:35:27.875275 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.875230 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0c13d03-c009-47c2-b8ff-f968c81cb35c-cert\") pod \"ingress-canary-g4m2z\" (UID: \"a0c13d03-c009-47c2-b8ff-f968c81cb35c\") " pod="openshift-ingress-canary/ingress-canary-g4m2z" Apr 23 16:35:27.875275 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.875263 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/e39f91f1-ba00-471a-9d79-c0574e83f873-ca-trust-extracted\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.875388 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.875285 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e39f91f1-ba00-471a-9d79-c0574e83f873-trusted-ca\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.875388 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.875308 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmsrb\" (UniqueName: \"kubernetes.io/projected/e39f91f1-ba00-471a-9d79-c0574e83f873-kube-api-access-cmsrb\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.875388 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.875353 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e39f91f1-ba00-471a-9d79-c0574e83f873-installation-pull-secrets\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.875537 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.875393 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a887758a-12b6-4011-819b-3e2204768863-crio-socket\") pod \"insights-runtime-extractor-8nvz6\" (UID: 
\"a887758a-12b6-4011-819b-3e2204768863\") " pod="openshift-insights/insights-runtime-extractor-8nvz6" Apr 23 16:35:27.875537 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.875427 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k5bj6\" (UniqueName: \"kubernetes.io/projected/a887758a-12b6-4011-819b-3e2204768863-kube-api-access-k5bj6\") pod \"insights-runtime-extractor-8nvz6\" (UID: \"a887758a-12b6-4011-819b-3e2204768863\") " pod="openshift-insights/insights-runtime-extractor-8nvz6" Apr 23 16:35:27.875537 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.875469 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c58deba-82d5-4b30-a362-237c21e8311b-metrics-tls\") pod \"dns-default-c9zd9\" (UID: \"1c58deba-82d5-4b30-a362-237c21e8311b\") " pod="openshift-dns/dns-default-c9zd9" Apr 23 16:35:27.875537 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.875493 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1c58deba-82d5-4b30-a362-237c21e8311b-tmp-dir\") pod \"dns-default-c9zd9\" (UID: \"1c58deba-82d5-4b30-a362-237c21e8311b\") " pod="openshift-dns/dns-default-c9zd9" Apr 23 16:35:27.876105 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.876080 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/e39f91f1-ba00-471a-9d79-c0574e83f873-registry-certificates\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.876224 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.876139 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/e39f91f1-ba00-471a-9d79-c0574e83f873-ca-trust-extracted\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.876287 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.876250 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:35:27.876614 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.876251 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:35:27.876614 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.876393 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/e39f91f1-ba00-471a-9d79-c0574e83f873-trusted-ca\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.879377 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.878669 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 16:35:27.879377 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.878925 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 16:35:27.879377 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.879011 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 16:35:27.879377 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.879165 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-vvm92\"" Apr 23 16:35:27.879377 ip-10-0-137-14 
kubenswrapper[2573]: I0423 16:35:27.879200 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-4csk9\"" Apr 23 16:35:27.880133 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.880101 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/e39f91f1-ba00-471a-9d79-c0574e83f873-image-registry-private-configuration\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.880249 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.880101 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/e39f91f1-ba00-471a-9d79-c0574e83f873-installation-pull-secrets\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.880249 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.880169 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/e39f91f1-ba00-471a-9d79-c0574e83f873-registry-tls\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.883127 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.883096 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/e39f91f1-ba00-471a-9d79-c0574e83f873-bound-sa-token\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.885306 ip-10-0-137-14 kubenswrapper[2573]: I0423 
16:35:27.885283 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmsrb\" (UniqueName: \"kubernetes.io/projected/e39f91f1-ba00-471a-9d79-c0574e83f873-kube-api-access-cmsrb\") pod \"image-registry-6d44bcc48b-45nq9\" (UID: \"e39f91f1-ba00-471a-9d79-c0574e83f873\") " pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:27.976317 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.976278 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a887758a-12b6-4011-819b-3e2204768863-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8nvz6\" (UID: \"a887758a-12b6-4011-819b-3e2204768863\") " pod="openshift-insights/insights-runtime-extractor-8nvz6" Apr 23 16:35:27.976508 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.976328 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwfvv\" (UniqueName: \"kubernetes.io/projected/a0c13d03-c009-47c2-b8ff-f968c81cb35c-kube-api-access-hwfvv\") pod \"ingress-canary-g4m2z\" (UID: \"a0c13d03-c009-47c2-b8ff-f968c81cb35c\") " pod="openshift-ingress-canary/ingress-canary-g4m2z" Apr 23 16:35:27.976508 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.976441 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a887758a-12b6-4011-819b-3e2204768863-data-volume\") pod \"insights-runtime-extractor-8nvz6\" (UID: \"a887758a-12b6-4011-819b-3e2204768863\") " pod="openshift-insights/insights-runtime-extractor-8nvz6" Apr 23 16:35:27.976508 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.976476 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rrmt\" (UniqueName: \"kubernetes.io/projected/1c58deba-82d5-4b30-a362-237c21e8311b-kube-api-access-2rrmt\") pod \"dns-default-c9zd9\" (UID: 
\"1c58deba-82d5-4b30-a362-237c21e8311b\") " pod="openshift-dns/dns-default-c9zd9" Apr 23 16:35:27.976764 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.976530 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c58deba-82d5-4b30-a362-237c21e8311b-config-volume\") pod \"dns-default-c9zd9\" (UID: \"1c58deba-82d5-4b30-a362-237c21e8311b\") " pod="openshift-dns/dns-default-c9zd9" Apr 23 16:35:27.976764 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.976558 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a887758a-12b6-4011-819b-3e2204768863-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8nvz6\" (UID: \"a887758a-12b6-4011-819b-3e2204768863\") " pod="openshift-insights/insights-runtime-extractor-8nvz6" Apr 23 16:35:27.976764 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.976580 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0c13d03-c009-47c2-b8ff-f968c81cb35c-cert\") pod \"ingress-canary-g4m2z\" (UID: \"a0c13d03-c009-47c2-b8ff-f968c81cb35c\") " pod="openshift-ingress-canary/ingress-canary-g4m2z" Apr 23 16:35:27.976764 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.976632 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a887758a-12b6-4011-819b-3e2204768863-crio-socket\") pod \"insights-runtime-extractor-8nvz6\" (UID: \"a887758a-12b6-4011-819b-3e2204768863\") " pod="openshift-insights/insights-runtime-extractor-8nvz6" Apr 23 16:35:27.976764 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.976655 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k5bj6\" (UniqueName: \"kubernetes.io/projected/a887758a-12b6-4011-819b-3e2204768863-kube-api-access-k5bj6\") pod 
\"insights-runtime-extractor-8nvz6\" (UID: \"a887758a-12b6-4011-819b-3e2204768863\") " pod="openshift-insights/insights-runtime-extractor-8nvz6" Apr 23 16:35:27.976764 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.976729 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c58deba-82d5-4b30-a362-237c21e8311b-metrics-tls\") pod \"dns-default-c9zd9\" (UID: \"1c58deba-82d5-4b30-a362-237c21e8311b\") " pod="openshift-dns/dns-default-c9zd9" Apr 23 16:35:27.976764 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.976753 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1c58deba-82d5-4b30-a362-237c21e8311b-tmp-dir\") pod \"dns-default-c9zd9\" (UID: \"1c58deba-82d5-4b30-a362-237c21e8311b\") " pod="openshift-dns/dns-default-c9zd9" Apr 23 16:35:27.977157 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.976831 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/a887758a-12b6-4011-819b-3e2204768863-data-volume\") pod \"insights-runtime-extractor-8nvz6\" (UID: \"a887758a-12b6-4011-819b-3e2204768863\") " pod="openshift-insights/insights-runtime-extractor-8nvz6" Apr 23 16:35:27.977157 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.976905 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/a887758a-12b6-4011-819b-3e2204768863-crio-socket\") pod \"insights-runtime-extractor-8nvz6\" (UID: \"a887758a-12b6-4011-819b-3e2204768863\") " pod="openshift-insights/insights-runtime-extractor-8nvz6" Apr 23 16:35:27.977306 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.977206 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1c58deba-82d5-4b30-a362-237c21e8311b-config-volume\") pod 
\"dns-default-c9zd9\" (UID: \"1c58deba-82d5-4b30-a362-237c21e8311b\") " pod="openshift-dns/dns-default-c9zd9" Apr 23 16:35:27.977402 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.977376 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/a887758a-12b6-4011-819b-3e2204768863-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-8nvz6\" (UID: \"a887758a-12b6-4011-819b-3e2204768863\") " pod="openshift-insights/insights-runtime-extractor-8nvz6" Apr 23 16:35:27.977521 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.977417 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/1c58deba-82d5-4b30-a362-237c21e8311b-tmp-dir\") pod \"dns-default-c9zd9\" (UID: \"1c58deba-82d5-4b30-a362-237c21e8311b\") " pod="openshift-dns/dns-default-c9zd9" Apr 23 16:35:27.979223 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.979203 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/a887758a-12b6-4011-819b-3e2204768863-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-8nvz6\" (UID: \"a887758a-12b6-4011-819b-3e2204768863\") " pod="openshift-insights/insights-runtime-extractor-8nvz6" Apr 23 16:35:27.979555 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.979517 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/1c58deba-82d5-4b30-a362-237c21e8311b-metrics-tls\") pod \"dns-default-c9zd9\" (UID: \"1c58deba-82d5-4b30-a362-237c21e8311b\") " pod="openshift-dns/dns-default-c9zd9" Apr 23 16:35:27.979832 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.979814 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a0c13d03-c009-47c2-b8ff-f968c81cb35c-cert\") pod \"ingress-canary-g4m2z\" (UID: 
\"a0c13d03-c009-47c2-b8ff-f968c81cb35c\") " pod="openshift-ingress-canary/ingress-canary-g4m2z" Apr 23 16:35:27.986225 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.986198 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k5bj6\" (UniqueName: \"kubernetes.io/projected/a887758a-12b6-4011-819b-3e2204768863-kube-api-access-k5bj6\") pod \"insights-runtime-extractor-8nvz6\" (UID: \"a887758a-12b6-4011-819b-3e2204768863\") " pod="openshift-insights/insights-runtime-extractor-8nvz6" Apr 23 16:35:27.986438 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.986402 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwfvv\" (UniqueName: \"kubernetes.io/projected/a0c13d03-c009-47c2-b8ff-f968c81cb35c-kube-api-access-hwfvv\") pod \"ingress-canary-g4m2z\" (UID: \"a0c13d03-c009-47c2-b8ff-f968c81cb35c\") " pod="openshift-ingress-canary/ingress-canary-g4m2z" Apr 23 16:35:27.987509 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:27.987487 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rrmt\" (UniqueName: \"kubernetes.io/projected/1c58deba-82d5-4b30-a362-237c21e8311b-kube-api-access-2rrmt\") pod \"dns-default-c9zd9\" (UID: \"1c58deba-82d5-4b30-a362-237c21e8311b\") " pod="openshift-dns/dns-default-c9zd9" Apr 23 16:35:28.004401 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:28.004371 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:28.033094 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:28.033054 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-g4m2z" Apr 23 16:35:28.039825 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:28.039798 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-c9zd9" Apr 23 16:35:28.070541 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:28.070507 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-8nvz6" Apr 23 16:35:28.582561 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:28.582312 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8306d95a-dbae-4dd7-bf93-637a12f98c59-metrics-certs\") pod \"network-metrics-daemon-jpzq7\" (UID: \"8306d95a-dbae-4dd7-bf93-637a12f98c59\") " pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:35:28.585350 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:28.585318 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8306d95a-dbae-4dd7-bf93-637a12f98c59-metrics-certs\") pod \"network-metrics-daemon-jpzq7\" (UID: \"8306d95a-dbae-4dd7-bf93-637a12f98c59\") " pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:35:28.688732 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:28.683954 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cqgkx\" (UniqueName: \"kubernetes.io/projected/493d9466-44b1-4315-9f1b-a60f6bb428c1-kube-api-access-cqgkx\") pod \"network-check-target-ch99b\" (UID: \"493d9466-44b1-4315-9f1b-a60f6bb428c1\") " pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:35:28.692023 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:28.691569 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqgkx\" (UniqueName: \"kubernetes.io/projected/493d9466-44b1-4315-9f1b-a60f6bb428c1-kube-api-access-cqgkx\") pod \"network-check-target-ch99b\" (UID: \"493d9466-44b1-4315-9f1b-a60f6bb428c1\") " pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:35:28.787509 ip-10-0-137-14 
kubenswrapper[2573]: I0423 16:35:28.787484 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-c9zd9"] Apr 23 16:35:28.792669 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:35:28.792628 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c58deba_82d5_4b30_a362_237c21e8311b.slice/crio-a4b58d37694e89763ac11368933a67f4f051cfeb812e5c0145228b681d3a10f6 WatchSource:0}: Error finding container a4b58d37694e89763ac11368933a67f4f051cfeb812e5c0145228b681d3a10f6: Status 404 returned error can't find the container with id a4b58d37694e89763ac11368933a67f4f051cfeb812e5c0145228b681d3a10f6 Apr 23 16:35:28.793735 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:28.793615 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6d44bcc48b-45nq9"] Apr 23 16:35:28.794502 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:28.794486 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-g4m2z"] Apr 23 16:35:28.795028 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:28.795004 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-8nvz6"] Apr 23 16:35:28.799415 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:28.799400 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:35:28.803792 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:35:28.803767 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0c13d03_c009_47c2_b8ff_f968c81cb35c.slice/crio-9d4ca192c7024fdfb59b4421c7af5f330682df9c0898bbe188b25bb4e15791f3 WatchSource:0}: Error finding container 9d4ca192c7024fdfb59b4421c7af5f330682df9c0898bbe188b25bb4e15791f3: Status 404 returned error can't find the container with id 9d4ca192c7024fdfb59b4421c7af5f330682df9c0898bbe188b25bb4e15791f3 Apr 23 16:35:28.804672 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:35:28.804648 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode39f91f1_ba00_471a_9d79_c0574e83f873.slice/crio-e12a7ef7b1b2af1517e8d20f3a719a0e00446122204236d5025dfeb1206bd4da WatchSource:0}: Error finding container e12a7ef7b1b2af1517e8d20f3a719a0e00446122204236d5025dfeb1206bd4da: Status 404 returned error can't find the container with id e12a7ef7b1b2af1517e8d20f3a719a0e00446122204236d5025dfeb1206bd4da Apr 23 16:35:28.805050 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:28.805032 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-jpzq7" Apr 23 16:35:28.805724 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:35:28.805609 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda887758a_12b6_4011_819b_3e2204768863.slice/crio-d73bfc5a30dc6e2c2d524c47b9fcef6dae273b4ff684973ad0a08ef73851cd8a WatchSource:0}: Error finding container d73bfc5a30dc6e2c2d524c47b9fcef6dae273b4ff684973ad0a08ef73851cd8a: Status 404 returned error can't find the container with id d73bfc5a30dc6e2c2d524c47b9fcef6dae273b4ff684973ad0a08ef73851cd8a Apr 23 16:35:28.997649 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:28.997615 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-jpzq7"] Apr 23 16:35:28.998225 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:28.998151 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-ch99b"] Apr 23 16:35:29.025066 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:29.024989 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8zwmw" event={"ID":"0f6f780e-a2ae-473d-ad75-c644275b6cdb","Type":"ContainerStarted","Data":"6517d0e81af173c32a916841d41b9538a288635b04e75b70f094cf4c4a9a2900"} Apr 23 16:35:29.026778 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:29.026730 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-g4m2z" event={"ID":"a0c13d03-c009-47c2-b8ff-f968c81cb35c","Type":"ContainerStarted","Data":"9d4ca192c7024fdfb59b4421c7af5f330682df9c0898bbe188b25bb4e15791f3"} Apr 23 16:35:29.028297 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:29.028249 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ch99b" 
event={"ID":"493d9466-44b1-4315-9f1b-a60f6bb428c1","Type":"ContainerStarted","Data":"92faf2d12209d1154fe9f482c56842d8d463aac2ca19254ecd45d3e071e0bd7f"} Apr 23 16:35:29.029784 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:29.029752 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" event={"ID":"e39f91f1-ba00-471a-9d79-c0574e83f873","Type":"ContainerStarted","Data":"821ce4ee781dc8356eaafb71dbc9c26ddd2d42acb9dcab3ed0afb710c2bec7ea"} Apr 23 16:35:29.029911 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:29.029792 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" event={"ID":"e39f91f1-ba00-471a-9d79-c0574e83f873","Type":"ContainerStarted","Data":"e12a7ef7b1b2af1517e8d20f3a719a0e00446122204236d5025dfeb1206bd4da"} Apr 23 16:35:29.029985 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:29.029917 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:29.031128 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:29.031103 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-c9zd9" event={"ID":"1c58deba-82d5-4b30-a362-237c21e8311b","Type":"ContainerStarted","Data":"a4b58d37694e89763ac11368933a67f4f051cfeb812e5c0145228b681d3a10f6"} Apr 23 16:35:29.032407 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:29.032347 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8nvz6" event={"ID":"a887758a-12b6-4011-819b-3e2204768863","Type":"ContainerStarted","Data":"6b4f12fb613d1ef7f6f9e255050813dff034bfc790b3ad0f3b0da93c1f84fd98"} Apr 23 16:35:29.032407 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:29.032376 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8nvz6" 
event={"ID":"a887758a-12b6-4011-819b-3e2204768863","Type":"ContainerStarted","Data":"d73bfc5a30dc6e2c2d524c47b9fcef6dae273b4ff684973ad0a08ef73851cd8a"} Apr 23 16:35:29.033303 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:29.033274 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jpzq7" event={"ID":"8306d95a-dbae-4dd7-bf93-637a12f98c59","Type":"ContainerStarted","Data":"55af7e87a40ffb8b0ea92090eccf32dcd6be198375132ae1bcf9ccd28b19547e"} Apr 23 16:35:29.051976 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:29.051919 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" podStartSLOduration=7.051906641 podStartE2EDuration="7.051906641s" podCreationTimestamp="2026-04-23 16:35:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:35:29.051589092 +0000 UTC m=+34.741375438" watchObservedRunningTime="2026-04-23 16:35:29.051906641 +0000 UTC m=+34.741692987" Apr 23 16:35:30.040571 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:30.040498 2573 generic.go:358] "Generic (PLEG): container finished" podID="0f6f780e-a2ae-473d-ad75-c644275b6cdb" containerID="6517d0e81af173c32a916841d41b9538a288635b04e75b70f094cf4c4a9a2900" exitCode=0 Apr 23 16:35:30.041555 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:30.041506 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8zwmw" event={"ID":"0f6f780e-a2ae-473d-ad75-c644275b6cdb","Type":"ContainerDied","Data":"6517d0e81af173c32a916841d41b9538a288635b04e75b70f094cf4c4a9a2900"} Apr 23 16:35:31.099388 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.099309 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-7mt24"] Apr 23 16:35:31.109101 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.109076 2573 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mt24" Apr 23 16:35:31.112571 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.112540 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-8fvqg\"" Apr 23 16:35:31.112571 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.112562 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 16:35:31.112808 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.112595 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 16:35:31.112808 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.112652 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 16:35:31.112808 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.112675 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 23 16:35:31.112956 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.112900 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 23 16:35:31.117728 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.117683 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-7mt24"] Apr 23 16:35:31.143968 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.143930 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-z88zz"] Apr 23 16:35:31.154077 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.154051 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.156388 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.156369 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-g4jgc\"" Apr 23 16:35:31.156582 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.156567 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 16:35:31.156648 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.156632 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 16:35:31.156894 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.156866 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 16:35:31.208681 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.208649 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/046d19c7-3894-44f1-bc2d-035f0169e8d2-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-7mt24\" (UID: \"046d19c7-3894-44f1-bc2d-035f0169e8d2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mt24" Apr 23 16:35:31.208681 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.208702 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/046d19c7-3894-44f1-bc2d-035f0169e8d2-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-7mt24\" (UID: \"046d19c7-3894-44f1-bc2d-035f0169e8d2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mt24" Apr 23 16:35:31.208904 ip-10-0-137-14 
kubenswrapper[2573]: I0423 16:35:31.208797 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/046d19c7-3894-44f1-bc2d-035f0169e8d2-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7mt24\" (UID: \"046d19c7-3894-44f1-bc2d-035f0169e8d2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mt24" Apr 23 16:35:31.208904 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.208851 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl27q\" (UniqueName: \"kubernetes.io/projected/046d19c7-3894-44f1-bc2d-035f0169e8d2-kube-api-access-gl27q\") pod \"openshift-state-metrics-9d44df66c-7mt24\" (UID: \"046d19c7-3894-44f1-bc2d-035f0169e8d2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mt24" Apr 23 16:35:31.309910 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.309870 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-node-exporter-wtmp\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.310085 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.309920 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-node-exporter-textfile\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.310085 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.309972 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-metrics-client-ca\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.310085 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.310010 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/046d19c7-3894-44f1-bc2d-035f0169e8d2-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-7mt24\" (UID: \"046d19c7-3894-44f1-bc2d-035f0169e8d2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mt24" Apr 23 16:35:31.310085 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.310028 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mv66s\" (UniqueName: \"kubernetes.io/projected/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-kube-api-access-mv66s\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.310085 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.310054 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-node-exporter-accelerators-collector-config\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.310349 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.310105 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-node-exporter-tls\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 
16:35:31.310349 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.310135 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.310349 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.310247 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-sys\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.310349 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.310284 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-root\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.310349 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.310324 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/046d19c7-3894-44f1-bc2d-035f0169e8d2-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7mt24\" (UID: \"046d19c7-3894-44f1-bc2d-035f0169e8d2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mt24" Apr 23 16:35:31.310588 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.310357 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/046d19c7-3894-44f1-bc2d-035f0169e8d2-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-7mt24\" (UID: \"046d19c7-3894-44f1-bc2d-035f0169e8d2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mt24" Apr 23 16:35:31.310588 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.310392 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gl27q\" (UniqueName: \"kubernetes.io/projected/046d19c7-3894-44f1-bc2d-035f0169e8d2-kube-api-access-gl27q\") pod \"openshift-state-metrics-9d44df66c-7mt24\" (UID: \"046d19c7-3894-44f1-bc2d-035f0169e8d2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mt24" Apr 23 16:35:31.310826 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.310797 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/046d19c7-3894-44f1-bc2d-035f0169e8d2-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-7mt24\" (UID: \"046d19c7-3894-44f1-bc2d-035f0169e8d2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mt24" Apr 23 16:35:31.314666 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.314616 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/046d19c7-3894-44f1-bc2d-035f0169e8d2-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-7mt24\" (UID: \"046d19c7-3894-44f1-bc2d-035f0169e8d2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mt24" Apr 23 16:35:31.314666 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.314634 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/046d19c7-3894-44f1-bc2d-035f0169e8d2-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-7mt24\" 
(UID: \"046d19c7-3894-44f1-bc2d-035f0169e8d2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mt24" Apr 23 16:35:31.323582 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.323561 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl27q\" (UniqueName: \"kubernetes.io/projected/046d19c7-3894-44f1-bc2d-035f0169e8d2-kube-api-access-gl27q\") pod \"openshift-state-metrics-9d44df66c-7mt24\" (UID: \"046d19c7-3894-44f1-bc2d-035f0169e8d2\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mt24" Apr 23 16:35:31.410976 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.410928 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.411171 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.410996 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-sys\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.411171 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.411021 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-root\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.411171 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.411085 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-node-exporter-wtmp\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.411171 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.411118 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-node-exporter-textfile\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.411171 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.411141 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-metrics-client-ca\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.411171 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.411136 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-sys\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.411171 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.411168 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mv66s\" (UniqueName: \"kubernetes.io/projected/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-kube-api-access-mv66s\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.411464 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.411235 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: 
\"kubernetes.io/host-path/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-root\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.411464 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.411345 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-node-exporter-wtmp\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.411464 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.411386 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-node-exporter-accelerators-collector-config\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.411464 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.411452 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-node-exporter-tls\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.411635 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.411471 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-node-exporter-textfile\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.411635 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:31.411575 2573 secret.go:189] Couldn't get secret 
openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 23 16:35:31.411755 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:35:31.411636 2573 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-node-exporter-tls podName:4953b4c2-f7bb-425b-a8af-7cff76f0e8c8 nodeName:}" failed. No retries permitted until 2026-04-23 16:35:31.911615103 +0000 UTC m=+37.601401430 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-node-exporter-tls") pod "node-exporter-z88zz" (UID: "4953b4c2-f7bb-425b-a8af-7cff76f0e8c8") : secret "node-exporter-tls" not found Apr 23 16:35:31.411926 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.411899 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-metrics-client-ca\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.412053 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.411970 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-node-exporter-accelerators-collector-config\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.413743 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.413717 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " 
pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.418639 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.418616 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mt24" Apr 23 16:35:31.430296 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.430269 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mv66s\" (UniqueName: \"kubernetes.io/projected/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-kube-api-access-mv66s\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.915161 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.915128 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-node-exporter-tls\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:31.917834 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:31.917808 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4953b4c2-f7bb-425b-a8af-7cff76f0e8c8-node-exporter-tls\") pod \"node-exporter-z88zz\" (UID: \"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8\") " pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:32.063798 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.063750 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-z88zz" Apr 23 16:35:32.386681 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.386647 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 16:35:32.395142 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.395114 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.397944 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.397898 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 23 16:35:32.397944 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.397932 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 23 16:35:32.398213 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.398197 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 23 16:35:32.398377 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.398353 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 23 16:35:32.398682 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.398644 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 23 16:35:32.398682 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.398644 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 23 16:35:32.398980 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.398719 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 23 16:35:32.398980 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.398650 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-5gm2b\"" Apr 23 16:35:32.398980 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.398650 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 23 16:35:32.398980 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.398685 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 23 16:35:32.409902 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.409878 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 16:35:32.519468 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.519437 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.519644 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.519486 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.519644 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.519513 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b20dd212-869a-4e0b-9556-71ca42353761-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.519644 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.519543 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b20dd212-869a-4e0b-9556-71ca42353761-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.519644 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.519571 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.519644 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.519637 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.519921 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.519724 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.519921 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.519769 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fztt6\" (UniqueName: \"kubernetes.io/projected/b20dd212-869a-4e0b-9556-71ca42353761-kube-api-access-fztt6\") pod \"alertmanager-main-0\" (UID: 
\"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.519921 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.519798 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b20dd212-869a-4e0b-9556-71ca42353761-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.519921 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.519825 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-config-volume\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.519921 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.519848 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-web-config\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.519921 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.519905 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b20dd212-869a-4e0b-9556-71ca42353761-config-out\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.520133 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.519944 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/b20dd212-869a-4e0b-9556-71ca42353761-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.620796 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.620747 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.621013 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.620934 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fztt6\" (UniqueName: \"kubernetes.io/projected/b20dd212-869a-4e0b-9556-71ca42353761-kube-api-access-fztt6\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.621013 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.621004 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b20dd212-869a-4e0b-9556-71ca42353761-tls-assets\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.621138 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.621117 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-config-volume\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.621187 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.621159 2573 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-web-config\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.621235 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.621206 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b20dd212-869a-4e0b-9556-71ca42353761-config-out\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.621348 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.621232 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b20dd212-869a-4e0b-9556-71ca42353761-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.621414 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.621367 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.621471 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.621417 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.621471 ip-10-0-137-14 kubenswrapper[2573]: I0423 
16:35:32.621445 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b20dd212-869a-4e0b-9556-71ca42353761-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.621563 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.621480 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b20dd212-869a-4e0b-9556-71ca42353761-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.621563 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.621513 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.621563 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.621549 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.623429 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.622452 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b20dd212-869a-4e0b-9556-71ca42353761-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.623429 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.623086 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b20dd212-869a-4e0b-9556-71ca42353761-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.624164 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.624136 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b20dd212-869a-4e0b-9556-71ca42353761-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.624810 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.624419 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.624810 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.624617 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.624810 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.624771 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b20dd212-869a-4e0b-9556-71ca42353761-tls-assets\") 
pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.625493 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.625468 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-web-config\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.626126 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.626086 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.626448 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.626421 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.626536 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.626441 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b20dd212-869a-4e0b-9556-71ca42353761-config-out\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.626638 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.626617 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: 
\"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.626941 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.626915 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-config-volume\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.630665 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.630643 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fztt6\" (UniqueName: \"kubernetes.io/projected/b20dd212-869a-4e0b-9556-71ca42353761-kube-api-access-fztt6\") pod \"alertmanager-main-0\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:32.708002 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:32.707961 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:35:33.162022 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:35:33.161762 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4953b4c2_f7bb_425b_a8af_7cff76f0e8c8.slice/crio-632aa1c801bf7f6a8fd31d187627df4777ed41974c6109d13ebf4634dfbebed4 WatchSource:0}: Error finding container 632aa1c801bf7f6a8fd31d187627df4777ed41974c6109d13ebf4634dfbebed4: Status 404 returned error can't find the container with id 632aa1c801bf7f6a8fd31d187627df4777ed41974c6109d13ebf4634dfbebed4 Apr 23 16:35:33.323812 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:33.323747 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-7mt24"] Apr 23 16:35:33.327539 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:33.326350 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 16:35:33.330197 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:35:33.330167 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod046d19c7_3894_44f1_bc2d_035f0169e8d2.slice/crio-4c50a29be6f6392519e5e4d51f392d211c4360d65483613c9320990d3f8108f4 WatchSource:0}: Error finding container 4c50a29be6f6392519e5e4d51f392d211c4360d65483613c9320990d3f8108f4: Status 404 returned error can't find the container with id 4c50a29be6f6392519e5e4d51f392d211c4360d65483613c9320990d3f8108f4 Apr 23 16:35:33.331454 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:35:33.331279 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb20dd212_869a_4e0b_9556_71ca42353761.slice/crio-a4ee65b2cde0a1d2530f390a0265e6528ebfbce1b9c1e3d8f90a3119c49ab177 WatchSource:0}: Error finding container 
a4ee65b2cde0a1d2530f390a0265e6528ebfbce1b9c1e3d8f90a3119c49ab177: Status 404 returned error can't find the container with id a4ee65b2cde0a1d2530f390a0265e6528ebfbce1b9c1e3d8f90a3119c49ab177 Apr 23 16:35:34.058349 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:34.057905 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-ch99b" event={"ID":"493d9466-44b1-4315-9f1b-a60f6bb428c1","Type":"ContainerStarted","Data":"29f19b53367341a088df4c9bf93a62b4db5d5fd683da7a3add4d4c513f03bf24"} Apr 23 16:35:34.059261 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:34.059047 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:35:34.060019 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:34.059983 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-c9zd9" event={"ID":"1c58deba-82d5-4b30-a362-237c21e8311b","Type":"ContainerStarted","Data":"31cac4c9d8a84b4b0cf8b9bb45f6c4ee7da8815f257c50c1d1c754518431201e"} Apr 23 16:35:34.060019 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:34.060015 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-c9zd9" event={"ID":"1c58deba-82d5-4b30-a362-237c21e8311b","Type":"ContainerStarted","Data":"b04633f66a46751c17a5eda79dcf2b390b78f4898203f42fcf0236f41c0dbc37"} Apr 23 16:35:34.060218 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:34.060144 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-c9zd9" Apr 23 16:35:34.061134 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:34.061099 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z88zz" event={"ID":"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8","Type":"ContainerStarted","Data":"632aa1c801bf7f6a8fd31d187627df4777ed41974c6109d13ebf4634dfbebed4"} Apr 23 16:35:34.062880 ip-10-0-137-14 
kubenswrapper[2573]: I0423 16:35:34.062857 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8nvz6" event={"ID":"a887758a-12b6-4011-819b-3e2204768863","Type":"ContainerStarted","Data":"c194655ba3ab631218ca173c89c4749119fab1af356fa11128fc7d1e63af4f69"} Apr 23 16:35:34.064492 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:34.064471 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jpzq7" event={"ID":"8306d95a-dbae-4dd7-bf93-637a12f98c59","Type":"ContainerStarted","Data":"cb673130040a484ce3c684411e4cfc9d852936add095a2e1d9edbe9c02d57ada"} Apr 23 16:35:34.064592 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:34.064496 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-jpzq7" event={"ID":"8306d95a-dbae-4dd7-bf93-637a12f98c59","Type":"ContainerStarted","Data":"273f789bfdab34f10d80291ffdc1eaf6ee96b54d3852c91b83d46786793cca52"} Apr 23 16:35:34.067571 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:34.067550 2573 generic.go:358] "Generic (PLEG): container finished" podID="0f6f780e-a2ae-473d-ad75-c644275b6cdb" containerID="75f633b16be64c756594119d04046c54ca4e2ea2d722ec19e96ee57e1f660983" exitCode=0 Apr 23 16:35:34.067674 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:34.067631 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8zwmw" event={"ID":"0f6f780e-a2ae-473d-ad75-c644275b6cdb","Type":"ContainerDied","Data":"75f633b16be64c756594119d04046c54ca4e2ea2d722ec19e96ee57e1f660983"} Apr 23 16:35:34.069150 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:34.069129 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-g4m2z" event={"ID":"a0c13d03-c009-47c2-b8ff-f968c81cb35c","Type":"ContainerStarted","Data":"24209c36ed91f6164f829924531c5986c2fc544e115e896f087d0a614ea7f03b"} Apr 23 16:35:34.070333 ip-10-0-137-14 
kubenswrapper[2573]: I0423 16:35:34.070299 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b20dd212-869a-4e0b-9556-71ca42353761","Type":"ContainerStarted","Data":"a4ee65b2cde0a1d2530f390a0265e6528ebfbce1b9c1e3d8f90a3119c49ab177"} Apr 23 16:35:34.072104 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:34.072075 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mt24" event={"ID":"046d19c7-3894-44f1-bc2d-035f0169e8d2","Type":"ContainerStarted","Data":"4019b6508ebb888c7328c13d39e3dc64c96098c3a9f62a15857d25bf85327679"} Apr 23 16:35:34.072104 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:34.072102 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mt24" event={"ID":"046d19c7-3894-44f1-bc2d-035f0169e8d2","Type":"ContainerStarted","Data":"c16f33a3e796cbc26171385a479d1b5da20be8b28bbddc4eea024b8fe332bf26"} Apr 23 16:35:34.072261 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:34.072111 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mt24" event={"ID":"046d19c7-3894-44f1-bc2d-035f0169e8d2","Type":"ContainerStarted","Data":"4c50a29be6f6392519e5e4d51f392d211c4360d65483613c9320990d3f8108f4"} Apr 23 16:35:34.075377 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:34.075331 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-ch99b" podStartSLOduration=35.919868027 podStartE2EDuration="40.075315331s" podCreationTimestamp="2026-04-23 16:34:54 +0000 UTC" firstStartedPulling="2026-04-23 16:35:29.006993337 +0000 UTC m=+34.696779661" lastFinishedPulling="2026-04-23 16:35:33.162440629 +0000 UTC m=+38.852226965" observedRunningTime="2026-04-23 16:35:34.073434912 +0000 UTC m=+39.763221258" watchObservedRunningTime="2026-04-23 16:35:34.075315331 +0000 
UTC m=+39.765101699" Apr 23 16:35:34.091509 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:34.091466 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-jpzq7" podStartSLOduration=35.942715125 podStartE2EDuration="40.091449632s" podCreationTimestamp="2026-04-23 16:34:54 +0000 UTC" firstStartedPulling="2026-04-23 16:35:29.006318854 +0000 UTC m=+34.696105193" lastFinishedPulling="2026-04-23 16:35:33.155053373 +0000 UTC m=+38.844839700" observedRunningTime="2026-04-23 16:35:34.089957304 +0000 UTC m=+39.779743652" watchObservedRunningTime="2026-04-23 16:35:34.091449632 +0000 UTC m=+39.781235980" Apr 23 16:35:34.133204 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:34.132904 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-c9zd9" podStartSLOduration=2.776395945 podStartE2EDuration="7.132888657s" podCreationTimestamp="2026-04-23 16:35:27 +0000 UTC" firstStartedPulling="2026-04-23 16:35:28.794899949 +0000 UTC m=+34.484686273" lastFinishedPulling="2026-04-23 16:35:33.151392645 +0000 UTC m=+38.841178985" observedRunningTime="2026-04-23 16:35:34.131830285 +0000 UTC m=+39.821616643" watchObservedRunningTime="2026-04-23 16:35:34.132888657 +0000 UTC m=+39.822674980" Apr 23 16:35:34.148401 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:34.148332 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-g4m2z" podStartSLOduration=2.795584891 podStartE2EDuration="7.148315027s" podCreationTimestamp="2026-04-23 16:35:27 +0000 UTC" firstStartedPulling="2026-04-23 16:35:28.806093835 +0000 UTC m=+34.495880163" lastFinishedPulling="2026-04-23 16:35:33.158823959 +0000 UTC m=+38.848610299" observedRunningTime="2026-04-23 16:35:34.147274455 +0000 UTC m=+39.837060803" watchObservedRunningTime="2026-04-23 16:35:34.148315027 +0000 UTC m=+39.838101374" Apr 23 16:35:35.078526 ip-10-0-137-14 kubenswrapper[2573]: 
I0423 16:35:35.078474 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-8zwmw" event={"ID":"0f6f780e-a2ae-473d-ad75-c644275b6cdb","Type":"ContainerStarted","Data":"e75afc10072c214cd36e328106cafc7c17830274edb399f24d4e5fe825c33708"} Apr 23 16:35:35.080175 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:35.080145 2573 generic.go:358] "Generic (PLEG): container finished" podID="4953b4c2-f7bb-425b-a8af-7cff76f0e8c8" containerID="18e97161c112e5adb8717ab81bfbc4483782963add6ac1c4b05e03f01198f7ee" exitCode=0 Apr 23 16:35:35.080306 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:35.080268 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z88zz" event={"ID":"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8","Type":"ContainerDied","Data":"18e97161c112e5adb8717ab81bfbc4483782963add6ac1c4b05e03f01198f7ee"} Apr 23 16:35:35.107478 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:35.107423 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-8zwmw" podStartSLOduration=8.922473366 podStartE2EDuration="40.107404953s" podCreationTimestamp="2026-04-23 16:34:55 +0000 UTC" firstStartedPulling="2026-04-23 16:34:57.599648029 +0000 UTC m=+3.289434354" lastFinishedPulling="2026-04-23 16:35:28.784579614 +0000 UTC m=+34.474365941" observedRunningTime="2026-04-23 16:35:35.10612032 +0000 UTC m=+40.795906668" watchObservedRunningTime="2026-04-23 16:35:35.107404953 +0000 UTC m=+40.797191303" Apr 23 16:35:37.086958 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.086922 2573 generic.go:358] "Generic (PLEG): container finished" podID="b20dd212-869a-4e0b-9556-71ca42353761" containerID="f8f0677c0a157d6ba0b1401f823626524fa5fa8c543e83cb57e4bd9b133248da" exitCode=0 Apr 23 16:35:37.087499 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.086996 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"b20dd212-869a-4e0b-9556-71ca42353761","Type":"ContainerDied","Data":"f8f0677c0a157d6ba0b1401f823626524fa5fa8c543e83cb57e4bd9b133248da"} Apr 23 16:35:37.089005 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.088979 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mt24" event={"ID":"046d19c7-3894-44f1-bc2d-035f0169e8d2","Type":"ContainerStarted","Data":"ddeb0df151220f15d1755db7aa49754e41a08321055c75441588013dedc9c382"} Apr 23 16:35:37.093217 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.093185 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z88zz" event={"ID":"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8","Type":"ContainerStarted","Data":"48a2d1ee250e14e7345b4c4cc4431b861ab50193aacdf314bae5aeaafb45d2c3"} Apr 23 16:35:37.093217 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.093213 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-z88zz" event={"ID":"4953b4c2-f7bb-425b-a8af-7cff76f0e8c8","Type":"ContainerStarted","Data":"73d8b8db5a1853da9a88dcceb2e948a0dc5bc2b3124621a7ab34364bb500826d"} Apr 23 16:35:37.094980 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.094949 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-8nvz6" event={"ID":"a887758a-12b6-4011-819b-3e2204768863","Type":"ContainerStarted","Data":"4db5d94f83484c771ffedcac33955d09bbbf01d91a6dd21cee69e64ab7958f88"} Apr 23 16:35:37.147619 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.147557 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-z88zz" podStartSLOduration=5.075718187 podStartE2EDuration="6.147535082s" podCreationTimestamp="2026-04-23 16:35:31 +0000 UTC" firstStartedPulling="2026-04-23 16:35:33.163830161 +0000 UTC m=+38.853616485" lastFinishedPulling="2026-04-23 16:35:34.235647042 +0000 UTC m=+39.925433380" 
observedRunningTime="2026-04-23 16:35:37.146226372 +0000 UTC m=+42.836012717" watchObservedRunningTime="2026-04-23 16:35:37.147535082 +0000 UTC m=+42.837321429" Apr 23 16:35:37.172576 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.172519 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-8nvz6" podStartSLOduration=3.036767223 podStartE2EDuration="10.172503476s" podCreationTimestamp="2026-04-23 16:35:27 +0000 UTC" firstStartedPulling="2026-04-23 16:35:29.006821195 +0000 UTC m=+34.696607528" lastFinishedPulling="2026-04-23 16:35:36.142557454 +0000 UTC m=+41.832343781" observedRunningTime="2026-04-23 16:35:37.171488177 +0000 UTC m=+42.861274523" watchObservedRunningTime="2026-04-23 16:35:37.172503476 +0000 UTC m=+42.862289821" Apr 23 16:35:37.211386 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.211335 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-7mt24" podStartSLOduration=3.6744929920000002 podStartE2EDuration="6.211318277s" podCreationTimestamp="2026-04-23 16:35:31 +0000 UTC" firstStartedPulling="2026-04-23 16:35:33.607725407 +0000 UTC m=+39.297511746" lastFinishedPulling="2026-04-23 16:35:36.144550697 +0000 UTC m=+41.834337031" observedRunningTime="2026-04-23 16:35:37.209792907 +0000 UTC m=+42.899579252" watchObservedRunningTime="2026-04-23 16:35:37.211318277 +0000 UTC m=+42.901104658" Apr 23 16:35:37.450566 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.450524 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:35:37.456089 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.456064 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.458878 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.458851 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-jf7w4\"" Apr 23 16:35:37.459584 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.459568 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 23 16:35:37.459659 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.459601 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 23 16:35:37.459928 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.459909 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 23 16:35:37.459928 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.459916 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 23 16:35:37.460102 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.459924 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 23 16:35:37.460102 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.460027 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 23 16:35:37.465304 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.465285 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8n72m0o1i5hvc\"" Apr 23 16:35:37.465522 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.465507 2573 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 23 16:35:37.477191 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.477173 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 23 16:35:37.480231 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.480215 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 23 16:35:37.480336 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.480236 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 16:35:37.480336 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.480214 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 23 16:35:37.483071 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.483055 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 23 16:35:37.486021 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.486000 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 23 16:35:37.518914 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.518882 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:35:37.557416 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.557383 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 
16:35:37.557416 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.557417 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.557641 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.557432 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.557641 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.557449 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f8c95658-88b6-40ea-8e55-36e1b9b16d38-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.557641 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.557469 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.557641 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.557540 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.557641 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.557569 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.557641 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.557590 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.557641 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.557612 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zd7p2\" (UniqueName: \"kubernetes.io/projected/f8c95658-88b6-40ea-8e55-36e1b9b16d38-kube-api-access-zd7p2\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.557641 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.557638 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.557925 
ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.557658 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.557925 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.557714 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-web-config\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.557925 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.557745 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f8c95658-88b6-40ea-8e55-36e1b9b16d38-config-out\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.557925 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.557768 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.557925 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.557808 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-config\") pod \"prometheus-k8s-0\" (UID: 
\"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.557925 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.557826 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.557925 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.557841 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.557925 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.557859 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f8c95658-88b6-40ea-8e55-36e1b9b16d38-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.659164 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.659113 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.659164 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.659178 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-web-config\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.659412 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.659218 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f8c95658-88b6-40ea-8e55-36e1b9b16d38-config-out\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.659412 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.659252 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.659412 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.659287 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-config\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.659412 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.659313 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.659412 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.659337 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.659412 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.659364 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f8c95658-88b6-40ea-8e55-36e1b9b16d38-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.659412 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.659391 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.659769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.659418 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.659769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.659446 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.659769 ip-10-0-137-14 kubenswrapper[2573]: I0423 
16:35:37.659471 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f8c95658-88b6-40ea-8e55-36e1b9b16d38-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.659769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.659497 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.659769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.659536 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.659769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.659563 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.659769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.659593 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 
16:35:37.659769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.659623 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zd7p2\" (UniqueName: \"kubernetes.io/projected/f8c95658-88b6-40ea-8e55-36e1b9b16d38-kube-api-access-zd7p2\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.659769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.659668 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.660237 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.660208 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.660386 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.660355 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.660546 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.660379 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-configmap-kubelet-serving-ca-bundle\") pod 
\"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.662650 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.662622 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-web-config\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.662768 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.662631 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f8c95658-88b6-40ea-8e55-36e1b9b16d38-config-out\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.662768 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.662735 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-config\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.662884 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.662784 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.663238 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.663031 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: 
\"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.663238 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.663082 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f8c95658-88b6-40ea-8e55-36e1b9b16d38-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.663238 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.663154 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.663238 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.663184 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f8c95658-88b6-40ea-8e55-36e1b9b16d38-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.663661 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.663638 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.663770 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.663751 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: 
\"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.665113 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.665089 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.665198 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.665133 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.665435 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.665417 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.666750 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.666732 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.688314 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.688290 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zd7p2\" (UniqueName: 
\"kubernetes.io/projected/f8c95658-88b6-40ea-8e55-36e1b9b16d38-kube-api-access-zd7p2\") pod \"prometheus-k8s-0\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.765495 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.765410 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:35:37.907849 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:37.907818 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:35:37.912204 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:35:37.912162 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8c95658_88b6_40ea_8e55_36e1b9b16d38.slice/crio-ee6a37b1d7abf4a7da393d0a2688d97e9d449cb99850d4f6f613aa8657931ee3 WatchSource:0}: Error finding container ee6a37b1d7abf4a7da393d0a2688d97e9d449cb99850d4f6f613aa8657931ee3: Status 404 returned error can't find the container with id ee6a37b1d7abf4a7da393d0a2688d97e9d449cb99850d4f6f613aa8657931ee3 Apr 23 16:35:38.099737 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:38.099637 2573 generic.go:358] "Generic (PLEG): container finished" podID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerID="0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2" exitCode=0 Apr 23 16:35:38.100071 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:38.099765 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f8c95658-88b6-40ea-8e55-36e1b9b16d38","Type":"ContainerDied","Data":"0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2"} Apr 23 16:35:38.100071 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:38.099817 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"f8c95658-88b6-40ea-8e55-36e1b9b16d38","Type":"ContainerStarted","Data":"ee6a37b1d7abf4a7da393d0a2688d97e9d449cb99850d4f6f613aa8657931ee3"} Apr 23 16:35:39.104502 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:39.104472 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b20dd212-869a-4e0b-9556-71ca42353761","Type":"ContainerStarted","Data":"86e8d38696836b68017f58a31139902a153884a67d27413c7f5ae85e5236f8f1"} Apr 23 16:35:40.110538 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:40.110496 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b20dd212-869a-4e0b-9556-71ca42353761","Type":"ContainerStarted","Data":"5d74fa0644e2a80d1a27a61e6cd86a625e8ceaa56b74fe4e2d580d55ea72593f"} Apr 23 16:35:40.110538 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:40.110542 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b20dd212-869a-4e0b-9556-71ca42353761","Type":"ContainerStarted","Data":"99915e0e5a97058e6eb4b4d43a91a64cc3b5afef3e814e0a3ed7a745e5186755"} Apr 23 16:35:40.111044 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:40.110558 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b20dd212-869a-4e0b-9556-71ca42353761","Type":"ContainerStarted","Data":"6baeec6d2c60889767650fa485b6b3e10b70302fd470fb212848971764faf939"} Apr 23 16:35:40.111044 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:40.110573 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b20dd212-869a-4e0b-9556-71ca42353761","Type":"ContainerStarted","Data":"a2dfe6f563099b5cf30dd3cf5811c122b51d4542bbb05106afc5cb709a05113d"} Apr 23 16:35:41.117336 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:41.117303 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b20dd212-869a-4e0b-9556-71ca42353761","Type":"ContainerStarted","Data":"9b56453f321de1390ecbf6639aaab661bc75bec4691401847f5e5532ce62e908"} Apr 23 16:35:41.149712 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:41.149637 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.186233232 podStartE2EDuration="9.14961765s" podCreationTimestamp="2026-04-23 16:35:32 +0000 UTC" firstStartedPulling="2026-04-23 16:35:33.333989971 +0000 UTC m=+39.023776304" lastFinishedPulling="2026-04-23 16:35:40.297374391 +0000 UTC m=+45.987160722" observedRunningTime="2026-04-23 16:35:41.147872769 +0000 UTC m=+46.837659117" watchObservedRunningTime="2026-04-23 16:35:41.14961765 +0000 UTC m=+46.839403996" Apr 23 16:35:42.123826 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:42.123793 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f8c95658-88b6-40ea-8e55-36e1b9b16d38","Type":"ContainerStarted","Data":"fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99"} Apr 23 16:35:42.123826 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:42.123831 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f8c95658-88b6-40ea-8e55-36e1b9b16d38","Type":"ContainerStarted","Data":"1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8"} Apr 23 16:35:44.083500 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:44.083469 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-c9zd9" Apr 23 16:35:44.133030 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:44.132990 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"f8c95658-88b6-40ea-8e55-36e1b9b16d38","Type":"ContainerStarted","Data":"d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048"} Apr 23 16:35:45.143372 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:45.143333 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f8c95658-88b6-40ea-8e55-36e1b9b16d38","Type":"ContainerStarted","Data":"2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219"} Apr 23 16:35:45.143372 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:45.143377 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f8c95658-88b6-40ea-8e55-36e1b9b16d38","Type":"ContainerStarted","Data":"43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58"} Apr 23 16:35:45.143919 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:45.143392 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f8c95658-88b6-40ea-8e55-36e1b9b16d38","Type":"ContainerStarted","Data":"4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740"} Apr 23 16:35:45.178589 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:45.178514 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.310109908 podStartE2EDuration="8.178502002s" podCreationTimestamp="2026-04-23 16:35:37 +0000 UTC" firstStartedPulling="2026-04-23 16:35:38.101173742 +0000 UTC m=+43.790960066" lastFinishedPulling="2026-04-23 16:35:43.969565837 +0000 UTC m=+49.659352160" observedRunningTime="2026-04-23 16:35:45.178004882 +0000 UTC m=+50.867791230" watchObservedRunningTime="2026-04-23 16:35:45.178502002 +0000 UTC m=+50.868288347" Apr 23 16:35:47.765763 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:47.765725 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 
16:35:50.045789 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:50.045759 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6d44bcc48b-45nq9" Apr 23 16:35:54.022543 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:35:54.022519 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-wd2cz" Apr 23 16:36:05.083586 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:05.083553 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-ch99b" Apr 23 16:36:08.904350 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:08.904308 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:08.964855 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:08.964826 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:09.222855 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:09.222780 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:21.646665 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:21.646630 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 16:36:21.647223 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:21.647066 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="alertmanager" containerID="cri-o://86e8d38696836b68017f58a31139902a153884a67d27413c7f5ae85e5236f8f1" gracePeriod=120 Apr 23 16:36:21.647223 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:21.647164 2573 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/alertmanager-main-0" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="kube-rbac-proxy-metric" containerID="cri-o://5d74fa0644e2a80d1a27a61e6cd86a625e8ceaa56b74fe4e2d580d55ea72593f" gracePeriod=120 Apr 23 16:36:21.647223 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:21.647180 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="prom-label-proxy" containerID="cri-o://9b56453f321de1390ecbf6639aaab661bc75bec4691401847f5e5532ce62e908" gracePeriod=120 Apr 23 16:36:21.647223 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:21.647198 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="config-reloader" containerID="cri-o://a2dfe6f563099b5cf30dd3cf5811c122b51d4542bbb05106afc5cb709a05113d" gracePeriod=120 Apr 23 16:36:21.647472 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:21.647191 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="kube-rbac-proxy-web" containerID="cri-o://6baeec6d2c60889767650fa485b6b3e10b70302fd470fb212848971764faf939" gracePeriod=120 Apr 23 16:36:21.647472 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:21.647201 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="kube-rbac-proxy" containerID="cri-o://99915e0e5a97058e6eb4b4d43a91a64cc3b5afef3e814e0a3ed7a745e5186755" gracePeriod=120 Apr 23 16:36:22.244987 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:22.244937 2573 generic.go:358] "Generic (PLEG): container finished" podID="b20dd212-869a-4e0b-9556-71ca42353761" 
containerID="9b56453f321de1390ecbf6639aaab661bc75bec4691401847f5e5532ce62e908" exitCode=0 Apr 23 16:36:22.244987 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:22.244971 2573 generic.go:358] "Generic (PLEG): container finished" podID="b20dd212-869a-4e0b-9556-71ca42353761" containerID="5d74fa0644e2a80d1a27a61e6cd86a625e8ceaa56b74fe4e2d580d55ea72593f" exitCode=0 Apr 23 16:36:22.244987 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:22.244980 2573 generic.go:358] "Generic (PLEG): container finished" podID="b20dd212-869a-4e0b-9556-71ca42353761" containerID="99915e0e5a97058e6eb4b4d43a91a64cc3b5afef3e814e0a3ed7a745e5186755" exitCode=0 Apr 23 16:36:22.244987 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:22.244990 2573 generic.go:358] "Generic (PLEG): container finished" podID="b20dd212-869a-4e0b-9556-71ca42353761" containerID="a2dfe6f563099b5cf30dd3cf5811c122b51d4542bbb05106afc5cb709a05113d" exitCode=0 Apr 23 16:36:22.245249 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:22.245000 2573 generic.go:358] "Generic (PLEG): container finished" podID="b20dd212-869a-4e0b-9556-71ca42353761" containerID="86e8d38696836b68017f58a31139902a153884a67d27413c7f5ae85e5236f8f1" exitCode=0 Apr 23 16:36:22.245249 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:22.245016 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b20dd212-869a-4e0b-9556-71ca42353761","Type":"ContainerDied","Data":"9b56453f321de1390ecbf6639aaab661bc75bec4691401847f5e5532ce62e908"} Apr 23 16:36:22.245249 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:22.245054 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b20dd212-869a-4e0b-9556-71ca42353761","Type":"ContainerDied","Data":"5d74fa0644e2a80d1a27a61e6cd86a625e8ceaa56b74fe4e2d580d55ea72593f"} Apr 23 16:36:22.245249 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:22.245064 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b20dd212-869a-4e0b-9556-71ca42353761","Type":"ContainerDied","Data":"99915e0e5a97058e6eb4b4d43a91a64cc3b5afef3e814e0a3ed7a745e5186755"} Apr 23 16:36:22.245249 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:22.245075 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b20dd212-869a-4e0b-9556-71ca42353761","Type":"ContainerDied","Data":"a2dfe6f563099b5cf30dd3cf5811c122b51d4542bbb05106afc5cb709a05113d"} Apr 23 16:36:22.245249 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:22.245083 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b20dd212-869a-4e0b-9556-71ca42353761","Type":"ContainerDied","Data":"86e8d38696836b68017f58a31139902a153884a67d27413c7f5ae85e5236f8f1"} Apr 23 16:36:22.897243 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:22.897219 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.024498 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.024417 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b20dd212-869a-4e0b-9556-71ca42353761-config-out\") pod \"b20dd212-869a-4e0b-9556-71ca42353761\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " Apr 23 16:36:23.024498 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.024459 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-main-tls\") pod \"b20dd212-869a-4e0b-9556-71ca42353761\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") " Apr 23 16:36:23.024498 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.024496 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-kube-rbac-proxy\") pod \"b20dd212-869a-4e0b-9556-71ca42353761\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") "
Apr 23 16:36:23.024785 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.024515 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-config-volume\") pod \"b20dd212-869a-4e0b-9556-71ca42353761\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") "
Apr 23 16:36:23.024785 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.024542 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b20dd212-869a-4e0b-9556-71ca42353761-metrics-client-ca\") pod \"b20dd212-869a-4e0b-9556-71ca42353761\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") "
Apr 23 16:36:23.024785 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.024565 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-kube-rbac-proxy-web\") pod \"b20dd212-869a-4e0b-9556-71ca42353761\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") "
Apr 23 16:36:23.024785 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.024593 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-kube-rbac-proxy-metric\") pod \"b20dd212-869a-4e0b-9556-71ca42353761\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") "
Apr 23 16:36:23.024785 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.024644 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b20dd212-869a-4e0b-9556-71ca42353761-alertmanager-main-db\") pod \"b20dd212-869a-4e0b-9556-71ca42353761\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") "
Apr 23 16:36:23.024785 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.024677 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-cluster-tls-config\") pod \"b20dd212-869a-4e0b-9556-71ca42353761\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") "
Apr 23 16:36:23.024785 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.024723 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b20dd212-869a-4e0b-9556-71ca42353761-tls-assets\") pod \"b20dd212-869a-4e0b-9556-71ca42353761\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") "
Apr 23 16:36:23.024785 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.024751 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-web-config\") pod \"b20dd212-869a-4e0b-9556-71ca42353761\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") "
Apr 23 16:36:23.025208 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.024808 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b20dd212-869a-4e0b-9556-71ca42353761-alertmanager-trusted-ca-bundle\") pod \"b20dd212-869a-4e0b-9556-71ca42353761\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") "
Apr 23 16:36:23.025208 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.024847 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fztt6\" (UniqueName: \"kubernetes.io/projected/b20dd212-869a-4e0b-9556-71ca42353761-kube-api-access-fztt6\") pod \"b20dd212-869a-4e0b-9556-71ca42353761\" (UID: \"b20dd212-869a-4e0b-9556-71ca42353761\") "
Apr 23 16:36:23.025208 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.024984 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b20dd212-869a-4e0b-9556-71ca42353761-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "b20dd212-869a-4e0b-9556-71ca42353761" (UID: "b20dd212-869a-4e0b-9556-71ca42353761"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:36:23.025208 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.025161 2573 reconciler_common.go:299] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/b20dd212-869a-4e0b-9556-71ca42353761-metrics-client-ca\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\""
Apr 23 16:36:23.025950 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.025885 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b20dd212-869a-4e0b-9556-71ca42353761-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "b20dd212-869a-4e0b-9556-71ca42353761" (UID: "b20dd212-869a-4e0b-9556-71ca42353761"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:36:23.025950 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.025884 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b20dd212-869a-4e0b-9556-71ca42353761-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "b20dd212-869a-4e0b-9556-71ca42353761" (UID: "b20dd212-869a-4e0b-9556-71ca42353761"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:36:23.027362 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.027330 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "b20dd212-869a-4e0b-9556-71ca42353761" (UID: "b20dd212-869a-4e0b-9556-71ca42353761"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:36:23.027474 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.027422 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "b20dd212-869a-4e0b-9556-71ca42353761" (UID: "b20dd212-869a-4e0b-9556-71ca42353761"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:36:23.028031 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.027986 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b20dd212-869a-4e0b-9556-71ca42353761-kube-api-access-fztt6" (OuterVolumeSpecName: "kube-api-access-fztt6") pod "b20dd212-869a-4e0b-9556-71ca42353761" (UID: "b20dd212-869a-4e0b-9556-71ca42353761"). InnerVolumeSpecName "kube-api-access-fztt6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:36:23.028031 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.027986 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b20dd212-869a-4e0b-9556-71ca42353761-config-out" (OuterVolumeSpecName: "config-out") pod "b20dd212-869a-4e0b-9556-71ca42353761" (UID: "b20dd212-869a-4e0b-9556-71ca42353761"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:36:23.028167 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.028044 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-config-volume" (OuterVolumeSpecName: "config-volume") pod "b20dd212-869a-4e0b-9556-71ca42353761" (UID: "b20dd212-869a-4e0b-9556-71ca42353761"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:36:23.028167 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.028081 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "b20dd212-869a-4e0b-9556-71ca42353761" (UID: "b20dd212-869a-4e0b-9556-71ca42353761"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:36:23.028397 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.028376 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "b20dd212-869a-4e0b-9556-71ca42353761" (UID: "b20dd212-869a-4e0b-9556-71ca42353761"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:36:23.028984 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.028964 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b20dd212-869a-4e0b-9556-71ca42353761-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "b20dd212-869a-4e0b-9556-71ca42353761" (UID: "b20dd212-869a-4e0b-9556-71ca42353761"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:36:23.032642 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.032617 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "b20dd212-869a-4e0b-9556-71ca42353761" (UID: "b20dd212-869a-4e0b-9556-71ca42353761"). InnerVolumeSpecName "cluster-tls-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:36:23.038893 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.038872 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-web-config" (OuterVolumeSpecName: "web-config") pod "b20dd212-869a-4e0b-9556-71ca42353761" (UID: "b20dd212-869a-4e0b-9556-71ca42353761"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:36:23.126273 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.126232 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-main-tls\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\""
Apr 23 16:36:23.126273 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.126268 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\""
Apr 23 16:36:23.126273 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.126284 2573 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-config-volume\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\""
Apr 23 16:36:23.126542 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.126298 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\""
Apr 23 16:36:23.126542 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.126311 2573 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\""
Apr 23 16:36:23.126542 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.126325 2573 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/b20dd212-869a-4e0b-9556-71ca42353761-alertmanager-main-db\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\""
Apr 23 16:36:23.126542 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.126337 2573 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-cluster-tls-config\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\""
Apr 23 16:36:23.126542 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.126348 2573 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/b20dd212-869a-4e0b-9556-71ca42353761-tls-assets\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\""
Apr 23 16:36:23.126542 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.126361 2573 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/b20dd212-869a-4e0b-9556-71ca42353761-web-config\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\""
Apr 23 16:36:23.126542 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.126372 2573 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b20dd212-869a-4e0b-9556-71ca42353761-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\""
Apr 23 16:36:23.126542 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.126398 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fztt6\" (UniqueName: \"kubernetes.io/projected/b20dd212-869a-4e0b-9556-71ca42353761-kube-api-access-fztt6\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\""
Apr 23 16:36:23.126542 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.126410 2573 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/b20dd212-869a-4e0b-9556-71ca42353761-config-out\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\""
Apr 23 16:36:23.250539 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.250498 2573 generic.go:358] "Generic (PLEG): container finished" podID="b20dd212-869a-4e0b-9556-71ca42353761" containerID="6baeec6d2c60889767650fa485b6b3e10b70302fd470fb212848971764faf939" exitCode=0
Apr 23 16:36:23.250719 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.250584 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b20dd212-869a-4e0b-9556-71ca42353761","Type":"ContainerDied","Data":"6baeec6d2c60889767650fa485b6b3e10b70302fd470fb212848971764faf939"}
Apr 23 16:36:23.250719 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.250615 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:23.250719 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.250630 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"b20dd212-869a-4e0b-9556-71ca42353761","Type":"ContainerDied","Data":"a4ee65b2cde0a1d2530f390a0265e6528ebfbce1b9c1e3d8f90a3119c49ab177"}
Apr 23 16:36:23.250719 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.250650 2573 scope.go:117] "RemoveContainer" containerID="9b56453f321de1390ecbf6639aaab661bc75bec4691401847f5e5532ce62e908"
Apr 23 16:36:23.259274 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.259256 2573 scope.go:117] "RemoveContainer" containerID="5d74fa0644e2a80d1a27a61e6cd86a625e8ceaa56b74fe4e2d580d55ea72593f"
Apr 23 16:36:23.269917 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.269893 2573 scope.go:117] "RemoveContainer" containerID="99915e0e5a97058e6eb4b4d43a91a64cc3b5afef3e814e0a3ed7a745e5186755"
Apr 23 16:36:23.275117 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.275051 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 16:36:23.279651 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.279548 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 16:36:23.282399 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.282377 2573 scope.go:117] "RemoveContainer" containerID="6baeec6d2c60889767650fa485b6b3e10b70302fd470fb212848971764faf939"
Apr 23 16:36:23.289206 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.289181 2573 scope.go:117] "RemoveContainer" containerID="a2dfe6f563099b5cf30dd3cf5811c122b51d4542bbb05106afc5cb709a05113d"
Apr 23 16:36:23.296173 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.296154 2573 scope.go:117] "RemoveContainer" containerID="86e8d38696836b68017f58a31139902a153884a67d27413c7f5ae85e5236f8f1"
Apr 23 16:36:23.302968 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.302950 2573 scope.go:117] "RemoveContainer" containerID="f8f0677c0a157d6ba0b1401f823626524fa5fa8c543e83cb57e4bd9b133248da"
Apr 23 16:36:23.309775 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.309741 2573 scope.go:117] "RemoveContainer" containerID="9b56453f321de1390ecbf6639aaab661bc75bec4691401847f5e5532ce62e908"
Apr 23 16:36:23.310109 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:36:23.310081 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9b56453f321de1390ecbf6639aaab661bc75bec4691401847f5e5532ce62e908\": container with ID starting with 9b56453f321de1390ecbf6639aaab661bc75bec4691401847f5e5532ce62e908 not found: ID does not exist" containerID="9b56453f321de1390ecbf6639aaab661bc75bec4691401847f5e5532ce62e908"
Apr 23 16:36:23.310323 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.310122 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9b56453f321de1390ecbf6639aaab661bc75bec4691401847f5e5532ce62e908"} err="failed to get container status \"9b56453f321de1390ecbf6639aaab661bc75bec4691401847f5e5532ce62e908\": rpc error: code = NotFound desc = could not find container \"9b56453f321de1390ecbf6639aaab661bc75bec4691401847f5e5532ce62e908\": container with ID starting with 9b56453f321de1390ecbf6639aaab661bc75bec4691401847f5e5532ce62e908 not found: ID does not exist"
Apr 23 16:36:23.310323 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.310176 2573 scope.go:117] "RemoveContainer" containerID="5d74fa0644e2a80d1a27a61e6cd86a625e8ceaa56b74fe4e2d580d55ea72593f"
Apr 23 16:36:23.310496 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:36:23.310478 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d74fa0644e2a80d1a27a61e6cd86a625e8ceaa56b74fe4e2d580d55ea72593f\": container with ID starting with 5d74fa0644e2a80d1a27a61e6cd86a625e8ceaa56b74fe4e2d580d55ea72593f not found: ID does not exist" containerID="5d74fa0644e2a80d1a27a61e6cd86a625e8ceaa56b74fe4e2d580d55ea72593f"
Apr 23 16:36:23.310542 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.310504 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d74fa0644e2a80d1a27a61e6cd86a625e8ceaa56b74fe4e2d580d55ea72593f"} err="failed to get container status \"5d74fa0644e2a80d1a27a61e6cd86a625e8ceaa56b74fe4e2d580d55ea72593f\": rpc error: code = NotFound desc = could not find container \"5d74fa0644e2a80d1a27a61e6cd86a625e8ceaa56b74fe4e2d580d55ea72593f\": container with ID starting with 5d74fa0644e2a80d1a27a61e6cd86a625e8ceaa56b74fe4e2d580d55ea72593f not found: ID does not exist"
Apr 23 16:36:23.310542 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.310521 2573 scope.go:117] "RemoveContainer" containerID="99915e0e5a97058e6eb4b4d43a91a64cc3b5afef3e814e0a3ed7a745e5186755"
Apr 23 16:36:23.310807 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:36:23.310771 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"99915e0e5a97058e6eb4b4d43a91a64cc3b5afef3e814e0a3ed7a745e5186755\": container with ID starting with 99915e0e5a97058e6eb4b4d43a91a64cc3b5afef3e814e0a3ed7a745e5186755 not found: ID does not exist" containerID="99915e0e5a97058e6eb4b4d43a91a64cc3b5afef3e814e0a3ed7a745e5186755"
Apr 23 16:36:23.310889 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.310814 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"99915e0e5a97058e6eb4b4d43a91a64cc3b5afef3e814e0a3ed7a745e5186755"} err="failed to get container status \"99915e0e5a97058e6eb4b4d43a91a64cc3b5afef3e814e0a3ed7a745e5186755\": rpc error: code = NotFound desc = could not find container \"99915e0e5a97058e6eb4b4d43a91a64cc3b5afef3e814e0a3ed7a745e5186755\": container with ID starting with 99915e0e5a97058e6eb4b4d43a91a64cc3b5afef3e814e0a3ed7a745e5186755 not found: ID does not exist"
Apr 23 16:36:23.310889 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.310836 2573 scope.go:117] "RemoveContainer" containerID="6baeec6d2c60889767650fa485b6b3e10b70302fd470fb212848971764faf939"
Apr 23 16:36:23.310991 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.310927 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 16:36:23.311093 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:36:23.311070 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6baeec6d2c60889767650fa485b6b3e10b70302fd470fb212848971764faf939\": container with ID starting with 6baeec6d2c60889767650fa485b6b3e10b70302fd470fb212848971764faf939 not found: ID does not exist" containerID="6baeec6d2c60889767650fa485b6b3e10b70302fd470fb212848971764faf939"
Apr 23 16:36:23.311152 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311104 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6baeec6d2c60889767650fa485b6b3e10b70302fd470fb212848971764faf939"} err="failed to get container status \"6baeec6d2c60889767650fa485b6b3e10b70302fd470fb212848971764faf939\": rpc error: code = NotFound desc = could not find container \"6baeec6d2c60889767650fa485b6b3e10b70302fd470fb212848971764faf939\": container with ID starting with 6baeec6d2c60889767650fa485b6b3e10b70302fd470fb212848971764faf939 not found: ID does not exist"
Apr 23 16:36:23.311152 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311126 2573 scope.go:117] "RemoveContainer" containerID="a2dfe6f563099b5cf30dd3cf5811c122b51d4542bbb05106afc5cb709a05113d"
Apr 23 16:36:23.311255 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311238 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="prom-label-proxy"
Apr 23 16:36:23.311302 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311259 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="prom-label-proxy"
Apr 23 16:36:23.311302 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311274 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="init-config-reloader"
Apr 23 16:36:23.311302 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311282 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="init-config-reloader"
Apr 23 16:36:23.311302 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311299 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="kube-rbac-proxy-metric"
Apr 23 16:36:23.311483 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311309 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="kube-rbac-proxy-metric"
Apr 23 16:36:23.311483 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311325 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="alertmanager"
Apr 23 16:36:23.311483 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311334 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="alertmanager"
Apr 23 16:36:23.311483 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311343 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="config-reloader"
Apr 23 16:36:23.311483 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311349 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="config-reloader"
Apr 23 16:36:23.311483 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311354 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="kube-rbac-proxy-web"
Apr 23 16:36:23.311483 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311359 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="kube-rbac-proxy-web"
Apr 23 16:36:23.311483 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311365 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="kube-rbac-proxy"
Apr 23 16:36:23.311483 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311370 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="kube-rbac-proxy"
Apr 23 16:36:23.311483 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:36:23.311391 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2dfe6f563099b5cf30dd3cf5811c122b51d4542bbb05106afc5cb709a05113d\": container with ID starting with a2dfe6f563099b5cf30dd3cf5811c122b51d4542bbb05106afc5cb709a05113d not found: ID does not exist" containerID="a2dfe6f563099b5cf30dd3cf5811c122b51d4542bbb05106afc5cb709a05113d"
Apr 23 16:36:23.311483 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311428 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="kube-rbac-proxy"
Apr 23 16:36:23.311483 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311442 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="kube-rbac-proxy-metric"
Apr 23 16:36:23.311483 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311449 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="prom-label-proxy"
Apr 23 16:36:23.311483 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311455 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="config-reloader"
Apr 23 16:36:23.311483 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311462 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="alertmanager"
Apr 23 16:36:23.311483 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311470 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b20dd212-869a-4e0b-9556-71ca42353761" containerName="kube-rbac-proxy-web"
Apr 23 16:36:23.312097 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311417 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2dfe6f563099b5cf30dd3cf5811c122b51d4542bbb05106afc5cb709a05113d"} err="failed to get container status \"a2dfe6f563099b5cf30dd3cf5811c122b51d4542bbb05106afc5cb709a05113d\": rpc error: code = NotFound desc = could not find container \"a2dfe6f563099b5cf30dd3cf5811c122b51d4542bbb05106afc5cb709a05113d\": container with ID starting with a2dfe6f563099b5cf30dd3cf5811c122b51d4542bbb05106afc5cb709a05113d not found: ID does not exist"
Apr 23 16:36:23.312097 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311521 2573 scope.go:117] "RemoveContainer" containerID="86e8d38696836b68017f58a31139902a153884a67d27413c7f5ae85e5236f8f1"
Apr 23 16:36:23.312097 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:36:23.311862 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86e8d38696836b68017f58a31139902a153884a67d27413c7f5ae85e5236f8f1\": container with ID starting with 86e8d38696836b68017f58a31139902a153884a67d27413c7f5ae85e5236f8f1 not found: ID does not exist" containerID="86e8d38696836b68017f58a31139902a153884a67d27413c7f5ae85e5236f8f1"
Apr 23 16:36:23.312097 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311885 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86e8d38696836b68017f58a31139902a153884a67d27413c7f5ae85e5236f8f1"} err="failed to get container status \"86e8d38696836b68017f58a31139902a153884a67d27413c7f5ae85e5236f8f1\": rpc error: code = NotFound desc = could not find container \"86e8d38696836b68017f58a31139902a153884a67d27413c7f5ae85e5236f8f1\": container with ID starting with 86e8d38696836b68017f58a31139902a153884a67d27413c7f5ae85e5236f8f1 not found: ID does not exist"
Apr 23 16:36:23.312097 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.311905 2573 scope.go:117] "RemoveContainer" containerID="f8f0677c0a157d6ba0b1401f823626524fa5fa8c543e83cb57e4bd9b133248da"
Apr 23 16:36:23.312257 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:36:23.312122 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8f0677c0a157d6ba0b1401f823626524fa5fa8c543e83cb57e4bd9b133248da\": container with ID starting with f8f0677c0a157d6ba0b1401f823626524fa5fa8c543e83cb57e4bd9b133248da not found: ID does not exist" containerID="f8f0677c0a157d6ba0b1401f823626524fa5fa8c543e83cb57e4bd9b133248da"
Apr 23 16:36:23.312257 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.312140 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8f0677c0a157d6ba0b1401f823626524fa5fa8c543e83cb57e4bd9b133248da"} err="failed to get container status \"f8f0677c0a157d6ba0b1401f823626524fa5fa8c543e83cb57e4bd9b133248da\": rpc error: code = NotFound desc = could not find container \"f8f0677c0a157d6ba0b1401f823626524fa5fa8c543e83cb57e4bd9b133248da\": container with ID starting with f8f0677c0a157d6ba0b1401f823626524fa5fa8c543e83cb57e4bd9b133248da not found: ID does not exist"
Apr 23 16:36:23.316410 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.316393 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:23.321683 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.321664 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\""
Apr 23 16:36:23.321683 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.321675 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\""
Apr 23 16:36:23.321943 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.321926 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\""
Apr 23 16:36:23.322023 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.321945 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\""
Apr 23 16:36:23.322023 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.321953 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\""
Apr 23 16:36:23.322222 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.322205 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\""
Apr 23 16:36:23.322275 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.322211 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-5gm2b\""
Apr 23 16:36:23.322561 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.322460 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\""
Apr 23 16:36:23.324052 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.324029 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\""
Apr 23 16:36:23.329240 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.329221 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\""
Apr 23 16:36:23.332043 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.332021 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Apr 23 16:36:23.428203 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.428169 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f1188671-4e6b-4695-8519-525d5a1559ed-web-config\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:23.428203 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.428215 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f1188671-4e6b-4695-8519-525d5a1559ed-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:23.428533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.428233 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1188671-4e6b-4695-8519-525d5a1559ed-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:23.428533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.428254 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f1188671-4e6b-4695-8519-525d5a1559ed-config-volume\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:23.428533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.428272 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f1188671-4e6b-4695-8519-525d5a1559ed-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:23.428533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.428300 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f1188671-4e6b-4695-8519-525d5a1559ed-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:23.428533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.428330 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f1188671-4e6b-4695-8519-525d5a1559ed-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0"
Apr 23 16:36:23.428533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.428345 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztvxt\" (UniqueName: \"kubernetes.io/projected/f1188671-4e6b-4695-8519-525d5a1559ed-kube-api-access-ztvxt\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\")
" pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.428533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.428433 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f1188671-4e6b-4695-8519-525d5a1559ed-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.428533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.428477 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1188671-4e6b-4695-8519-525d5a1559ed-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.428533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.428533 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f1188671-4e6b-4695-8519-525d5a1559ed-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.428889 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.428576 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f1188671-4e6b-4695-8519-525d5a1559ed-config-out\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.428889 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.428602 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f1188671-4e6b-4695-8519-525d5a1559ed-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.529412 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.529322 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1188671-4e6b-4695-8519-525d5a1559ed-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.529412 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.529363 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f1188671-4e6b-4695-8519-525d5a1559ed-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.529412 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.529386 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f1188671-4e6b-4695-8519-525d5a1559ed-config-out\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.529412 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.529404 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f1188671-4e6b-4695-8519-525d5a1559ed-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 
23 16:36:23.529730 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.529426 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f1188671-4e6b-4695-8519-525d5a1559ed-web-config\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.529730 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.529459 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f1188671-4e6b-4695-8519-525d5a1559ed-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.529730 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.529475 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1188671-4e6b-4695-8519-525d5a1559ed-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.529730 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.529498 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f1188671-4e6b-4695-8519-525d5a1559ed-config-volume\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.529730 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.529522 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f1188671-4e6b-4695-8519-525d5a1559ed-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.529972 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.529941 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f1188671-4e6b-4695-8519-525d5a1559ed-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.530027 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.529996 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f1188671-4e6b-4695-8519-525d5a1559ed-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.530076 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.530026 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ztvxt\" (UniqueName: \"kubernetes.io/projected/f1188671-4e6b-4695-8519-525d5a1559ed-kube-api-access-ztvxt\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.530130 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.530078 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f1188671-4e6b-4695-8519-525d5a1559ed-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.530379 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.530349 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f1188671-4e6b-4695-8519-525d5a1559ed-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: 
\"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.530500 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.530473 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f1188671-4e6b-4695-8519-525d5a1559ed-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.530648 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.530621 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/f1188671-4e6b-4695-8519-525d5a1559ed-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.532479 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.532431 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f1188671-4e6b-4695-8519-525d5a1559ed-config-out\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.532643 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.532623 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f1188671-4e6b-4695-8519-525d5a1559ed-web-config\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.533150 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.533126 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f1188671-4e6b-4695-8519-525d5a1559ed-secret-alertmanager-kube-rbac-proxy-web\") 
pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.533286 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.533151 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/f1188671-4e6b-4695-8519-525d5a1559ed-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.533286 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.533208 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/f1188671-4e6b-4695-8519-525d5a1559ed-config-volume\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.533286 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.533213 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/f1188671-4e6b-4695-8519-525d5a1559ed-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.533463 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.533359 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/f1188671-4e6b-4695-8519-525d5a1559ed-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.533463 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.533428 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/f1188671-4e6b-4695-8519-525d5a1559ed-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.534200 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.534181 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f1188671-4e6b-4695-8519-525d5a1559ed-tls-assets\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.539405 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.539383 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztvxt\" (UniqueName: \"kubernetes.io/projected/f1188671-4e6b-4695-8519-525d5a1559ed-kube-api-access-ztvxt\") pod \"alertmanager-main-0\" (UID: \"f1188671-4e6b-4695-8519-525d5a1559ed\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.625744 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.625710 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 16:36:23.754274 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:23.754239 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 16:36:23.758660 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:36:23.758632 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf1188671_4e6b_4695_8519_525d5a1559ed.slice/crio-90e57f075798985c6e28406a445a8ffe63cecd352612a34189f763de59a836a9 WatchSource:0}: Error finding container 90e57f075798985c6e28406a445a8ffe63cecd352612a34189f763de59a836a9: Status 404 returned error can't find the container with id 90e57f075798985c6e28406a445a8ffe63cecd352612a34189f763de59a836a9 Apr 23 16:36:24.254517 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:24.254483 2573 generic.go:358] "Generic (PLEG): container finished" podID="f1188671-4e6b-4695-8519-525d5a1559ed" containerID="b4792dbf47c62edf418ab05aae328adafb4cddebca66628a639dae5d1bae6ce2" exitCode=0 Apr 23 16:36:24.254963 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:24.254571 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f1188671-4e6b-4695-8519-525d5a1559ed","Type":"ContainerDied","Data":"b4792dbf47c62edf418ab05aae328adafb4cddebca66628a639dae5d1bae6ce2"} Apr 23 16:36:24.254963 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:24.254607 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f1188671-4e6b-4695-8519-525d5a1559ed","Type":"ContainerStarted","Data":"90e57f075798985c6e28406a445a8ffe63cecd352612a34189f763de59a836a9"} Apr 23 16:36:24.879965 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:24.879868 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b20dd212-869a-4e0b-9556-71ca42353761" 
path="/var/lib/kubelet/pods/b20dd212-869a-4e0b-9556-71ca42353761/volumes" Apr 23 16:36:25.261410 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:25.261374 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f1188671-4e6b-4695-8519-525d5a1559ed","Type":"ContainerStarted","Data":"d92e4f962183f73d5eb92218b8c9c6f6e0eeda20891db40f9562738164bef71f"} Apr 23 16:36:25.261410 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:25.261411 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f1188671-4e6b-4695-8519-525d5a1559ed","Type":"ContainerStarted","Data":"41880604931d52adab9b3220c48e6b60d62596624c034c63fd18ef521e49a9af"} Apr 23 16:36:25.261837 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:25.261421 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f1188671-4e6b-4695-8519-525d5a1559ed","Type":"ContainerStarted","Data":"1ada4f29824af8c33726c134f746e8530e90e089ab69a789510e5496c6a35283"} Apr 23 16:36:25.261837 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:25.261430 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f1188671-4e6b-4695-8519-525d5a1559ed","Type":"ContainerStarted","Data":"476e947754963766b7c252c67a5556172ab8e449a9c273fde3b645cd9a953c4e"} Apr 23 16:36:25.261837 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:25.261440 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"f1188671-4e6b-4695-8519-525d5a1559ed","Type":"ContainerStarted","Data":"538f232c1bd1f65f4f721bf49e4de229b0e535f1ed9e91f2f1cafb0da8b26a64"} Apr 23 16:36:25.261837 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:25.261448 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"f1188671-4e6b-4695-8519-525d5a1559ed","Type":"ContainerStarted","Data":"126dcbf2cca3ce27079e29ff866762a6bf55c9cd0b3c4b49131d4eec62a279a0"} Apr 23 16:36:25.292506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:25.292446 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.2924292 podStartE2EDuration="2.2924292s" podCreationTimestamp="2026-04-23 16:36:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:36:25.289400526 +0000 UTC m=+90.979186872" watchObservedRunningTime="2026-04-23 16:36:25.2924292 +0000 UTC m=+90.982215547" Apr 23 16:36:25.909815 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:25.909779 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:36:25.910369 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:25.910214 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="prometheus" containerID="cri-o://1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8" gracePeriod=600 Apr 23 16:36:25.910369 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:25.910225 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="kube-rbac-proxy" containerID="cri-o://43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58" gracePeriod=600 Apr 23 16:36:25.910369 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:25.910258 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="thanos-sidecar" 
containerID="cri-o://d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048" gracePeriod=600 Apr 23 16:36:25.910369 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:25.910276 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="config-reloader" containerID="cri-o://fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99" gracePeriod=600 Apr 23 16:36:25.910369 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:25.910357 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="kube-rbac-proxy-thanos" containerID="cri-o://2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219" gracePeriod=600 Apr 23 16:36:25.910844 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:25.910259 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="kube-rbac-proxy-web" containerID="cri-o://4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740" gracePeriod=600 Apr 23 16:36:26.157396 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.157372 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.251117 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.251021 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " Apr 23 16:36:26.251117 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.251084 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zd7p2\" (UniqueName: \"kubernetes.io/projected/f8c95658-88b6-40ea-8e55-36e1b9b16d38-kube-api-access-zd7p2\") pod \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " Apr 23 16:36:26.251117 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.251115 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-configmap-kubelet-serving-ca-bundle\") pod \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " Apr 23 16:36:26.251379 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.251143 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " Apr 23 16:36:26.251379 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.251166 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-web-config\") pod 
\"f8c95658-88b6-40ea-8e55-36e1b9b16d38\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " Apr 23 16:36:26.251379 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.251213 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f8c95658-88b6-40ea-8e55-36e1b9b16d38-tls-assets\") pod \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " Apr 23 16:36:26.251379 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.251239 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-grpc-tls\") pod \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " Apr 23 16:36:26.251379 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.251262 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-metrics-client-certs\") pod \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " Apr 23 16:36:26.251379 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.251289 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-thanos-prometheus-http-client-file\") pod \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " Apr 23 16:36:26.251379 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.251326 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-prometheus-k8s-tls\") pod \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\" (UID: 
\"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " Apr 23 16:36:26.251379 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.251352 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f8c95658-88b6-40ea-8e55-36e1b9b16d38-config-out\") pod \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " Apr 23 16:36:26.251802 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.251382 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-prometheus-k8s-rulefiles-0\") pod \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " Apr 23 16:36:26.251802 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.251409 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-prometheus-trusted-ca-bundle\") pod \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " Apr 23 16:36:26.251802 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.251437 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-config\") pod \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " Apr 23 16:36:26.251802 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.251477 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-kube-rbac-proxy\") pod \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " Apr 23 16:36:26.251802 ip-10-0-137-14 
kubenswrapper[2573]: I0423 16:36:26.251507 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-configmap-metrics-client-ca\") pod \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " Apr 23 16:36:26.251802 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.251535 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-configmap-serving-certs-ca-bundle\") pod \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " Apr 23 16:36:26.251802 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.251561 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f8c95658-88b6-40ea-8e55-36e1b9b16d38-prometheus-k8s-db\") pod \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\" (UID: \"f8c95658-88b6-40ea-8e55-36e1b9b16d38\") " Apr 23 16:36:26.251802 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.251591 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "f8c95658-88b6-40ea-8e55-36e1b9b16d38" (UID: "f8c95658-88b6-40ea-8e55-36e1b9b16d38"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:36:26.252191 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.251924 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:36:26.252478 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.252455 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8c95658-88b6-40ea-8e55-36e1b9b16d38-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "f8c95658-88b6-40ea-8e55-36e1b9b16d38" (UID: "f8c95658-88b6-40ea-8e55-36e1b9b16d38"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:36:26.252918 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.252895 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "f8c95658-88b6-40ea-8e55-36e1b9b16d38" (UID: "f8c95658-88b6-40ea-8e55-36e1b9b16d38"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:36:26.253138 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.253110 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "f8c95658-88b6-40ea-8e55-36e1b9b16d38" (UID: "f8c95658-88b6-40ea-8e55-36e1b9b16d38"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:36:26.254233 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.253947 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "f8c95658-88b6-40ea-8e55-36e1b9b16d38" (UID: "f8c95658-88b6-40ea-8e55-36e1b9b16d38"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:36:26.254233 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.254111 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8c95658-88b6-40ea-8e55-36e1b9b16d38-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "f8c95658-88b6-40ea-8e55-36e1b9b16d38" (UID: "f8c95658-88b6-40ea-8e55-36e1b9b16d38"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:36:26.254625 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.254431 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "f8c95658-88b6-40ea-8e55-36e1b9b16d38" (UID: "f8c95658-88b6-40ea-8e55-36e1b9b16d38"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:36:26.254625 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.254448 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "f8c95658-88b6-40ea-8e55-36e1b9b16d38" (UID: "f8c95658-88b6-40ea-8e55-36e1b9b16d38"). 
InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:36:26.254625 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.254568 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "f8c95658-88b6-40ea-8e55-36e1b9b16d38" (UID: "f8c95658-88b6-40ea-8e55-36e1b9b16d38"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:36:26.254806 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.254655 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8c95658-88b6-40ea-8e55-36e1b9b16d38-kube-api-access-zd7p2" (OuterVolumeSpecName: "kube-api-access-zd7p2") pod "f8c95658-88b6-40ea-8e55-36e1b9b16d38" (UID: "f8c95658-88b6-40ea-8e55-36e1b9b16d38"). InnerVolumeSpecName "kube-api-access-zd7p2". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:36:26.255392 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.255354 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "f8c95658-88b6-40ea-8e55-36e1b9b16d38" (UID: "f8c95658-88b6-40ea-8e55-36e1b9b16d38"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:36:26.255489 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.255468 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-config" (OuterVolumeSpecName: "config") pod "f8c95658-88b6-40ea-8e55-36e1b9b16d38" (UID: "f8c95658-88b6-40ea-8e55-36e1b9b16d38"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:36:26.255884 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.255859 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f8c95658-88b6-40ea-8e55-36e1b9b16d38-config-out" (OuterVolumeSpecName: "config-out") pod "f8c95658-88b6-40ea-8e55-36e1b9b16d38" (UID: "f8c95658-88b6-40ea-8e55-36e1b9b16d38"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:36:26.256134 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.256113 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "f8c95658-88b6-40ea-8e55-36e1b9b16d38" (UID: "f8c95658-88b6-40ea-8e55-36e1b9b16d38"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:36:26.256199 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.256167 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "f8c95658-88b6-40ea-8e55-36e1b9b16d38" (UID: "f8c95658-88b6-40ea-8e55-36e1b9b16d38"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:36:26.256561 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.256531 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "f8c95658-88b6-40ea-8e55-36e1b9b16d38" (UID: "f8c95658-88b6-40ea-8e55-36e1b9b16d38"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:36:26.256650 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.256618 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "f8c95658-88b6-40ea-8e55-36e1b9b16d38" (UID: "f8c95658-88b6-40ea-8e55-36e1b9b16d38"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:36:26.264176 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.264150 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-web-config" (OuterVolumeSpecName: "web-config") pod "f8c95658-88b6-40ea-8e55-36e1b9b16d38" (UID: "f8c95658-88b6-40ea-8e55-36e1b9b16d38"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:36:26.267506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.267483 2573 generic.go:358] "Generic (PLEG): container finished" podID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerID="2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219" exitCode=0 Apr 23 16:36:26.267506 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.267503 2573 generic.go:358] "Generic (PLEG): container finished" podID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerID="43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58" exitCode=0 Apr 23 16:36:26.267666 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.267510 2573 generic.go:358] "Generic (PLEG): container finished" podID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerID="4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740" exitCode=0 Apr 23 16:36:26.267666 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.267516 2573 generic.go:358] "Generic (PLEG): container finished" podID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" 
containerID="d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048" exitCode=0 Apr 23 16:36:26.267666 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.267522 2573 generic.go:358] "Generic (PLEG): container finished" podID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerID="fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99" exitCode=0 Apr 23 16:36:26.267666 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.267527 2573 generic.go:358] "Generic (PLEG): container finished" podID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerID="1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8" exitCode=0 Apr 23 16:36:26.267666 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.267563 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f8c95658-88b6-40ea-8e55-36e1b9b16d38","Type":"ContainerDied","Data":"2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219"} Apr 23 16:36:26.267666 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.267581 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.267666 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.267603 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f8c95658-88b6-40ea-8e55-36e1b9b16d38","Type":"ContainerDied","Data":"43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58"} Apr 23 16:36:26.267666 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.267616 2573 scope.go:117] "RemoveContainer" containerID="2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219" Apr 23 16:36:26.267666 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.267618 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f8c95658-88b6-40ea-8e55-36e1b9b16d38","Type":"ContainerDied","Data":"4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740"} Apr 23 16:36:26.267666 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.267633 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f8c95658-88b6-40ea-8e55-36e1b9b16d38","Type":"ContainerDied","Data":"d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048"} Apr 23 16:36:26.267666 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.267650 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f8c95658-88b6-40ea-8e55-36e1b9b16d38","Type":"ContainerDied","Data":"fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99"} Apr 23 16:36:26.267666 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.267664 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f8c95658-88b6-40ea-8e55-36e1b9b16d38","Type":"ContainerDied","Data":"1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8"} Apr 23 16:36:26.268164 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.267678 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"f8c95658-88b6-40ea-8e55-36e1b9b16d38","Type":"ContainerDied","Data":"ee6a37b1d7abf4a7da393d0a2688d97e9d449cb99850d4f6f613aa8657931ee3"} Apr 23 16:36:26.275092 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.274940 2573 scope.go:117] "RemoveContainer" containerID="43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58" Apr 23 16:36:26.282673 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.282651 2573 scope.go:117] "RemoveContainer" containerID="4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740" Apr 23 16:36:26.288828 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.288809 2573 scope.go:117] "RemoveContainer" containerID="d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048" Apr 23 16:36:26.291770 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.291752 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:36:26.297421 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.297399 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:36:26.298364 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.298344 2573 scope.go:117] "RemoveContainer" containerID="fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99" Apr 23 16:36:26.304520 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.304494 2573 scope.go:117] "RemoveContainer" containerID="1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8" Apr 23 16:36:26.310922 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.310907 2573 scope.go:117] "RemoveContainer" containerID="0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2" Apr 23 16:36:26.316771 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.316756 2573 scope.go:117] "RemoveContainer" containerID="2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219" Apr 23 
16:36:26.317011 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:36:26.316994 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219\": container with ID starting with 2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219 not found: ID does not exist" containerID="2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219" Apr 23 16:36:26.317059 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.317020 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219"} err="failed to get container status \"2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219\": rpc error: code = NotFound desc = could not find container \"2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219\": container with ID starting with 2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219 not found: ID does not exist" Apr 23 16:36:26.317059 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.317039 2573 scope.go:117] "RemoveContainer" containerID="43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58" Apr 23 16:36:26.317252 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:36:26.317236 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58\": container with ID starting with 43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58 not found: ID does not exist" containerID="43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58" Apr 23 16:36:26.317302 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.317257 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58"} err="failed to get container status \"43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58\": rpc error: code = NotFound desc = could not find container \"43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58\": container with ID starting with 43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58 not found: ID does not exist" Apr 23 16:36:26.317302 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.317273 2573 scope.go:117] "RemoveContainer" containerID="4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740" Apr 23 16:36:26.317455 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:36:26.317440 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740\": container with ID starting with 4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740 not found: ID does not exist" containerID="4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740" Apr 23 16:36:26.317491 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.317458 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740"} err="failed to get container status \"4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740\": rpc error: code = NotFound desc = could not find container \"4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740\": container with ID starting with 4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740 not found: ID does not exist" Apr 23 16:36:26.317491 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.317469 2573 scope.go:117] "RemoveContainer" containerID="d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048" Apr 23 16:36:26.317636 ip-10-0-137-14 
kubenswrapper[2573]: E0423 16:36:26.317622 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048\": container with ID starting with d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048 not found: ID does not exist" containerID="d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048" Apr 23 16:36:26.317678 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.317639 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048"} err="failed to get container status \"d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048\": rpc error: code = NotFound desc = could not find container \"d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048\": container with ID starting with d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048 not found: ID does not exist" Apr 23 16:36:26.317678 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.317651 2573 scope.go:117] "RemoveContainer" containerID="fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99" Apr 23 16:36:26.317878 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:36:26.317864 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99\": container with ID starting with fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99 not found: ID does not exist" containerID="fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99" Apr 23 16:36:26.317921 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.317881 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99"} err="failed to 
get container status \"fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99\": rpc error: code = NotFound desc = could not find container \"fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99\": container with ID starting with fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99 not found: ID does not exist" Apr 23 16:36:26.317921 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.317893 2573 scope.go:117] "RemoveContainer" containerID="1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8" Apr 23 16:36:26.318107 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:36:26.318090 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8\": container with ID starting with 1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8 not found: ID does not exist" containerID="1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8" Apr 23 16:36:26.318145 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.318113 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8"} err="failed to get container status \"1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8\": rpc error: code = NotFound desc = could not find container \"1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8\": container with ID starting with 1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8 not found: ID does not exist" Apr 23 16:36:26.318145 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.318129 2573 scope.go:117] "RemoveContainer" containerID="0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2" Apr 23 16:36:26.318336 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:36:26.318321 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2\": container with ID starting with 0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2 not found: ID does not exist" containerID="0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2" Apr 23 16:36:26.318370 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.318341 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2"} err="failed to get container status \"0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2\": rpc error: code = NotFound desc = could not find container \"0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2\": container with ID starting with 0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2 not found: ID does not exist" Apr 23 16:36:26.318370 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.318358 2573 scope.go:117] "RemoveContainer" containerID="2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219" Apr 23 16:36:26.318576 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.318557 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219"} err="failed to get container status \"2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219\": rpc error: code = NotFound desc = could not find container \"2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219\": container with ID starting with 2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219 not found: ID does not exist" Apr 23 16:36:26.318618 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.318577 2573 scope.go:117] "RemoveContainer" containerID="43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58" Apr 23 16:36:26.318847 ip-10-0-137-14 kubenswrapper[2573]: I0423 
16:36:26.318829 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58"} err="failed to get container status \"43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58\": rpc error: code = NotFound desc = could not find container \"43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58\": container with ID starting with 43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58 not found: ID does not exist" Apr 23 16:36:26.318898 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.318848 2573 scope.go:117] "RemoveContainer" containerID="4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740" Apr 23 16:36:26.319049 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.319030 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740"} err="failed to get container status \"4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740\": rpc error: code = NotFound desc = could not find container \"4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740\": container with ID starting with 4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740 not found: ID does not exist" Apr 23 16:36:26.319097 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.319049 2573 scope.go:117] "RemoveContainer" containerID="d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048" Apr 23 16:36:26.319253 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.319238 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048"} err="failed to get container status \"d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048\": rpc error: code = NotFound desc = could not find container 
\"d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048\": container with ID starting with d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048 not found: ID does not exist" Apr 23 16:36:26.319306 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.319253 2573 scope.go:117] "RemoveContainer" containerID="fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99" Apr 23 16:36:26.319496 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.319472 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99"} err="failed to get container status \"fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99\": rpc error: code = NotFound desc = could not find container \"fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99\": container with ID starting with fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99 not found: ID does not exist" Apr 23 16:36:26.319496 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.319492 2573 scope.go:117] "RemoveContainer" containerID="1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8" Apr 23 16:36:26.319775 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.319755 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8"} err="failed to get container status \"1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8\": rpc error: code = NotFound desc = could not find container \"1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8\": container with ID starting with 1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8 not found: ID does not exist" Apr 23 16:36:26.319775 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.319774 2573 scope.go:117] "RemoveContainer" 
containerID="0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2" Apr 23 16:36:26.320061 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.320028 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2"} err="failed to get container status \"0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2\": rpc error: code = NotFound desc = could not find container \"0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2\": container with ID starting with 0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2 not found: ID does not exist" Apr 23 16:36:26.320061 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.320062 2573 scope.go:117] "RemoveContainer" containerID="2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219" Apr 23 16:36:26.320303 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.320284 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219"} err="failed to get container status \"2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219\": rpc error: code = NotFound desc = could not find container \"2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219\": container with ID starting with 2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219 not found: ID does not exist" Apr 23 16:36:26.320303 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.320303 2573 scope.go:117] "RemoveContainer" containerID="43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58" Apr 23 16:36:26.320499 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.320484 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58"} err="failed to get container status 
\"43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58\": rpc error: code = NotFound desc = could not find container \"43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58\": container with ID starting with 43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58 not found: ID does not exist" Apr 23 16:36:26.320542 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.320500 2573 scope.go:117] "RemoveContainer" containerID="4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740" Apr 23 16:36:26.320722 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.320684 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740"} err="failed to get container status \"4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740\": rpc error: code = NotFound desc = could not find container \"4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740\": container with ID starting with 4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740 not found: ID does not exist" Apr 23 16:36:26.320789 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.320723 2573 scope.go:117] "RemoveContainer" containerID="d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048" Apr 23 16:36:26.320944 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.320926 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048"} err="failed to get container status \"d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048\": rpc error: code = NotFound desc = could not find container \"d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048\": container with ID starting with d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048 not found: ID does not exist" Apr 23 16:36:26.321004 ip-10-0-137-14 
kubenswrapper[2573]: I0423 16:36:26.320944 2573 scope.go:117] "RemoveContainer" containerID="fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99" Apr 23 16:36:26.321164 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.321141 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99"} err="failed to get container status \"fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99\": rpc error: code = NotFound desc = could not find container \"fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99\": container with ID starting with fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99 not found: ID does not exist" Apr 23 16:36:26.321212 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.321164 2573 scope.go:117] "RemoveContainer" containerID="1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8" Apr 23 16:36:26.321362 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.321343 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8"} err="failed to get container status \"1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8\": rpc error: code = NotFound desc = could not find container \"1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8\": container with ID starting with 1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8 not found: ID does not exist" Apr 23 16:36:26.321362 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.321362 2573 scope.go:117] "RemoveContainer" containerID="0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2" Apr 23 16:36:26.321555 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.321541 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2"} err="failed to get container status \"0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2\": rpc error: code = NotFound desc = could not find container \"0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2\": container with ID starting with 0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2 not found: ID does not exist" Apr 23 16:36:26.321604 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.321555 2573 scope.go:117] "RemoveContainer" containerID="2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219" Apr 23 16:36:26.321850 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.321831 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219"} err="failed to get container status \"2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219\": rpc error: code = NotFound desc = could not find container \"2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219\": container with ID starting with 2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219 not found: ID does not exist" Apr 23 16:36:26.321927 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.321858 2573 scope.go:117] "RemoveContainer" containerID="43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58" Apr 23 16:36:26.322061 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.322044 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58"} err="failed to get container status \"43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58\": rpc error: code = NotFound desc = could not find container \"43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58\": container with ID starting with 
43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58 not found: ID does not exist" Apr 23 16:36:26.322122 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.322063 2573 scope.go:117] "RemoveContainer" containerID="4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740" Apr 23 16:36:26.322316 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.322300 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740"} err="failed to get container status \"4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740\": rpc error: code = NotFound desc = could not find container \"4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740\": container with ID starting with 4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740 not found: ID does not exist" Apr 23 16:36:26.322367 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.322317 2573 scope.go:117] "RemoveContainer" containerID="d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048" Apr 23 16:36:26.322512 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.322495 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048"} err="failed to get container status \"d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048\": rpc error: code = NotFound desc = could not find container \"d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048\": container with ID starting with d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048 not found: ID does not exist" Apr 23 16:36:26.322556 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.322513 2573 scope.go:117] "RemoveContainer" containerID="fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99" Apr 23 16:36:26.322729 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.322711 2573 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99"} err="failed to get container status \"fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99\": rpc error: code = NotFound desc = could not find container \"fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99\": container with ID starting with fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99 not found: ID does not exist" Apr 23 16:36:26.322773 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.322730 2573 scope.go:117] "RemoveContainer" containerID="1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8" Apr 23 16:36:26.322927 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.322909 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8"} err="failed to get container status \"1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8\": rpc error: code = NotFound desc = could not find container \"1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8\": container with ID starting with 1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8 not found: ID does not exist" Apr 23 16:36:26.323001 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.322929 2573 scope.go:117] "RemoveContainer" containerID="0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2" Apr 23 16:36:26.323164 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.323145 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2"} err="failed to get container status \"0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2\": rpc error: code = NotFound desc = could not find container 
\"0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2\": container with ID starting with 0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2 not found: ID does not exist" Apr 23 16:36:26.323215 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.323166 2573 scope.go:117] "RemoveContainer" containerID="2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219" Apr 23 16:36:26.323365 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.323348 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219"} err="failed to get container status \"2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219\": rpc error: code = NotFound desc = could not find container \"2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219\": container with ID starting with 2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219 not found: ID does not exist" Apr 23 16:36:26.323403 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.323365 2573 scope.go:117] "RemoveContainer" containerID="43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58" Apr 23 16:36:26.323561 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.323545 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58"} err="failed to get container status \"43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58\": rpc error: code = NotFound desc = could not find container \"43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58\": container with ID starting with 43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58 not found: ID does not exist" Apr 23 16:36:26.323613 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.323562 2573 scope.go:117] "RemoveContainer" 
containerID="4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740" Apr 23 16:36:26.323769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.323751 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740"} err="failed to get container status \"4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740\": rpc error: code = NotFound desc = could not find container \"4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740\": container with ID starting with 4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740 not found: ID does not exist" Apr 23 16:36:26.323836 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.323771 2573 scope.go:117] "RemoveContainer" containerID="d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048" Apr 23 16:36:26.324006 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.323990 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048"} err="failed to get container status \"d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048\": rpc error: code = NotFound desc = could not find container \"d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048\": container with ID starting with d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048 not found: ID does not exist" Apr 23 16:36:26.324055 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.324007 2573 scope.go:117] "RemoveContainer" containerID="fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99" Apr 23 16:36:26.324200 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.324182 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99"} err="failed to get container status 
\"fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99\": rpc error: code = NotFound desc = could not find container \"fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99\": container with ID starting with fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99 not found: ID does not exist" Apr 23 16:36:26.324258 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.324201 2573 scope.go:117] "RemoveContainer" containerID="1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8" Apr 23 16:36:26.324406 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.324391 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8"} err="failed to get container status \"1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8\": rpc error: code = NotFound desc = could not find container \"1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8\": container with ID starting with 1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8 not found: ID does not exist" Apr 23 16:36:26.324450 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.324406 2573 scope.go:117] "RemoveContainer" containerID="0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2" Apr 23 16:36:26.324588 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.324573 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2"} err="failed to get container status \"0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2\": rpc error: code = NotFound desc = could not find container \"0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2\": container with ID starting with 0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2 not found: ID does not exist" Apr 23 16:36:26.324653 ip-10-0-137-14 
kubenswrapper[2573]: I0423 16:36:26.324590 2573 scope.go:117] "RemoveContainer" containerID="2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219" Apr 23 16:36:26.324806 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.324791 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219"} err="failed to get container status \"2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219\": rpc error: code = NotFound desc = could not find container \"2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219\": container with ID starting with 2d56b6656d01458750d1e955d87b443582845d7e43c76d4aa1b3fbcbb13cd219 not found: ID does not exist" Apr 23 16:36:26.324857 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.324807 2573 scope.go:117] "RemoveContainer" containerID="43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58" Apr 23 16:36:26.324998 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.324982 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58"} err="failed to get container status \"43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58\": rpc error: code = NotFound desc = could not find container \"43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58\": container with ID starting with 43304659f931e2eb73209136ef092e00bcf913732b84126ab3bdd87877558b58 not found: ID does not exist" Apr 23 16:36:26.325042 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.324998 2573 scope.go:117] "RemoveContainer" containerID="4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740" Apr 23 16:36:26.325191 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.325176 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740"} err="failed to get container status \"4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740\": rpc error: code = NotFound desc = could not find container \"4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740\": container with ID starting with 4579e007465f6b651055b980460074bf5a3ce2d330e94d000e56aebc2515d740 not found: ID does not exist" Apr 23 16:36:26.325239 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.325190 2573 scope.go:117] "RemoveContainer" containerID="d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048" Apr 23 16:36:26.325377 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.325359 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048"} err="failed to get container status \"d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048\": rpc error: code = NotFound desc = could not find container \"d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048\": container with ID starting with d20b199bca4d7e31deaf4aa70aeaa08c74fd774543f591d41112967de1d1a048 not found: ID does not exist" Apr 23 16:36:26.325443 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.325379 2573 scope.go:117] "RemoveContainer" containerID="fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99" Apr 23 16:36:26.325582 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.325567 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99"} err="failed to get container status \"fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99\": rpc error: code = NotFound desc = could not find container \"fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99\": container with ID starting with 
fc8d458683fb0e9184be9ecf3404c53aeacfaefe5511d4afa2cce1a426bb3e99 not found: ID does not exist" Apr 23 16:36:26.325623 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.325582 2573 scope.go:117] "RemoveContainer" containerID="1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8" Apr 23 16:36:26.325775 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.325754 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8"} err="failed to get container status \"1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8\": rpc error: code = NotFound desc = could not find container \"1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8\": container with ID starting with 1f089ab235526ecde8f07d3219e4262848a4f40074ff355984ca501b2149d3e8 not found: ID does not exist" Apr 23 16:36:26.325843 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.325778 2573 scope.go:117] "RemoveContainer" containerID="0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2" Apr 23 16:36:26.325972 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.325958 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2"} err="failed to get container status \"0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2\": rpc error: code = NotFound desc = could not find container \"0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2\": container with ID starting with 0a2961aaead309eeff8e619c6b3b471b2f1a4f75b72febad7ab5b6deafbb57b2 not found: ID does not exist" Apr 23 16:36:26.333149 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.333129 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:36:26.333394 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.333383 2573 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="thanos-sidecar" Apr 23 16:36:26.333440 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.333396 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="thanos-sidecar" Apr 23 16:36:26.333440 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.333405 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="kube-rbac-proxy-thanos" Apr 23 16:36:26.333440 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.333410 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="kube-rbac-proxy-thanos" Apr 23 16:36:26.333440 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.333418 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="config-reloader" Apr 23 16:36:26.333440 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.333423 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="config-reloader" Apr 23 16:36:26.333440 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.333430 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="kube-rbac-proxy-web" Apr 23 16:36:26.333440 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.333436 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="kube-rbac-proxy-web" Apr 23 16:36:26.333440 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.333441 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="kube-rbac-proxy" Apr 23 16:36:26.333654 ip-10-0-137-14 kubenswrapper[2573]: 
I0423 16:36:26.333447 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="kube-rbac-proxy" Apr 23 16:36:26.333654 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.333456 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="prometheus" Apr 23 16:36:26.333654 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.333461 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="prometheus" Apr 23 16:36:26.333654 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.333467 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="init-config-reloader" Apr 23 16:36:26.333654 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.333472 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="init-config-reloader" Apr 23 16:36:26.333654 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.333516 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="kube-rbac-proxy-thanos" Apr 23 16:36:26.333654 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.333523 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="config-reloader" Apr 23 16:36:26.333654 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.333532 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="prometheus" Apr 23 16:36:26.333654 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.333562 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="thanos-sidecar" Apr 23 16:36:26.333654 ip-10-0-137-14 kubenswrapper[2573]: I0423 
16:36:26.333568 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="kube-rbac-proxy" Apr 23 16:36:26.333654 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.333575 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" containerName="kube-rbac-proxy-web" Apr 23 16:36:26.338374 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.338360 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.340660 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.340638 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 23 16:36:26.340782 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.340675 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 16:36:26.340782 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.340672 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 23 16:36:26.340782 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.340729 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8n72m0o1i5hvc\"" Apr 23 16:36:26.341056 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.341027 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 23 16:36:26.341160 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.341142 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 23 16:36:26.341231 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.341172 2573 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 23 16:36:26.341328 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.341250 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 23 16:36:26.341328 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.341251 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 23 16:36:26.341328 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.341278 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 23 16:36:26.341458 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.341423 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 23 16:36:26.341458 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.341431 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-jf7w4\"" Apr 23 16:36:26.341669 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.341657 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 23 16:36:26.345212 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.345147 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 23 16:36:26.346997 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.346978 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 23 16:36:26.351796 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.351778 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:36:26.353255 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.353060 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zd7p2\" (UniqueName: \"kubernetes.io/projected/f8c95658-88b6-40ea-8e55-36e1b9b16d38-kube-api-access-zd7p2\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:36:26.353255 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.353087 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:36:26.353255 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.353105 2573 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-web-config\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:36:26.353255 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.353119 2573 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/f8c95658-88b6-40ea-8e55-36e1b9b16d38-tls-assets\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:36:26.353255 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.353132 2573 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-grpc-tls\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:36:26.353255 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.353145 2573 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-metrics-client-certs\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:36:26.353255 
ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.353158 2573 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-thanos-prometheus-http-client-file\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:36:26.353255 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.353171 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-prometheus-k8s-tls\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:36:26.353255 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.353185 2573 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/f8c95658-88b6-40ea-8e55-36e1b9b16d38-config-out\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:36:26.353255 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.353201 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:36:26.353255 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.353214 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-prometheus-trusted-ca-bundle\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:36:26.353255 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.353227 2573 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-config\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:36:26.353255 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.353242 
2573 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-kube-rbac-proxy\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:36:26.353255 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.353257 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-configmap-metrics-client-ca\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:36:26.353877 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.353272 2573 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8c95658-88b6-40ea-8e55-36e1b9b16d38-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:36:26.353877 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.353286 2573 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/f8c95658-88b6-40ea-8e55-36e1b9b16d38-prometheus-k8s-db\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:36:26.353877 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.353301 2573 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/f8c95658-88b6-40ea-8e55-36e1b9b16d38-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:36:26.454080 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.454040 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bfd3bfb-a0c6-4571-a32d-702092eaf236-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.454080 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.454084 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1bfd3bfb-a0c6-4571-a32d-702092eaf236-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.454306 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.454119 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.454306 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.454170 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1bfd3bfb-a0c6-4571-a32d-702092eaf236-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.454306 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.454205 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.454306 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.454228 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sbr77\" 
(UniqueName: \"kubernetes.io/projected/1bfd3bfb-a0c6-4571-a32d-702092eaf236-kube-api-access-sbr77\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.454306 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.454292 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1bfd3bfb-a0c6-4571-a32d-702092eaf236-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.454472 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.454319 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1bfd3bfb-a0c6-4571-a32d-702092eaf236-config-out\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.454472 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.454348 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bfd3bfb-a0c6-4571-a32d-702092eaf236-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.454472 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.454366 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-config\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.454472 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.454407 2573 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.454472 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.454439 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.454472 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.454463 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1bfd3bfb-a0c6-4571-a32d-702092eaf236-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.454635 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.454486 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.454635 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.454503 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-thanos-prometheus-http-client-file\") pod 
\"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.454635 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.454522 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bfd3bfb-a0c6-4571-a32d-702092eaf236-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.454635 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.454541 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.454635 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.454557 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-web-config\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.555969 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.555880 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1bfd3bfb-a0c6-4571-a32d-702092eaf236-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.555969 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.555921 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.555969 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.555937 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sbr77\" (UniqueName: \"kubernetes.io/projected/1bfd3bfb-a0c6-4571-a32d-702092eaf236-kube-api-access-sbr77\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.555969 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.555957 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1bfd3bfb-a0c6-4571-a32d-702092eaf236-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.555969 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.555975 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1bfd3bfb-a0c6-4571-a32d-702092eaf236-config-out\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.556327 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.555994 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bfd3bfb-a0c6-4571-a32d-702092eaf236-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.556327 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.556125 2573 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-config\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.556327 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.556158 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.556327 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.556192 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.556327 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.556226 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1bfd3bfb-a0c6-4571-a32d-702092eaf236-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.556327 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.556260 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.556327 ip-10-0-137-14 kubenswrapper[2573]: I0423 
16:36:26.556288 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.556327 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.556321 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bfd3bfb-a0c6-4571-a32d-702092eaf236-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.556753 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.556352 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.556753 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.556376 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-web-config\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.556753 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.556423 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bfd3bfb-a0c6-4571-a32d-702092eaf236-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: 
\"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.556753 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.556455 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1bfd3bfb-a0c6-4571-a32d-702092eaf236-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.556753 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.556507 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.557240 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.557214 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bfd3bfb-a0c6-4571-a32d-702092eaf236-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.558111 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.558088 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1bfd3bfb-a0c6-4571-a32d-702092eaf236-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.558883 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.558853 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/1bfd3bfb-a0c6-4571-a32d-702092eaf236-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.558974 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.558915 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1bfd3bfb-a0c6-4571-a32d-702092eaf236-config-out\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.559031 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.558978 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1bfd3bfb-a0c6-4571-a32d-702092eaf236-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.559197 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.559178 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.559304 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.559284 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.559377 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.559317 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.559681 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.559655 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-config\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.559792 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.559660 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bfd3bfb-a0c6-4571-a32d-702092eaf236-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.559900 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.559864 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/1bfd3bfb-a0c6-4571-a32d-702092eaf236-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.560064 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.560043 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.561465 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.561436 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" 
(UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.561465 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.561449 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.561601 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.561521 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-web-config\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.561740 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.561725 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/1bfd3bfb-a0c6-4571-a32d-702092eaf236-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.562927 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.562909 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/1bfd3bfb-a0c6-4571-a32d-702092eaf236-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.563299 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.563279 2573 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-sbr77\" (UniqueName: \"kubernetes.io/projected/1bfd3bfb-a0c6-4571-a32d-702092eaf236-kube-api-access-sbr77\") pod \"prometheus-k8s-0\" (UID: \"1bfd3bfb-a0c6-4571-a32d-702092eaf236\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.648478 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.648430 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:26.775355 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.775320 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 16:36:26.779263 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:36:26.779236 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1bfd3bfb_a0c6_4571_a32d_702092eaf236.slice/crio-ef87c35111455d1846e8684d866d3e50d9a401ea88ad621a225a2aa9ec39d094 WatchSource:0}: Error finding container ef87c35111455d1846e8684d866d3e50d9a401ea88ad621a225a2aa9ec39d094: Status 404 returned error can't find the container with id ef87c35111455d1846e8684d866d3e50d9a401ea88ad621a225a2aa9ec39d094 Apr 23 16:36:26.880427 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:26.880396 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8c95658-88b6-40ea-8e55-36e1b9b16d38" path="/var/lib/kubelet/pods/f8c95658-88b6-40ea-8e55-36e1b9b16d38/volumes" Apr 23 16:36:27.272784 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:27.272744 2573 generic.go:358] "Generic (PLEG): container finished" podID="1bfd3bfb-a0c6-4571-a32d-702092eaf236" containerID="079c109b8337aa6f0932a2e1c65194fa3b386de4d6455c4d9394a48acf527a91" exitCode=0 Apr 23 16:36:27.273208 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:27.272831 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"1bfd3bfb-a0c6-4571-a32d-702092eaf236","Type":"ContainerDied","Data":"079c109b8337aa6f0932a2e1c65194fa3b386de4d6455c4d9394a48acf527a91"} Apr 23 16:36:27.273208 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:27.272866 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1bfd3bfb-a0c6-4571-a32d-702092eaf236","Type":"ContainerStarted","Data":"ef87c35111455d1846e8684d866d3e50d9a401ea88ad621a225a2aa9ec39d094"} Apr 23 16:36:28.280400 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:28.280357 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1bfd3bfb-a0c6-4571-a32d-702092eaf236","Type":"ContainerStarted","Data":"44c787a869cbc2580cb3bb5f9dad22f62658158aa6619a8d561704803a3a9fcb"} Apr 23 16:36:28.280400 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:28.280401 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1bfd3bfb-a0c6-4571-a32d-702092eaf236","Type":"ContainerStarted","Data":"fb082d797f36641d8b046cc32239ec208f39b290a080f7abb47a9a080e9d27a4"} Apr 23 16:36:28.280889 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:28.280411 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1bfd3bfb-a0c6-4571-a32d-702092eaf236","Type":"ContainerStarted","Data":"b3324894460886494ec41d2803ab7b795077f702c19f8c92ddf19598936e55d9"} Apr 23 16:36:28.280889 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:28.280420 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1bfd3bfb-a0c6-4571-a32d-702092eaf236","Type":"ContainerStarted","Data":"a75b00d8b0ca9bd7775465a6e42190a83224a1523c29e68ac571695c7646ad59"} Apr 23 16:36:28.280889 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:28.280429 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"1bfd3bfb-a0c6-4571-a32d-702092eaf236","Type":"ContainerStarted","Data":"a1e05d6c59ebfa342ac01d1f02017a9a6e35f099b68cacd2e32c1dfe84af87b8"} Apr 23 16:36:28.280889 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:28.280437 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"1bfd3bfb-a0c6-4571-a32d-702092eaf236","Type":"ContainerStarted","Data":"69faeae77c503f1713086b595eebebcabcda6b67596b4b30d0ca6c62b243c548"} Apr 23 16:36:28.311478 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:28.311419 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.311403071 podStartE2EDuration="2.311403071s" podCreationTimestamp="2026-04-23 16:36:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:36:28.309363837 +0000 UTC m=+93.999150184" watchObservedRunningTime="2026-04-23 16:36:28.311403071 +0000 UTC m=+94.001189440" Apr 23 16:36:31.649068 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:31.649031 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:36:53.629601 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:53.629562 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-xv7s5"] Apr 23 16:36:53.634432 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:53.634407 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-xv7s5" Apr 23 16:36:53.636400 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:53.636380 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 16:36:53.641387 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:53.641289 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xv7s5"] Apr 23 16:36:53.672473 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:53.672448 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3a80e579-6e61-4497-9477-5154f3af3b17-dbus\") pod \"global-pull-secret-syncer-xv7s5\" (UID: \"3a80e579-6e61-4497-9477-5154f3af3b17\") " pod="kube-system/global-pull-secret-syncer-xv7s5" Apr 23 16:36:53.672582 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:53.672494 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3a80e579-6e61-4497-9477-5154f3af3b17-kubelet-config\") pod \"global-pull-secret-syncer-xv7s5\" (UID: \"3a80e579-6e61-4497-9477-5154f3af3b17\") " pod="kube-system/global-pull-secret-syncer-xv7s5" Apr 23 16:36:53.672582 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:53.672550 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3a80e579-6e61-4497-9477-5154f3af3b17-original-pull-secret\") pod \"global-pull-secret-syncer-xv7s5\" (UID: \"3a80e579-6e61-4497-9477-5154f3af3b17\") " pod="kube-system/global-pull-secret-syncer-xv7s5" Apr 23 16:36:53.773814 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:53.773773 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/3a80e579-6e61-4497-9477-5154f3af3b17-kubelet-config\") pod \"global-pull-secret-syncer-xv7s5\" (UID: \"3a80e579-6e61-4497-9477-5154f3af3b17\") " pod="kube-system/global-pull-secret-syncer-xv7s5" Apr 23 16:36:53.773967 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:53.773863 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3a80e579-6e61-4497-9477-5154f3af3b17-original-pull-secret\") pod \"global-pull-secret-syncer-xv7s5\" (UID: \"3a80e579-6e61-4497-9477-5154f3af3b17\") " pod="kube-system/global-pull-secret-syncer-xv7s5" Apr 23 16:36:53.773967 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:53.773920 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/3a80e579-6e61-4497-9477-5154f3af3b17-kubelet-config\") pod \"global-pull-secret-syncer-xv7s5\" (UID: \"3a80e579-6e61-4497-9477-5154f3af3b17\") " pod="kube-system/global-pull-secret-syncer-xv7s5" Apr 23 16:36:53.773967 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:53.773959 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3a80e579-6e61-4497-9477-5154f3af3b17-dbus\") pod \"global-pull-secret-syncer-xv7s5\" (UID: \"3a80e579-6e61-4497-9477-5154f3af3b17\") " pod="kube-system/global-pull-secret-syncer-xv7s5" Apr 23 16:36:53.774115 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:53.774101 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/3a80e579-6e61-4497-9477-5154f3af3b17-dbus\") pod \"global-pull-secret-syncer-xv7s5\" (UID: \"3a80e579-6e61-4497-9477-5154f3af3b17\") " pod="kube-system/global-pull-secret-syncer-xv7s5" Apr 23 16:36:53.776007 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:53.775989 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/3a80e579-6e61-4497-9477-5154f3af3b17-original-pull-secret\") pod \"global-pull-secret-syncer-xv7s5\" (UID: \"3a80e579-6e61-4497-9477-5154f3af3b17\") " pod="kube-system/global-pull-secret-syncer-xv7s5" Apr 23 16:36:53.943661 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:53.943636 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-xv7s5" Apr 23 16:36:54.064723 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:54.064664 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-xv7s5"] Apr 23 16:36:54.068773 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:36:54.068747 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a80e579_6e61_4497_9477_5154f3af3b17.slice/crio-147b69efb43599e0f6851f5b266d726c9be6ea2ba06373391242afa401cdeddf WatchSource:0}: Error finding container 147b69efb43599e0f6851f5b266d726c9be6ea2ba06373391242afa401cdeddf: Status 404 returned error can't find the container with id 147b69efb43599e0f6851f5b266d726c9be6ea2ba06373391242afa401cdeddf Apr 23 16:36:54.351606 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:54.351528 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xv7s5" event={"ID":"3a80e579-6e61-4497-9477-5154f3af3b17","Type":"ContainerStarted","Data":"147b69efb43599e0f6851f5b266d726c9be6ea2ba06373391242afa401cdeddf"} Apr 23 16:36:59.368674 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:59.368634 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-xv7s5" event={"ID":"3a80e579-6e61-4497-9477-5154f3af3b17","Type":"ContainerStarted","Data":"ba32428344d356382449e6a930b364fe41eac1e7e517b407d6cf98925944acea"} Apr 23 16:36:59.391123 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:36:59.391067 2573 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-xv7s5" podStartSLOduration=1.9284461099999999 podStartE2EDuration="6.391049779s" podCreationTimestamp="2026-04-23 16:36:53 +0000 UTC" firstStartedPulling="2026-04-23 16:36:54.070295793 +0000 UTC m=+119.760082117" lastFinishedPulling="2026-04-23 16:36:58.532899454 +0000 UTC m=+124.222685786" observedRunningTime="2026-04-23 16:36:59.390094543 +0000 UTC m=+125.079880890" watchObservedRunningTime="2026-04-23 16:36:59.391049779 +0000 UTC m=+125.080836127" Apr 23 16:37:26.649469 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:37:26.649434 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:26.664567 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:37:26.664540 2573 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:37:27.456149 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:37:27.456120 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 16:39:54.730944 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:39:54.730913 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wd2cz_52988e90-484a-49cd-98f6-5510a28890d6/ovn-acl-logging/0.log" Apr 23 16:39:54.731763 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:39:54.731737 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wd2cz_52988e90-484a-49cd-98f6-5510a28890d6/ovn-acl-logging/0.log" Apr 23 16:39:54.733476 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:39:54.733460 2573 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 16:40:29.310862 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:29.310824 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-q2rt4"] Apr 23 16:40:29.314018 
ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:29.314003 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-q2rt4" Apr 23 16:40:29.315991 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:29.315967 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 23 16:40:29.316096 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:29.315998 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 23 16:40:29.316096 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:29.316002 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 23 16:40:29.316285 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:29.316270 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-ckzvv\"" Apr 23 16:40:29.320212 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:29.320191 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-q2rt4"] Apr 23 16:40:29.443609 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:29.443576 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rchnz\" (UniqueName: \"kubernetes.io/projected/901d2ca7-8019-4f76-af7c-f4abd010f487-kube-api-access-rchnz\") pod \"s3-init-q2rt4\" (UID: \"901d2ca7-8019-4f76-af7c-f4abd010f487\") " pod="kserve/s3-init-q2rt4" Apr 23 16:40:29.544859 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:29.544823 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rchnz\" (UniqueName: \"kubernetes.io/projected/901d2ca7-8019-4f76-af7c-f4abd010f487-kube-api-access-rchnz\") pod \"s3-init-q2rt4\" (UID: \"901d2ca7-8019-4f76-af7c-f4abd010f487\") " pod="kserve/s3-init-q2rt4" Apr 23 16:40:29.553181 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:29.553155 
2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rchnz\" (UniqueName: \"kubernetes.io/projected/901d2ca7-8019-4f76-af7c-f4abd010f487-kube-api-access-rchnz\") pod \"s3-init-q2rt4\" (UID: \"901d2ca7-8019-4f76-af7c-f4abd010f487\") " pod="kserve/s3-init-q2rt4"
Apr 23 16:40:29.631304 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:29.631238 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-q2rt4"
Apr 23 16:40:29.744738 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:29.744708 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-q2rt4"]
Apr 23 16:40:29.747499 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:40:29.747473 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod901d2ca7_8019_4f76_af7c_f4abd010f487.slice/crio-2bc15937551684aef7b3e80ddc0d49ca9eb1d3083e7a67b9747aa93bfd564565 WatchSource:0}: Error finding container 2bc15937551684aef7b3e80ddc0d49ca9eb1d3083e7a67b9747aa93bfd564565: Status 404 returned error can't find the container with id 2bc15937551684aef7b3e80ddc0d49ca9eb1d3083e7a67b9747aa93bfd564565
Apr 23 16:40:29.749188 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:29.749170 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 16:40:29.927232 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:29.927203 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-q2rt4" event={"ID":"901d2ca7-8019-4f76-af7c-f4abd010f487","Type":"ContainerStarted","Data":"2bc15937551684aef7b3e80ddc0d49ca9eb1d3083e7a67b9747aa93bfd564565"}
Apr 23 16:40:34.943789 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:34.943756 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-q2rt4"
event={"ID":"901d2ca7-8019-4f76-af7c-f4abd010f487","Type":"ContainerStarted","Data":"6101363de34e753521e19fcfc9f69242a1a333aaac3bf3241486c9ec8759e357"}
Apr 23 16:40:34.961424 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:34.961382 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-q2rt4" podStartSLOduration=1.527374194 podStartE2EDuration="5.961367956s" podCreationTimestamp="2026-04-23 16:40:29 +0000 UTC" firstStartedPulling="2026-04-23 16:40:29.749346859 +0000 UTC m=+335.439133189" lastFinishedPulling="2026-04-23 16:40:34.183340623 +0000 UTC m=+339.873126951" observedRunningTime="2026-04-23 16:40:34.959666701 +0000 UTC m=+340.649453047" watchObservedRunningTime="2026-04-23 16:40:34.961367956 +0000 UTC m=+340.651154301"
Apr 23 16:40:37.952771 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:37.952734 2573 generic.go:358] "Generic (PLEG): container finished" podID="901d2ca7-8019-4f76-af7c-f4abd010f487" containerID="6101363de34e753521e19fcfc9f69242a1a333aaac3bf3241486c9ec8759e357" exitCode=0
Apr 23 16:40:37.953144 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:37.952784 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-q2rt4" event={"ID":"901d2ca7-8019-4f76-af7c-f4abd010f487","Type":"ContainerDied","Data":"6101363de34e753521e19fcfc9f69242a1a333aaac3bf3241486c9ec8759e357"}
Apr 23 16:40:39.073363 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:39.073340 2573 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve/s3-init-q2rt4"
Apr 23 16:40:39.236005 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:39.235925 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rchnz\" (UniqueName: \"kubernetes.io/projected/901d2ca7-8019-4f76-af7c-f4abd010f487-kube-api-access-rchnz\") pod \"901d2ca7-8019-4f76-af7c-f4abd010f487\" (UID: \"901d2ca7-8019-4f76-af7c-f4abd010f487\") "
Apr 23 16:40:39.238009 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:39.237988 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/901d2ca7-8019-4f76-af7c-f4abd010f487-kube-api-access-rchnz" (OuterVolumeSpecName: "kube-api-access-rchnz") pod "901d2ca7-8019-4f76-af7c-f4abd010f487" (UID: "901d2ca7-8019-4f76-af7c-f4abd010f487"). InnerVolumeSpecName "kube-api-access-rchnz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 16:40:39.336356 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:39.336321 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rchnz\" (UniqueName: \"kubernetes.io/projected/901d2ca7-8019-4f76-af7c-f4abd010f487-kube-api-access-rchnz\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\""
Apr 23 16:40:39.959578 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:39.959542 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-q2rt4" event={"ID":"901d2ca7-8019-4f76-af7c-f4abd010f487","Type":"ContainerDied","Data":"2bc15937551684aef7b3e80ddc0d49ca9eb1d3083e7a67b9747aa93bfd564565"}
Apr 23 16:40:39.959578 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:39.959576 2573 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2bc15937551684aef7b3e80ddc0d49ca9eb1d3083e7a67b9747aa93bfd564565"
Apr 23 16:40:39.959857 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:39.959598 2573 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve/s3-init-q2rt4"
Apr 23 16:40:49.098492 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:49.098459 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52"]
Apr 23 16:40:49.098878 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:49.098741 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="901d2ca7-8019-4f76-af7c-f4abd010f487" containerName="s3-init"
Apr 23 16:40:49.098878 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:49.098751 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="901d2ca7-8019-4f76-af7c-f4abd010f487" containerName="s3-init"
Apr 23 16:40:49.098878 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:49.098802 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="901d2ca7-8019-4f76-af7c-f4abd010f487" containerName="s3-init"
Apr 23 16:40:49.102059 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:49.102043 2573 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52"
Apr 23 16:40:49.104005 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:49.103987 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-p7xp4\""
Apr 23 16:40:49.111122 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:49.111103 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52"]
Apr 23 16:40:49.202796 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:49.202766 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad0c19ac-6781-491e-accf-80afed0e81f8-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52\" (UID: \"ad0c19ac-6781-491e-accf-80afed0e81f8\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52"
Apr 23 16:40:49.303907 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:49.303872 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad0c19ac-6781-491e-accf-80afed0e81f8-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52\" (UID: \"ad0c19ac-6781-491e-accf-80afed0e81f8\") " pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52"
Apr 23 16:40:49.304258 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:49.304234 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad0c19ac-6781-491e-accf-80afed0e81f8-kserve-provision-location\") pod \"isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52\" (UID: \"ad0c19ac-6781-491e-accf-80afed0e81f8\") "
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52"
Apr 23 16:40:49.411667 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:49.411638 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52"
Apr 23 16:40:49.530473 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:49.530439 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52"]
Apr 23 16:40:49.533666 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:40:49.533633 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad0c19ac_6781_491e_accf_80afed0e81f8.slice/crio-ac9489df60fbc25a0c665d278e5763378f0f77133153791ac2e404840c35c1cf WatchSource:0}: Error finding container ac9489df60fbc25a0c665d278e5763378f0f77133153791ac2e404840c35c1cf: Status 404 returned error can't find the container with id ac9489df60fbc25a0c665d278e5763378f0f77133153791ac2e404840c35c1cf
Apr 23 16:40:49.986726 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:49.986673 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" event={"ID":"ad0c19ac-6781-491e-accf-80afed0e81f8","Type":"ContainerStarted","Data":"ac9489df60fbc25a0c665d278e5763378f0f77133153791ac2e404840c35c1cf"}
Apr 23 16:40:55.001576 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:55.001537 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" event={"ID":"ad0c19ac-6781-491e-accf-80afed0e81f8","Type":"ContainerStarted","Data":"75590575736b79fdbcc27700628560406d54f3ad9301c7c69b1d6fa916ff5cf9"}
Apr 23 16:40:58.010849 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:58.010817 2573 generic.go:358] "Generic (PLEG): container finished"
podID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerID="75590575736b79fdbcc27700628560406d54f3ad9301c7c69b1d6fa916ff5cf9" exitCode=0
Apr 23 16:40:58.011182 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:40:58.010881 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" event={"ID":"ad0c19ac-6781-491e-accf-80afed0e81f8","Type":"ContainerDied","Data":"75590575736b79fdbcc27700628560406d54f3ad9301c7c69b1d6fa916ff5cf9"}
Apr 23 16:41:12.058217 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:41:12.058177 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" event={"ID":"ad0c19ac-6781-491e-accf-80afed0e81f8","Type":"ContainerStarted","Data":"a8017226b84c0c8e0a43b5bacd8d67f8b3f9b354b69c00846208ec711b88b079"}
Apr 23 16:41:14.064913 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:41:14.064869 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" event={"ID":"ad0c19ac-6781-491e-accf-80afed0e81f8","Type":"ContainerStarted","Data":"c048ee9f4189c152d0df33b4ab37b5a1518245f7d5a7f830fa8df007340e1c9c"}
Apr 23 16:41:14.065277 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:41:14.065132 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52"
Apr 23 16:41:14.066431 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:41:14.066386 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.16:8080: connect: connection refused"
Apr 23 16:41:14.083863 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:41:14.083821 2573 pod_startup_latency_tracker.go:104]
"Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podStartSLOduration=0.759857283 podStartE2EDuration="25.083808265s" podCreationTimestamp="2026-04-23 16:40:49 +0000 UTC" firstStartedPulling="2026-04-23 16:40:49.535876805 +0000 UTC m=+355.225663129" lastFinishedPulling="2026-04-23 16:41:13.859827783 +0000 UTC m=+379.549614111" observedRunningTime="2026-04-23 16:41:14.082352216 +0000 UTC m=+379.772138562" watchObservedRunningTime="2026-04-23 16:41:14.083808265 +0000 UTC m=+379.773594610"
Apr 23 16:41:15.067400 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:41:15.067370 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52"
Apr 23 16:41:15.067794 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:41:15.067477 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.16:8080: connect: connection refused"
Apr 23 16:41:15.068486 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:41:15.068459 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:41:16.069946 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:41:16.069909 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.16:8080: connect: connection refused"
Apr 23 16:41:16.070383 ip-10-0-137-14 kubenswrapper[2573]: I0423
16:41:16.070208 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:41:26.070067 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:41:26.070020 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.16:8080: connect: connection refused"
Apr 23 16:41:26.070554 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:41:26.070531 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:41:36.070776 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:41:36.070723 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.16:8080: connect: connection refused"
Apr 23 16:41:36.071257 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:41:36.071232 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:41:46.070797 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:41:46.070752 2573 prober.go:120] "Probe failed" probeType="Readiness"
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.16:8080: connect: connection refused"
Apr 23 16:41:46.071280 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:41:46.071241 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:41:56.070096 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:41:56.070057 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.16:8080: connect: connection refused"
Apr 23 16:41:56.070568 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:41:56.070543 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:42:06.070090 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:06.069992 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.16:8080: connect: connection refused"
Apr 23 16:42:06.070459 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:06.070396 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52"
podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:42:16.070522 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:16.070491 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52"
Apr 23 16:42:16.070919 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:16.070682 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52"
Apr 23 16:42:24.166956 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:24.166915 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52"]
Apr 23 16:42:24.167688 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:24.167280 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="kserve-container" containerID="cri-o://a8017226b84c0c8e0a43b5bacd8d67f8b3f9b354b69c00846208ec711b88b079" gracePeriod=30
Apr 23 16:42:24.167688 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:24.167355 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="agent" containerID="cri-o://c048ee9f4189c152d0df33b4ab37b5a1518245f7d5a7f830fa8df007340e1c9c" gracePeriod=30
Apr 23 16:42:24.253213 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:24.253175 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9"]
Apr 23 16:42:24.256355 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:24.256339 2573 util.go:30] "No sandbox for pod can
be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9"
Apr 23 16:42:24.271051 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:24.271006 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9"]
Apr 23 16:42:24.326442 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:24.326415 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6"]
Apr 23 16:42:24.329536 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:24.329518 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6"
Apr 23 16:42:24.333167 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:24.333147 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d755ee27-2178-439f-ad75-5175f4b374e1-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9\" (UID: \"d755ee27-2178-439f-ad75-5175f4b374e1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9"
Apr 23 16:42:24.339958 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:24.339939 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6"]
Apr 23 16:42:24.433832 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:24.433756 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/edc93650-3c52-4975-b764-f8211583c4dc-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6\" (UID: \"edc93650-3c52-4975-b764-f8211583c4dc\") "
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6"
Apr 23 16:42:24.433832 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:24.433807 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d755ee27-2178-439f-ad75-5175f4b374e1-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9\" (UID: \"d755ee27-2178-439f-ad75-5175f4b374e1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9"
Apr 23 16:42:24.434119 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:24.434103 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d755ee27-2178-439f-ad75-5175f4b374e1-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9\" (UID: \"d755ee27-2178-439f-ad75-5175f4b374e1\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9"
Apr 23 16:42:24.534550 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:24.534519 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/edc93650-3c52-4975-b764-f8211583c4dc-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6\" (UID: \"edc93650-3c52-4975-b764-f8211583c4dc\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6"
Apr 23 16:42:24.534862 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:24.534845 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/edc93650-3c52-4975-b764-f8211583c4dc-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6\" (UID: \"edc93650-3c52-4975-b764-f8211583c4dc\") "
pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6"
Apr 23 16:42:24.570651 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:24.570634 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9"
Apr 23 16:42:24.638575 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:24.638544 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6"
Apr 23 16:42:24.696731 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:24.696679 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9"]
Apr 23 16:42:24.701102 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:42:24.701074 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd755ee27_2178_439f_ad75_5175f4b374e1.slice/crio-ef3c49c61138e26b0d8e074ee4921a1c226a6bbcc53d899d45e24934c764a10e WatchSource:0}: Error finding container ef3c49c61138e26b0d8e074ee4921a1c226a6bbcc53d899d45e24934c764a10e: Status 404 returned error can't find the container with id ef3c49c61138e26b0d8e074ee4921a1c226a6bbcc53d899d45e24934c764a10e
Apr 23 16:42:24.767085 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:24.767051 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6"]
Apr 23 16:42:24.769370 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:42:24.769346 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedc93650_3c52_4975_b764_f8211583c4dc.slice/crio-bf6e5ce81122bd5600ab0e6c8de54f21493d81187d7c67c97ea075b82ab0b2b7 WatchSource:0}: Error finding container bf6e5ce81122bd5600ab0e6c8de54f21493d81187d7c67c97ea075b82ab0b2b7: Status 404
returned error can't find the container with id bf6e5ce81122bd5600ab0e6c8de54f21493d81187d7c67c97ea075b82ab0b2b7
Apr 23 16:42:25.256384 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:25.256345 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6" event={"ID":"edc93650-3c52-4975-b764-f8211583c4dc","Type":"ContainerStarted","Data":"efb678cb42b711fe2784c7acbc287739c26c25da28628b5c2d929913d1c3a001"}
Apr 23 16:42:25.256845 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:25.256390 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6" event={"ID":"edc93650-3c52-4975-b764-f8211583c4dc","Type":"ContainerStarted","Data":"bf6e5ce81122bd5600ab0e6c8de54f21493d81187d7c67c97ea075b82ab0b2b7"}
Apr 23 16:42:25.257756 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:25.257730 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9" event={"ID":"d755ee27-2178-439f-ad75-5175f4b374e1","Type":"ContainerStarted","Data":"530a68a45deb8f888eb66e5e9422ce79ebf09724a70bb351bd76833b0e8f0940"}
Apr 23 16:42:25.257851 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:25.257765 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9" event={"ID":"d755ee27-2178-439f-ad75-5175f4b374e1","Type":"ContainerStarted","Data":"ef3c49c61138e26b0d8e074ee4921a1c226a6bbcc53d899d45e24934c764a10e"}
Apr 23 16:42:26.070024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:26.069979 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.16:8080: connect: connection refused"
Apr 23 16:42:26.070305
ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:26.070269 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:42:28.268945 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:28.268904 2573 generic.go:358] "Generic (PLEG): container finished" podID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerID="a8017226b84c0c8e0a43b5bacd8d67f8b3f9b354b69c00846208ec711b88b079" exitCode=0
Apr 23 16:42:28.269239 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:28.268947 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" event={"ID":"ad0c19ac-6781-491e-accf-80afed0e81f8","Type":"ContainerDied","Data":"a8017226b84c0c8e0a43b5bacd8d67f8b3f9b354b69c00846208ec711b88b079"}
Apr 23 16:42:29.273023 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:29.272986 2573 generic.go:358] "Generic (PLEG): container finished" podID="d755ee27-2178-439f-ad75-5175f4b374e1" containerID="530a68a45deb8f888eb66e5e9422ce79ebf09724a70bb351bd76833b0e8f0940" exitCode=0
Apr 23 16:42:29.273444 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:29.273059 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9" event={"ID":"d755ee27-2178-439f-ad75-5175f4b374e1","Type":"ContainerDied","Data":"530a68a45deb8f888eb66e5e9422ce79ebf09724a70bb351bd76833b0e8f0940"}
Apr 23 16:42:29.274280 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:29.274255 2573 generic.go:358] "Generic (PLEG): container finished" podID="edc93650-3c52-4975-b764-f8211583c4dc" containerID="efb678cb42b711fe2784c7acbc287739c26c25da28628b5c2d929913d1c3a001" exitCode=0
Apr 23 16:42:29.274366 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:29.274288 2573 kubelet.go:2569]
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6" event={"ID":"edc93650-3c52-4975-b764-f8211583c4dc","Type":"ContainerDied","Data":"efb678cb42b711fe2784c7acbc287739c26c25da28628b5c2d929913d1c3a001"}
Apr 23 16:42:30.280177 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:30.280134 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9" event={"ID":"d755ee27-2178-439f-ad75-5175f4b374e1","Type":"ContainerStarted","Data":"f7d4485b55842ec3abd7778a6b3bacdeb60895310b385e560f2e0109893612fd"}
Apr 23 16:42:30.280744 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:30.280656 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9"
Apr 23 16:42:30.281801 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:30.281768 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9" podUID="d755ee27-2178-439f-ad75-5175f4b374e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.17:8080: connect: connection refused"
Apr 23 16:42:30.300133 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:30.300036 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9" podStartSLOduration=6.300018254 podStartE2EDuration="6.300018254s" podCreationTimestamp="2026-04-23 16:42:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:42:30.298249579 +0000 UTC m=+455.988035925" watchObservedRunningTime="2026-04-23 16:42:30.300018254 +0000 UTC m=+455.989804603"
Apr 23 16:42:31.284428 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:31.284384 2573 prober.go:120] "Probe failed"
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9" podUID="d755ee27-2178-439f-ad75-5175f4b374e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.17:8080: connect: connection refused" Apr 23 16:42:36.070310 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:36.070257 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.16:8080: connect: connection refused" Apr 23 16:42:36.070821 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:36.070787 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:42:41.285170 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:41.285120 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9" podUID="d755ee27-2178-439f-ad75-5175f4b374e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.17:8080: connect: connection refused" Apr 23 16:42:46.070551 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:46.070502 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.16:8080: connect: connection refused" Apr 23 16:42:46.071024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:46.070658 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" Apr 23 16:42:46.071024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:46.070804 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:42:46.071024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:46.070919 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" Apr 23 16:42:48.332875 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:48.332842 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6" event={"ID":"edc93650-3c52-4975-b764-f8211583c4dc","Type":"ContainerStarted","Data":"4dba521e2b5962ebc2e051ecd8bb5af79d4b73c9cce662ac16b78064c2d4fbd1"} Apr 23 16:42:48.333267 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:48.333228 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6" Apr 23 16:42:48.334288 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:48.334259 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6" podUID="edc93650-3c52-4975-b764-f8211583c4dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 23 16:42:48.351378 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:48.351334 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6" podStartSLOduration=6.037787094 podStartE2EDuration="24.351321003s" 
podCreationTimestamp="2026-04-23 16:42:24 +0000 UTC" firstStartedPulling="2026-04-23 16:42:29.275426292 +0000 UTC m=+454.965212617" lastFinishedPulling="2026-04-23 16:42:47.588960197 +0000 UTC m=+473.278746526" observedRunningTime="2026-04-23 16:42:48.349910726 +0000 UTC m=+474.039697071" watchObservedRunningTime="2026-04-23 16:42:48.351321003 +0000 UTC m=+474.041107350" Apr 23 16:42:49.336472 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:49.336433 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6" podUID="edc93650-3c52-4975-b764-f8211583c4dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 23 16:42:51.284871 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:51.284828 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9" podUID="d755ee27-2178-439f-ad75-5175f4b374e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.17:8080: connect: connection refused" Apr 23 16:42:54.300113 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:54.300092 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" Apr 23 16:42:54.352031 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:54.351996 2573 generic.go:358] "Generic (PLEG): container finished" podID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerID="c048ee9f4189c152d0df33b4ab37b5a1518245f7d5a7f830fa8df007340e1c9c" exitCode=0 Apr 23 16:42:54.352194 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:54.352085 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" event={"ID":"ad0c19ac-6781-491e-accf-80afed0e81f8","Type":"ContainerDied","Data":"c048ee9f4189c152d0df33b4ab37b5a1518245f7d5a7f830fa8df007340e1c9c"} Apr 23 16:42:54.352194 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:54.352132 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" event={"ID":"ad0c19ac-6781-491e-accf-80afed0e81f8","Type":"ContainerDied","Data":"ac9489df60fbc25a0c665d278e5763378f0f77133153791ac2e404840c35c1cf"} Apr 23 16:42:54.352194 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:54.352148 2573 scope.go:117] "RemoveContainer" containerID="c048ee9f4189c152d0df33b4ab37b5a1518245f7d5a7f830fa8df007340e1c9c" Apr 23 16:42:54.352194 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:54.352100 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52" Apr 23 16:42:54.359325 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:54.359309 2573 scope.go:117] "RemoveContainer" containerID="a8017226b84c0c8e0a43b5bacd8d67f8b3f9b354b69c00846208ec711b88b079" Apr 23 16:42:54.365785 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:54.365767 2573 scope.go:117] "RemoveContainer" containerID="75590575736b79fdbcc27700628560406d54f3ad9301c7c69b1d6fa916ff5cf9" Apr 23 16:42:54.371956 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:54.371940 2573 scope.go:117] "RemoveContainer" containerID="c048ee9f4189c152d0df33b4ab37b5a1518245f7d5a7f830fa8df007340e1c9c" Apr 23 16:42:54.372177 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:42:54.372159 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c048ee9f4189c152d0df33b4ab37b5a1518245f7d5a7f830fa8df007340e1c9c\": container with ID starting with c048ee9f4189c152d0df33b4ab37b5a1518245f7d5a7f830fa8df007340e1c9c not found: ID does not exist" containerID="c048ee9f4189c152d0df33b4ab37b5a1518245f7d5a7f830fa8df007340e1c9c" Apr 23 16:42:54.372244 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:54.372189 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c048ee9f4189c152d0df33b4ab37b5a1518245f7d5a7f830fa8df007340e1c9c"} err="failed to get container status \"c048ee9f4189c152d0df33b4ab37b5a1518245f7d5a7f830fa8df007340e1c9c\": rpc error: code = NotFound desc = could not find container \"c048ee9f4189c152d0df33b4ab37b5a1518245f7d5a7f830fa8df007340e1c9c\": container with ID starting with c048ee9f4189c152d0df33b4ab37b5a1518245f7d5a7f830fa8df007340e1c9c not found: ID does not exist" Apr 23 16:42:54.372244 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:54.372214 2573 scope.go:117] "RemoveContainer" containerID="a8017226b84c0c8e0a43b5bacd8d67f8b3f9b354b69c00846208ec711b88b079" Apr 23 
16:42:54.372435 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:42:54.372419 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8017226b84c0c8e0a43b5bacd8d67f8b3f9b354b69c00846208ec711b88b079\": container with ID starting with a8017226b84c0c8e0a43b5bacd8d67f8b3f9b354b69c00846208ec711b88b079 not found: ID does not exist" containerID="a8017226b84c0c8e0a43b5bacd8d67f8b3f9b354b69c00846208ec711b88b079" Apr 23 16:42:54.372477 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:54.372441 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8017226b84c0c8e0a43b5bacd8d67f8b3f9b354b69c00846208ec711b88b079"} err="failed to get container status \"a8017226b84c0c8e0a43b5bacd8d67f8b3f9b354b69c00846208ec711b88b079\": rpc error: code = NotFound desc = could not find container \"a8017226b84c0c8e0a43b5bacd8d67f8b3f9b354b69c00846208ec711b88b079\": container with ID starting with a8017226b84c0c8e0a43b5bacd8d67f8b3f9b354b69c00846208ec711b88b079 not found: ID does not exist" Apr 23 16:42:54.372477 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:54.372456 2573 scope.go:117] "RemoveContainer" containerID="75590575736b79fdbcc27700628560406d54f3ad9301c7c69b1d6fa916ff5cf9" Apr 23 16:42:54.372652 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:42:54.372635 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"75590575736b79fdbcc27700628560406d54f3ad9301c7c69b1d6fa916ff5cf9\": container with ID starting with 75590575736b79fdbcc27700628560406d54f3ad9301c7c69b1d6fa916ff5cf9 not found: ID does not exist" containerID="75590575736b79fdbcc27700628560406d54f3ad9301c7c69b1d6fa916ff5cf9" Apr 23 16:42:54.372784 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:54.372659 2573 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"75590575736b79fdbcc27700628560406d54f3ad9301c7c69b1d6fa916ff5cf9"} err="failed to get container status \"75590575736b79fdbcc27700628560406d54f3ad9301c7c69b1d6fa916ff5cf9\": rpc error: code = NotFound desc = could not find container \"75590575736b79fdbcc27700628560406d54f3ad9301c7c69b1d6fa916ff5cf9\": container with ID starting with 75590575736b79fdbcc27700628560406d54f3ad9301c7c69b1d6fa916ff5cf9 not found: ID does not exist" Apr 23 16:42:54.377939 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:54.377923 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad0c19ac-6781-491e-accf-80afed0e81f8-kserve-provision-location\") pod \"ad0c19ac-6781-491e-accf-80afed0e81f8\" (UID: \"ad0c19ac-6781-491e-accf-80afed0e81f8\") " Apr 23 16:42:54.378192 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:54.378173 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad0c19ac-6781-491e-accf-80afed0e81f8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ad0c19ac-6781-491e-accf-80afed0e81f8" (UID: "ad0c19ac-6781-491e-accf-80afed0e81f8"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:42:54.478952 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:54.478883 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad0c19ac-6781-491e-accf-80afed0e81f8-kserve-provision-location\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:42:54.672206 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:54.672177 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52"] Apr 23 16:42:54.676218 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:54.676195 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-raw-sklearn-batcher-2511f-predictor-696c76f45b-5fr52"] Apr 23 16:42:54.879037 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:54.879005 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" path="/var/lib/kubelet/pods/ad0c19ac-6781-491e-accf-80afed0e81f8/volumes" Apr 23 16:42:59.337025 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:42:59.336981 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6" podUID="edc93650-3c52-4975-b764-f8211583c4dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 23 16:43:01.284961 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:01.284911 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9" podUID="d755ee27-2178-439f-ad75-5175f4b374e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.17:8080: connect: connection refused" Apr 23 16:43:09.336627 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:09.336583 2573 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6" podUID="edc93650-3c52-4975-b764-f8211583c4dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 23 16:43:11.285134 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:11.285088 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9" podUID="d755ee27-2178-439f-ad75-5175f4b374e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.17:8080: connect: connection refused" Apr 23 16:43:19.336864 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:19.336820 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6" podUID="edc93650-3c52-4975-b764-f8211583c4dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 23 16:43:21.285262 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:21.285220 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9" podUID="d755ee27-2178-439f-ad75-5175f4b374e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.17:8080: connect: connection refused" Apr 23 16:43:29.336519 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:29.336471 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6" podUID="edc93650-3c52-4975-b764-f8211583c4dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 23 16:43:31.284861 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:31.284811 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9" podUID="d755ee27-2178-439f-ad75-5175f4b374e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.17:8080: connect: connection refused" Apr 23 16:43:33.876084 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:33.876002 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9" podUID="d755ee27-2178-439f-ad75-5175f4b374e1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.17:8080: connect: connection refused" Apr 23 16:43:39.337354 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:39.337259 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6" podUID="edc93650-3c52-4975-b764-f8211583c4dc" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.18:8080: connect: connection refused" Apr 23 16:43:43.876917 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:43.876880 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9" Apr 23 16:43:49.337809 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:49.337773 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6" Apr 23 16:43:54.316202 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:54.316167 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk"] Apr 23 16:43:54.316624 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:54.316494 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="kserve-container" Apr 23 16:43:54.316624 ip-10-0-137-14 kubenswrapper[2573]: I0423 
16:43:54.316508 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="kserve-container" Apr 23 16:43:54.316624 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:54.316519 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="agent" Apr 23 16:43:54.316624 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:54.316524 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="agent" Apr 23 16:43:54.316624 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:54.316538 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="storage-initializer" Apr 23 16:43:54.316624 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:54.316544 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="storage-initializer" Apr 23 16:43:54.316624 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:54.316591 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="kserve-container" Apr 23 16:43:54.316624 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:54.316602 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad0c19ac-6781-491e-accf-80afed0e81f8" containerName="agent" Apr 23 16:43:54.319572 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:54.319558 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk" Apr 23 16:43:54.321815 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:54.321791 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-96835-serving-cert\"" Apr 23 16:43:54.322109 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:54.322092 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-96835-kube-rbac-proxy-sar-config\"" Apr 23 16:43:54.322322 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:54.322305 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 16:43:54.337741 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:54.337719 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk"] Apr 23 16:43:54.434285 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:54.434256 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73853204-7b37-4cdb-a493-2c659301ebbe-openshift-service-ca-bundle\") pod \"model-chainer-raw-96835-69c94ddcbc-qmgwk\" (UID: \"73853204-7b37-4cdb-a493-2c659301ebbe\") " pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk" Apr 23 16:43:54.434455 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:54.434309 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73853204-7b37-4cdb-a493-2c659301ebbe-proxy-tls\") pod \"model-chainer-raw-96835-69c94ddcbc-qmgwk\" (UID: \"73853204-7b37-4cdb-a493-2c659301ebbe\") " pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk" Apr 23 16:43:54.535132 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:54.535100 2573 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73853204-7b37-4cdb-a493-2c659301ebbe-openshift-service-ca-bundle\") pod \"model-chainer-raw-96835-69c94ddcbc-qmgwk\" (UID: \"73853204-7b37-4cdb-a493-2c659301ebbe\") " pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk" Apr 23 16:43:54.535279 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:54.535151 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73853204-7b37-4cdb-a493-2c659301ebbe-proxy-tls\") pod \"model-chainer-raw-96835-69c94ddcbc-qmgwk\" (UID: \"73853204-7b37-4cdb-a493-2c659301ebbe\") " pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk" Apr 23 16:43:54.535774 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:54.535749 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73853204-7b37-4cdb-a493-2c659301ebbe-openshift-service-ca-bundle\") pod \"model-chainer-raw-96835-69c94ddcbc-qmgwk\" (UID: \"73853204-7b37-4cdb-a493-2c659301ebbe\") " pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk" Apr 23 16:43:54.537460 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:54.537442 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73853204-7b37-4cdb-a493-2c659301ebbe-proxy-tls\") pod \"model-chainer-raw-96835-69c94ddcbc-qmgwk\" (UID: \"73853204-7b37-4cdb-a493-2c659301ebbe\") " pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk" Apr 23 16:43:54.629627 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:54.629563 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk" Apr 23 16:43:54.751676 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:54.751649 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk"] Apr 23 16:43:54.755324 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:43:54.755294 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod73853204_7b37_4cdb_a493_2c659301ebbe.slice/crio-ad21df46661f45af9b9df734afa45a6d99790a6bff4294048f3a7831e11e2c3e WatchSource:0}: Error finding container ad21df46661f45af9b9df734afa45a6d99790a6bff4294048f3a7831e11e2c3e: Status 404 returned error can't find the container with id ad21df46661f45af9b9df734afa45a6d99790a6bff4294048f3a7831e11e2c3e Apr 23 16:43:55.520672 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:55.520633 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk" event={"ID":"73853204-7b37-4cdb-a493-2c659301ebbe","Type":"ContainerStarted","Data":"ad21df46661f45af9b9df734afa45a6d99790a6bff4294048f3a7831e11e2c3e"} Apr 23 16:43:57.528141 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:57.528103 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk" event={"ID":"73853204-7b37-4cdb-a493-2c659301ebbe","Type":"ContainerStarted","Data":"f22d81f695f326024f1c6f5fca3a7d0e06271db3b93e7a60548949a3e162444f"} Apr 23 16:43:57.528514 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:57.528246 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk" Apr 23 16:43:57.547987 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:43:57.547937 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk" podStartSLOduration=0.966413753 podStartE2EDuration="3.547923123s" podCreationTimestamp="2026-04-23 16:43:54 +0000 UTC" firstStartedPulling="2026-04-23 16:43:54.756946604 +0000 UTC m=+540.446732927" lastFinishedPulling="2026-04-23 16:43:57.33845597 +0000 UTC m=+543.028242297" observedRunningTime="2026-04-23 16:43:57.546584122 +0000 UTC m=+543.236370467" watchObservedRunningTime="2026-04-23 16:43:57.547923123 +0000 UTC m=+543.237709490" Apr 23 16:44:03.536377 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:03.536345 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk" Apr 23 16:44:04.384898 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:04.384863 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk"] Apr 23 16:44:04.385139 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:04.385100 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk" podUID="73853204-7b37-4cdb-a493-2c659301ebbe" containerName="model-chainer-raw-96835" containerID="cri-o://f22d81f695f326024f1c6f5fca3a7d0e06271db3b93e7a60548949a3e162444f" gracePeriod=30 Apr 23 16:44:04.567980 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:04.567949 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9"] Apr 23 16:44:04.568368 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:04.568215 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9" podUID="d755ee27-2178-439f-ad75-5175f4b374e1" containerName="kserve-container" containerID="cri-o://f7d4485b55842ec3abd7778a6b3bacdeb60895310b385e560f2e0109893612fd" gracePeriod=30 Apr 23 16:44:04.624373 
ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:04.624343 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr"] Apr 23 16:44:04.627688 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:04.627669 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr" Apr 23 16:44:04.651714 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:04.651675 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr"] Apr 23 16:44:04.715100 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:04.715075 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8df1030-fcb2-4b7c-a9fa-69735a57dc2e-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr\" (UID: \"a8df1030-fcb2-4b7c-a9fa-69735a57dc2e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr" Apr 23 16:44:04.726504 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:04.726480 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx"] Apr 23 16:44:04.729950 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:04.729934 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx"
Apr 23 16:44:04.745455 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:04.745436 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx"]
Apr 23 16:44:04.816235 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:04.816198 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8df1030-fcb2-4b7c-a9fa-69735a57dc2e-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr\" (UID: \"a8df1030-fcb2-4b7c-a9fa-69735a57dc2e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr"
Apr 23 16:44:04.816417 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:04.816341 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/854a4946-6a5a-4130-bcd1-565998483712-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx\" (UID: \"854a4946-6a5a-4130-bcd1-565998483712\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx"
Apr 23 16:44:04.816617 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:04.816593 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8df1030-fcb2-4b7c-a9fa-69735a57dc2e-kserve-provision-location\") pod \"isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr\" (UID: \"a8df1030-fcb2-4b7c-a9fa-69735a57dc2e\") " pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr"
Apr 23 16:44:04.866839 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:04.866814 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6"]
Apr 23 16:44:04.867166 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:04.867138 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6" podUID="edc93650-3c52-4975-b764-f8211583c4dc" containerName="kserve-container" containerID="cri-o://4dba521e2b5962ebc2e051ecd8bb5af79d4b73c9cce662ac16b78064c2d4fbd1" gracePeriod=30
Apr 23 16:44:04.917124 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:04.917045 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/854a4946-6a5a-4130-bcd1-565998483712-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx\" (UID: \"854a4946-6a5a-4130-bcd1-565998483712\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx"
Apr 23 16:44:04.917382 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:04.917365 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/854a4946-6a5a-4130-bcd1-565998483712-kserve-provision-location\") pod \"isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx\" (UID: \"854a4946-6a5a-4130-bcd1-565998483712\") " pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx"
Apr 23 16:44:04.937387 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:04.937370 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr"
Apr 23 16:44:05.039218 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:05.039189 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx"
Apr 23 16:44:05.059116 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:05.059093 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr"]
Apr 23 16:44:05.061285 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:44:05.061254 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8df1030_fcb2_4b7c_a9fa_69735a57dc2e.slice/crio-5709d3d4e0105ad563bc6280452fef33cda4250f74c327eaa9e5933f596a86fe WatchSource:0}: Error finding container 5709d3d4e0105ad563bc6280452fef33cda4250f74c327eaa9e5933f596a86fe: Status 404 returned error can't find the container with id 5709d3d4e0105ad563bc6280452fef33cda4250f74c327eaa9e5933f596a86fe
Apr 23 16:44:05.166670 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:05.166507 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx"]
Apr 23 16:44:05.169182 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:44:05.169117 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod854a4946_6a5a_4130_bcd1_565998483712.slice/crio-0dde51958a1cb882a828a0a88aa588adcedc3ff42bbcba853c5121e180950be5 WatchSource:0}: Error finding container 0dde51958a1cb882a828a0a88aa588adcedc3ff42bbcba853c5121e180950be5: Status 404 returned error can't find the container with id 0dde51958a1cb882a828a0a88aa588adcedc3ff42bbcba853c5121e180950be5
Apr 23 16:44:05.556050 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:05.555953 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr" event={"ID":"a8df1030-fcb2-4b7c-a9fa-69735a57dc2e","Type":"ContainerStarted","Data":"9588826c4ed8169b4360ac770a718fe57b546f88db792035ecd1a2eebc69aefe"}
Apr 23 16:44:05.556050 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:05.555999 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr" event={"ID":"a8df1030-fcb2-4b7c-a9fa-69735a57dc2e","Type":"ContainerStarted","Data":"5709d3d4e0105ad563bc6280452fef33cda4250f74c327eaa9e5933f596a86fe"}
Apr 23 16:44:05.557530 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:05.557507 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx" event={"ID":"854a4946-6a5a-4130-bcd1-565998483712","Type":"ContainerStarted","Data":"cfb96e2d4f0a467fba4bab2916c838bdf6cfd65ab45e3f00c77b68538170f3a9"}
Apr 23 16:44:05.557530 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:05.557533 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx" event={"ID":"854a4946-6a5a-4130-bcd1-565998483712","Type":"ContainerStarted","Data":"0dde51958a1cb882a828a0a88aa588adcedc3ff42bbcba853c5121e180950be5"}
Apr 23 16:44:08.416681 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:08.416659 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6"
Apr 23 16:44:08.535310 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:08.535221 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk" podUID="73853204-7b37-4cdb-a493-2c659301ebbe" containerName="model-chainer-raw-96835" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:44:08.543306 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:08.543284 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/edc93650-3c52-4975-b764-f8211583c4dc-kserve-provision-location\") pod \"edc93650-3c52-4975-b764-f8211583c4dc\" (UID: \"edc93650-3c52-4975-b764-f8211583c4dc\") "
Apr 23 16:44:08.543557 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:08.543534 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edc93650-3c52-4975-b764-f8211583c4dc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "edc93650-3c52-4975-b764-f8211583c4dc" (UID: "edc93650-3c52-4975-b764-f8211583c4dc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:44:08.567563 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:08.567536 2573 generic.go:358] "Generic (PLEG): container finished" podID="edc93650-3c52-4975-b764-f8211583c4dc" containerID="4dba521e2b5962ebc2e051ecd8bb5af79d4b73c9cce662ac16b78064c2d4fbd1" exitCode=0
Apr 23 16:44:08.567659 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:08.567608 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6"
Apr 23 16:44:08.567659 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:08.567623 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6" event={"ID":"edc93650-3c52-4975-b764-f8211583c4dc","Type":"ContainerDied","Data":"4dba521e2b5962ebc2e051ecd8bb5af79d4b73c9cce662ac16b78064c2d4fbd1"}
Apr 23 16:44:08.567794 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:08.567661 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6" event={"ID":"edc93650-3c52-4975-b764-f8211583c4dc","Type":"ContainerDied","Data":"bf6e5ce81122bd5600ab0e6c8de54f21493d81187d7c67c97ea075b82ab0b2b7"}
Apr 23 16:44:08.567794 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:08.567677 2573 scope.go:117] "RemoveContainer" containerID="4dba521e2b5962ebc2e051ecd8bb5af79d4b73c9cce662ac16b78064c2d4fbd1"
Apr 23 16:44:08.618387 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:08.618368 2573 scope.go:117] "RemoveContainer" containerID="efb678cb42b711fe2784c7acbc287739c26c25da28628b5c2d929913d1c3a001"
Apr 23 16:44:08.625191 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:08.625170 2573 scope.go:117] "RemoveContainer" containerID="4dba521e2b5962ebc2e051ecd8bb5af79d4b73c9cce662ac16b78064c2d4fbd1"
Apr 23 16:44:08.625472 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:44:08.625449 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4dba521e2b5962ebc2e051ecd8bb5af79d4b73c9cce662ac16b78064c2d4fbd1\": container with ID starting with 4dba521e2b5962ebc2e051ecd8bb5af79d4b73c9cce662ac16b78064c2d4fbd1 not found: ID does not exist" containerID="4dba521e2b5962ebc2e051ecd8bb5af79d4b73c9cce662ac16b78064c2d4fbd1"
Apr 23 16:44:08.625539 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:08.625488 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4dba521e2b5962ebc2e051ecd8bb5af79d4b73c9cce662ac16b78064c2d4fbd1"} err="failed to get container status \"4dba521e2b5962ebc2e051ecd8bb5af79d4b73c9cce662ac16b78064c2d4fbd1\": rpc error: code = NotFound desc = could not find container \"4dba521e2b5962ebc2e051ecd8bb5af79d4b73c9cce662ac16b78064c2d4fbd1\": container with ID starting with 4dba521e2b5962ebc2e051ecd8bb5af79d4b73c9cce662ac16b78064c2d4fbd1 not found: ID does not exist"
Apr 23 16:44:08.625539 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:08.625511 2573 scope.go:117] "RemoveContainer" containerID="efb678cb42b711fe2784c7acbc287739c26c25da28628b5c2d929913d1c3a001"
Apr 23 16:44:08.625844 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:44:08.625813 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"efb678cb42b711fe2784c7acbc287739c26c25da28628b5c2d929913d1c3a001\": container with ID starting with efb678cb42b711fe2784c7acbc287739c26c25da28628b5c2d929913d1c3a001 not found: ID does not exist" containerID="efb678cb42b711fe2784c7acbc287739c26c25da28628b5c2d929913d1c3a001"
Apr 23 16:44:08.625913 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:08.625847 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"efb678cb42b711fe2784c7acbc287739c26c25da28628b5c2d929913d1c3a001"} err="failed to get container status \"efb678cb42b711fe2784c7acbc287739c26c25da28628b5c2d929913d1c3a001\": rpc error: code = NotFound desc = could not find container \"efb678cb42b711fe2784c7acbc287739c26c25da28628b5c2d929913d1c3a001\": container with ID starting with efb678cb42b711fe2784c7acbc287739c26c25da28628b5c2d929913d1c3a001 not found: ID does not exist"
Apr 23 16:44:08.627136 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:08.627116 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6"]
Apr 23 16:44:08.630556 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:08.630533 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-96835-predictor-786b8577d7-2g5q6"]
Apr 23 16:44:08.644614 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:08.644594 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/edc93650-3c52-4975-b764-f8211583c4dc-kserve-provision-location\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\""
Apr 23 16:44:08.706108 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:08.706090 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9"
Apr 23 16:44:08.846238 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:08.846148 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d755ee27-2178-439f-ad75-5175f4b374e1-kserve-provision-location\") pod \"d755ee27-2178-439f-ad75-5175f4b374e1\" (UID: \"d755ee27-2178-439f-ad75-5175f4b374e1\") "
Apr 23 16:44:08.846510 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:08.846483 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d755ee27-2178-439f-ad75-5175f4b374e1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d755ee27-2178-439f-ad75-5175f4b374e1" (UID: "d755ee27-2178-439f-ad75-5175f4b374e1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:44:08.880435 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:08.880402 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edc93650-3c52-4975-b764-f8211583c4dc" path="/var/lib/kubelet/pods/edc93650-3c52-4975-b764-f8211583c4dc/volumes"
Apr 23 16:44:08.947116 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:08.947090 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d755ee27-2178-439f-ad75-5175f4b374e1-kserve-provision-location\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\""
Apr 23 16:44:09.571245 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:09.571160 2573 generic.go:358] "Generic (PLEG): container finished" podID="854a4946-6a5a-4130-bcd1-565998483712" containerID="cfb96e2d4f0a467fba4bab2916c838bdf6cfd65ab45e3f00c77b68538170f3a9" exitCode=0
Apr 23 16:44:09.571672 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:09.571240 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx" event={"ID":"854a4946-6a5a-4130-bcd1-565998483712","Type":"ContainerDied","Data":"cfb96e2d4f0a467fba4bab2916c838bdf6cfd65ab45e3f00c77b68538170f3a9"}
Apr 23 16:44:09.572634 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:09.572612 2573 generic.go:358] "Generic (PLEG): container finished" podID="d755ee27-2178-439f-ad75-5175f4b374e1" containerID="f7d4485b55842ec3abd7778a6b3bacdeb60895310b385e560f2e0109893612fd" exitCode=0
Apr 23 16:44:09.572732 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:09.572672 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9"
Apr 23 16:44:09.572732 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:09.572682 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9" event={"ID":"d755ee27-2178-439f-ad75-5175f4b374e1","Type":"ContainerDied","Data":"f7d4485b55842ec3abd7778a6b3bacdeb60895310b385e560f2e0109893612fd"}
Apr 23 16:44:09.572846 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:09.572741 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9" event={"ID":"d755ee27-2178-439f-ad75-5175f4b374e1","Type":"ContainerDied","Data":"ef3c49c61138e26b0d8e074ee4921a1c226a6bbcc53d899d45e24934c764a10e"}
Apr 23 16:44:09.572846 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:09.572758 2573 scope.go:117] "RemoveContainer" containerID="f7d4485b55842ec3abd7778a6b3bacdeb60895310b385e560f2e0109893612fd"
Apr 23 16:44:09.573930 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:09.573911 2573 generic.go:358] "Generic (PLEG): container finished" podID="a8df1030-fcb2-4b7c-a9fa-69735a57dc2e" containerID="9588826c4ed8169b4360ac770a718fe57b546f88db792035ecd1a2eebc69aefe" exitCode=0
Apr 23 16:44:09.574023 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:09.573980 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr" event={"ID":"a8df1030-fcb2-4b7c-a9fa-69735a57dc2e","Type":"ContainerDied","Data":"9588826c4ed8169b4360ac770a718fe57b546f88db792035ecd1a2eebc69aefe"}
Apr 23 16:44:09.580732 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:09.580651 2573 scope.go:117] "RemoveContainer" containerID="530a68a45deb8f888eb66e5e9422ce79ebf09724a70bb351bd76833b0e8f0940"
Apr 23 16:44:09.589005 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:09.588984 2573 scope.go:117] "RemoveContainer" containerID="f7d4485b55842ec3abd7778a6b3bacdeb60895310b385e560f2e0109893612fd"
Apr 23 16:44:09.589268 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:44:09.589250 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7d4485b55842ec3abd7778a6b3bacdeb60895310b385e560f2e0109893612fd\": container with ID starting with f7d4485b55842ec3abd7778a6b3bacdeb60895310b385e560f2e0109893612fd not found: ID does not exist" containerID="f7d4485b55842ec3abd7778a6b3bacdeb60895310b385e560f2e0109893612fd"
Apr 23 16:44:09.589328 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:09.589276 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7d4485b55842ec3abd7778a6b3bacdeb60895310b385e560f2e0109893612fd"} err="failed to get container status \"f7d4485b55842ec3abd7778a6b3bacdeb60895310b385e560f2e0109893612fd\": rpc error: code = NotFound desc = could not find container \"f7d4485b55842ec3abd7778a6b3bacdeb60895310b385e560f2e0109893612fd\": container with ID starting with f7d4485b55842ec3abd7778a6b3bacdeb60895310b385e560f2e0109893612fd not found: ID does not exist"
Apr 23 16:44:09.589328 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:09.589292 2573 scope.go:117] "RemoveContainer" containerID="530a68a45deb8f888eb66e5e9422ce79ebf09724a70bb351bd76833b0e8f0940"
Apr 23 16:44:09.589521 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:44:09.589506 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"530a68a45deb8f888eb66e5e9422ce79ebf09724a70bb351bd76833b0e8f0940\": container with ID starting with 530a68a45deb8f888eb66e5e9422ce79ebf09724a70bb351bd76833b0e8f0940 not found: ID does not exist" containerID="530a68a45deb8f888eb66e5e9422ce79ebf09724a70bb351bd76833b0e8f0940"
Apr 23 16:44:09.589563 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:09.589525 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"530a68a45deb8f888eb66e5e9422ce79ebf09724a70bb351bd76833b0e8f0940"} err="failed to get container status \"530a68a45deb8f888eb66e5e9422ce79ebf09724a70bb351bd76833b0e8f0940\": rpc error: code = NotFound desc = could not find container \"530a68a45deb8f888eb66e5e9422ce79ebf09724a70bb351bd76833b0e8f0940\": container with ID starting with 530a68a45deb8f888eb66e5e9422ce79ebf09724a70bb351bd76833b0e8f0940 not found: ID does not exist"
Apr 23 16:44:09.597217 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:09.597175 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9"]
Apr 23 16:44:09.598678 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:09.598656 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-96835-predictor-765ddcd4f4-svcw9"]
Apr 23 16:44:10.579784 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:10.579743 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr" event={"ID":"a8df1030-fcb2-4b7c-a9fa-69735a57dc2e","Type":"ContainerStarted","Data":"9c83266d7b3202943f5c7d248a644e789d7868f968e31e2938f1d7d3ca1dae67"}
Apr 23 16:44:10.580263 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:10.580099 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr"
Apr 23 16:44:10.581373 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:10.581352 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx" event={"ID":"854a4946-6a5a-4130-bcd1-565998483712","Type":"ContainerStarted","Data":"f66ca3c18dec30ca4b66b7a467b9d2b0a3cf875213b384877ac20e69f65e3a09"}
Apr 23 16:44:10.581468 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:10.581448 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr" podUID="a8df1030-fcb2-4b7c-a9fa-69735a57dc2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused"
Apr 23 16:44:10.581642 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:10.581627 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx"
Apr 23 16:44:10.582452 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:10.582431 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx" podUID="854a4946-6a5a-4130-bcd1-565998483712" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 23 16:44:10.595896 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:10.595836 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr" podStartSLOduration=6.595821484 podStartE2EDuration="6.595821484s" podCreationTimestamp="2026-04-23 16:44:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:44:10.595219112 +0000 UTC m=+556.285005457" watchObservedRunningTime="2026-04-23 16:44:10.595821484 +0000 UTC m=+556.285607828"
Apr 23 16:44:10.611163 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:10.611126 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx" podStartSLOduration=6.611114848 podStartE2EDuration="6.611114848s" podCreationTimestamp="2026-04-23 16:44:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:44:10.609269609 +0000 UTC m=+556.299055955" watchObservedRunningTime="2026-04-23 16:44:10.611114848 +0000 UTC m=+556.300901193"
Apr 23 16:44:10.879572 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:10.879496 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d755ee27-2178-439f-ad75-5175f4b374e1" path="/var/lib/kubelet/pods/d755ee27-2178-439f-ad75-5175f4b374e1/volumes"
Apr 23 16:44:11.584566 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:11.584522 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx" podUID="854a4946-6a5a-4130-bcd1-565998483712" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 23 16:44:11.585017 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:11.584594 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr" podUID="a8df1030-fcb2-4b7c-a9fa-69735a57dc2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused"
Apr 23 16:44:13.535242 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:13.535204 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk" podUID="73853204-7b37-4cdb-a493-2c659301ebbe" containerName="model-chainer-raw-96835" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:44:18.535175 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:18.535130 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk" podUID="73853204-7b37-4cdb-a493-2c659301ebbe" containerName="model-chainer-raw-96835" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:44:18.535549 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:18.535250 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk"
Apr 23 16:44:21.585063 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:21.585024 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx" podUID="854a4946-6a5a-4130-bcd1-565998483712" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 23 16:44:21.585428 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:21.585023 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr" podUID="a8df1030-fcb2-4b7c-a9fa-69735a57dc2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused"
Apr 23 16:44:23.535279 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:23.535243 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk" podUID="73853204-7b37-4cdb-a493-2c659301ebbe" containerName="model-chainer-raw-96835" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:44:28.535496 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:28.535451 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk" podUID="73853204-7b37-4cdb-a493-2c659301ebbe" containerName="model-chainer-raw-96835" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:44:31.585289 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:31.585251 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr" podUID="a8df1030-fcb2-4b7c-a9fa-69735a57dc2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused"
Apr 23 16:44:31.585664 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:31.585248 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx" podUID="854a4946-6a5a-4130-bcd1-565998483712" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 23 16:44:33.535530 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:33.535494 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk" podUID="73853204-7b37-4cdb-a493-2c659301ebbe" containerName="model-chainer-raw-96835" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 23 16:44:34.521736 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:34.521715 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk"
Apr 23 16:44:34.553819 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:34.553789 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73853204-7b37-4cdb-a493-2c659301ebbe-openshift-service-ca-bundle\") pod \"73853204-7b37-4cdb-a493-2c659301ebbe\" (UID: \"73853204-7b37-4cdb-a493-2c659301ebbe\") "
Apr 23 16:44:34.554165 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:34.553883 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73853204-7b37-4cdb-a493-2c659301ebbe-proxy-tls\") pod \"73853204-7b37-4cdb-a493-2c659301ebbe\" (UID: \"73853204-7b37-4cdb-a493-2c659301ebbe\") "
Apr 23 16:44:34.554165 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:34.554092 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73853204-7b37-4cdb-a493-2c659301ebbe-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "73853204-7b37-4cdb-a493-2c659301ebbe" (UID: "73853204-7b37-4cdb-a493-2c659301ebbe"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:44:34.555894 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:34.555873 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/73853204-7b37-4cdb-a493-2c659301ebbe-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "73853204-7b37-4cdb-a493-2c659301ebbe" (UID: "73853204-7b37-4cdb-a493-2c659301ebbe"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 16:44:34.652806 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:34.652771 2573 generic.go:358] "Generic (PLEG): container finished" podID="73853204-7b37-4cdb-a493-2c659301ebbe" containerID="f22d81f695f326024f1c6f5fca3a7d0e06271db3b93e7a60548949a3e162444f" exitCode=0
Apr 23 16:44:34.653000 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:34.652841 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk"
Apr 23 16:44:34.653000 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:34.652860 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk" event={"ID":"73853204-7b37-4cdb-a493-2c659301ebbe","Type":"ContainerDied","Data":"f22d81f695f326024f1c6f5fca3a7d0e06271db3b93e7a60548949a3e162444f"}
Apr 23 16:44:34.653000 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:34.652911 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk" event={"ID":"73853204-7b37-4cdb-a493-2c659301ebbe","Type":"ContainerDied","Data":"ad21df46661f45af9b9df734afa45a6d99790a6bff4294048f3a7831e11e2c3e"}
Apr 23 16:44:34.653000 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:34.652928 2573 scope.go:117] "RemoveContainer" containerID="f22d81f695f326024f1c6f5fca3a7d0e06271db3b93e7a60548949a3e162444f"
Apr 23 16:44:34.654468 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:34.654444 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/73853204-7b37-4cdb-a493-2c659301ebbe-proxy-tls\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\""
Apr 23 16:44:34.654567 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:34.654477 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/73853204-7b37-4cdb-a493-2c659301ebbe-openshift-service-ca-bundle\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\""
Apr 23 16:44:34.660816 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:34.660796 2573 scope.go:117] "RemoveContainer" containerID="f22d81f695f326024f1c6f5fca3a7d0e06271db3b93e7a60548949a3e162444f"
Apr 23 16:44:34.661067 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:44:34.661049 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f22d81f695f326024f1c6f5fca3a7d0e06271db3b93e7a60548949a3e162444f\": container with ID starting with f22d81f695f326024f1c6f5fca3a7d0e06271db3b93e7a60548949a3e162444f not found: ID does not exist" containerID="f22d81f695f326024f1c6f5fca3a7d0e06271db3b93e7a60548949a3e162444f"
Apr 23 16:44:34.661125 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:34.661074 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f22d81f695f326024f1c6f5fca3a7d0e06271db3b93e7a60548949a3e162444f"} err="failed to get container status \"f22d81f695f326024f1c6f5fca3a7d0e06271db3b93e7a60548949a3e162444f\": rpc error: code = NotFound desc = could not find container \"f22d81f695f326024f1c6f5fca3a7d0e06271db3b93e7a60548949a3e162444f\": container with ID starting with f22d81f695f326024f1c6f5fca3a7d0e06271db3b93e7a60548949a3e162444f not found: ID does not exist"
Apr 23 16:44:34.675130 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:34.675111 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk"]
Apr 23 16:44:34.683009 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:34.682990 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-96835-69c94ddcbc-qmgwk"]
Apr 23 16:44:34.879865 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:34.879833 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73853204-7b37-4cdb-a493-2c659301ebbe" path="/var/lib/kubelet/pods/73853204-7b37-4cdb-a493-2c659301ebbe/volumes"
Apr 23 16:44:41.585212 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:41.585173 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr" podUID="a8df1030-fcb2-4b7c-a9fa-69735a57dc2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused"
Apr 23 16:44:41.585572 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:41.585173 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx" podUID="854a4946-6a5a-4130-bcd1-565998483712" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 23 16:44:51.584891 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:51.584843 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr" podUID="a8df1030-fcb2-4b7c-a9fa-69735a57dc2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused"
Apr 23 16:44:51.585277 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:51.584855 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx" podUID="854a4946-6a5a-4130-bcd1-565998483712" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 23 16:44:54.753351 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:54.753324 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wd2cz_52988e90-484a-49cd-98f6-5510a28890d6/ovn-acl-logging/0.log"
Apr 23 16:44:54.753799 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:44:54.753667 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wd2cz_52988e90-484a-49cd-98f6-5510a28890d6/ovn-acl-logging/0.log"
Apr 23 16:45:01.585429 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:01.585376 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr" podUID="a8df1030-fcb2-4b7c-a9fa-69735a57dc2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused"
Apr 23 16:45:01.585921 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:01.585376 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx" podUID="854a4946-6a5a-4130-bcd1-565998483712" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.21:8080: connect: connection refused"
Apr 23 16:45:11.585341 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:11.585242 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr" podUID="a8df1030-fcb2-4b7c-a9fa-69735a57dc2e" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.20:8080: connect: connection refused"
Apr 23 16:45:11.585879 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:11.585812 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx"
Apr 23 16:45:21.585891 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:21.585856 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr"
Apr 23 16:45:34.641444 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.641415 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7"]
Apr 23 16:45:34.641861 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.641680 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d755ee27-2178-439f-ad75-5175f4b374e1" containerName="storage-initializer"
Apr 23 16:45:34.641861 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.641706 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d755ee27-2178-439f-ad75-5175f4b374e1" containerName="storage-initializer"
Apr 23 16:45:34.641861 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.641719 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="73853204-7b37-4cdb-a493-2c659301ebbe" containerName="model-chainer-raw-96835"
Apr 23 16:45:34.641861 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.641728 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="73853204-7b37-4cdb-a493-2c659301ebbe" containerName="model-chainer-raw-96835"
Apr 23 16:45:34.641861 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.641749 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edc93650-3c52-4975-b764-f8211583c4dc" containerName="storage-initializer"
Apr 23 16:45:34.641861 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.641755 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc93650-3c52-4975-b764-f8211583c4dc" containerName="storage-initializer"
Apr 23 16:45:34.641861 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.641765 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d755ee27-2178-439f-ad75-5175f4b374e1" containerName="kserve-container"
Apr 23 16:45:34.641861 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.641771 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="d755ee27-2178-439f-ad75-5175f4b374e1" containerName="kserve-container"
Apr 23 16:45:34.641861 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.641781 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edc93650-3c52-4975-b764-f8211583c4dc" containerName="kserve-container"
Apr 23 16:45:34.641861 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.641787 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="edc93650-3c52-4975-b764-f8211583c4dc" containerName="kserve-container"
Apr 23 16:45:34.641861 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.641831 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="edc93650-3c52-4975-b764-f8211583c4dc" containerName="kserve-container"
Apr 23 16:45:34.641861 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.641840 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="73853204-7b37-4cdb-a493-2c659301ebbe" containerName="model-chainer-raw-96835" Apr 23 16:45:34.641861 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.641845 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="d755ee27-2178-439f-ad75-5175f4b374e1" containerName="kserve-container" Apr 23 16:45:34.644541 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.644526 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" Apr 23 16:45:34.646591 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.646571 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-016c1-kube-rbac-proxy-sar-config\"" Apr 23 16:45:34.647349 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.647334 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 16:45:34.647349 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.647341 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"model-chainer-raw-hpa-016c1-serving-cert\"" Apr 23 16:45:34.653521 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.653501 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7"] Apr 23 16:45:34.718547 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.718521 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c54a407c-b99a-49ff-b972-ba01a4c366d3-proxy-tls\") pod \"model-chainer-raw-hpa-016c1-68d557966c-8cbj7\" (UID: \"c54a407c-b99a-49ff-b972-ba01a4c366d3\") " 
pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" Apr 23 16:45:34.718682 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.718568 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c54a407c-b99a-49ff-b972-ba01a4c366d3-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-016c1-68d557966c-8cbj7\" (UID: \"c54a407c-b99a-49ff-b972-ba01a4c366d3\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" Apr 23 16:45:34.819016 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.818986 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c54a407c-b99a-49ff-b972-ba01a4c366d3-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-016c1-68d557966c-8cbj7\" (UID: \"c54a407c-b99a-49ff-b972-ba01a4c366d3\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" Apr 23 16:45:34.819155 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.819066 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c54a407c-b99a-49ff-b972-ba01a4c366d3-proxy-tls\") pod \"model-chainer-raw-hpa-016c1-68d557966c-8cbj7\" (UID: \"c54a407c-b99a-49ff-b972-ba01a4c366d3\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" Apr 23 16:45:34.819847 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.819810 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c54a407c-b99a-49ff-b972-ba01a4c366d3-openshift-service-ca-bundle\") pod \"model-chainer-raw-hpa-016c1-68d557966c-8cbj7\" (UID: \"c54a407c-b99a-49ff-b972-ba01a4c366d3\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" Apr 23 16:45:34.821432 ip-10-0-137-14 kubenswrapper[2573]: 
I0423 16:45:34.821410 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c54a407c-b99a-49ff-b972-ba01a4c366d3-proxy-tls\") pod \"model-chainer-raw-hpa-016c1-68d557966c-8cbj7\" (UID: \"c54a407c-b99a-49ff-b972-ba01a4c366d3\") " pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" Apr 23 16:45:34.954788 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:34.954766 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" Apr 23 16:45:35.081675 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:35.081648 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7"] Apr 23 16:45:35.085253 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:45:35.085223 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc54a407c_b99a_49ff_b972_ba01a4c366d3.slice/crio-7f4936a6a20b61c9ab53b98a428117f4ca9267089c44ef3f73a32cc757d51699 WatchSource:0}: Error finding container 7f4936a6a20b61c9ab53b98a428117f4ca9267089c44ef3f73a32cc757d51699: Status 404 returned error can't find the container with id 7f4936a6a20b61c9ab53b98a428117f4ca9267089c44ef3f73a32cc757d51699 Apr 23 16:45:35.087414 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:35.087401 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:45:35.824968 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:35.824928 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" event={"ID":"c54a407c-b99a-49ff-b972-ba01a4c366d3","Type":"ContainerStarted","Data":"7cdf2b63475dde137d69751b8488dcd6e9b6c9e2dada9adcc0a704b742758ff9"} Apr 23 16:45:35.824968 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:35.824967 2573 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" event={"ID":"c54a407c-b99a-49ff-b972-ba01a4c366d3","Type":"ContainerStarted","Data":"7f4936a6a20b61c9ab53b98a428117f4ca9267089c44ef3f73a32cc757d51699"} Apr 23 16:45:35.825469 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:35.825059 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" Apr 23 16:45:35.839421 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:35.839380 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" podStartSLOduration=1.839367963 podStartE2EDuration="1.839367963s" podCreationTimestamp="2026-04-23 16:45:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:45:35.838526374 +0000 UTC m=+641.528312836" watchObservedRunningTime="2026-04-23 16:45:35.839367963 +0000 UTC m=+641.529154309" Apr 23 16:45:41.833117 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:41.833087 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" Apr 23 16:45:44.709072 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:44.709036 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7"] Apr 23 16:45:44.709450 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:44.709230 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" podUID="c54a407c-b99a-49ff-b972-ba01a4c366d3" containerName="model-chainer-raw-hpa-016c1" containerID="cri-o://7cdf2b63475dde137d69751b8488dcd6e9b6c9e2dada9adcc0a704b742758ff9" gracePeriod=30 Apr 23 16:45:44.960455 ip-10-0-137-14 
kubenswrapper[2573]: I0423 16:45:44.960380 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr"] Apr 23 16:45:44.960706 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:44.960667 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr" podUID="a8df1030-fcb2-4b7c-a9fa-69735a57dc2e" containerName="kserve-container" containerID="cri-o://9c83266d7b3202943f5c7d248a644e789d7868f968e31e2938f1d7d3ca1dae67" gracePeriod=30 Apr 23 16:45:45.021913 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:45.021881 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx"] Apr 23 16:45:45.022147 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:45.022126 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx" podUID="854a4946-6a5a-4130-bcd1-565998483712" containerName="kserve-container" containerID="cri-o://f66ca3c18dec30ca4b66b7a467b9d2b0a3cf875213b384877ac20e69f65e3a09" gracePeriod=30 Apr 23 16:45:46.831829 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:46.831782 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" podUID="c54a407c-b99a-49ff-b972-ba01a4c366d3" containerName="model-chainer-raw-hpa-016c1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:45:48.456912 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:48.456880 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx" Apr 23 16:45:48.526062 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:48.525984 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/854a4946-6a5a-4130-bcd1-565998483712-kserve-provision-location\") pod \"854a4946-6a5a-4130-bcd1-565998483712\" (UID: \"854a4946-6a5a-4130-bcd1-565998483712\") " Apr 23 16:45:48.526286 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:48.526263 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/854a4946-6a5a-4130-bcd1-565998483712-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "854a4946-6a5a-4130-bcd1-565998483712" (UID: "854a4946-6a5a-4130-bcd1-565998483712"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:45:48.627322 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:48.627293 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/854a4946-6a5a-4130-bcd1-565998483712-kserve-provision-location\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:45:48.862222 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:48.862124 2573 generic.go:358] "Generic (PLEG): container finished" podID="854a4946-6a5a-4130-bcd1-565998483712" containerID="f66ca3c18dec30ca4b66b7a467b9d2b0a3cf875213b384877ac20e69f65e3a09" exitCode=0 Apr 23 16:45:48.862222 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:48.862172 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx" event={"ID":"854a4946-6a5a-4130-bcd1-565998483712","Type":"ContainerDied","Data":"f66ca3c18dec30ca4b66b7a467b9d2b0a3cf875213b384877ac20e69f65e3a09"} Apr 23 16:45:48.862222 
ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:48.862202 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx" Apr 23 16:45:48.862222 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:48.862214 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx" event={"ID":"854a4946-6a5a-4130-bcd1-565998483712","Type":"ContainerDied","Data":"0dde51958a1cb882a828a0a88aa588adcedc3ff42bbcba853c5121e180950be5"} Apr 23 16:45:48.862496 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:48.862233 2573 scope.go:117] "RemoveContainer" containerID="f66ca3c18dec30ca4b66b7a467b9d2b0a3cf875213b384877ac20e69f65e3a09" Apr 23 16:45:48.870619 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:48.870597 2573 scope.go:117] "RemoveContainer" containerID="cfb96e2d4f0a467fba4bab2916c838bdf6cfd65ab45e3f00c77b68538170f3a9" Apr 23 16:45:48.879951 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:48.879938 2573 scope.go:117] "RemoveContainer" containerID="f66ca3c18dec30ca4b66b7a467b9d2b0a3cf875213b384877ac20e69f65e3a09" Apr 23 16:45:48.880192 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:45:48.880174 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f66ca3c18dec30ca4b66b7a467b9d2b0a3cf875213b384877ac20e69f65e3a09\": container with ID starting with f66ca3c18dec30ca4b66b7a467b9d2b0a3cf875213b384877ac20e69f65e3a09 not found: ID does not exist" containerID="f66ca3c18dec30ca4b66b7a467b9d2b0a3cf875213b384877ac20e69f65e3a09" Apr 23 16:45:48.880299 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:48.880206 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f66ca3c18dec30ca4b66b7a467b9d2b0a3cf875213b384877ac20e69f65e3a09"} err="failed to get container status 
\"f66ca3c18dec30ca4b66b7a467b9d2b0a3cf875213b384877ac20e69f65e3a09\": rpc error: code = NotFound desc = could not find container \"f66ca3c18dec30ca4b66b7a467b9d2b0a3cf875213b384877ac20e69f65e3a09\": container with ID starting with f66ca3c18dec30ca4b66b7a467b9d2b0a3cf875213b384877ac20e69f65e3a09 not found: ID does not exist" Apr 23 16:45:48.880299 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:48.880229 2573 scope.go:117] "RemoveContainer" containerID="cfb96e2d4f0a467fba4bab2916c838bdf6cfd65ab45e3f00c77b68538170f3a9" Apr 23 16:45:48.880542 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:45:48.880510 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfb96e2d4f0a467fba4bab2916c838bdf6cfd65ab45e3f00c77b68538170f3a9\": container with ID starting with cfb96e2d4f0a467fba4bab2916c838bdf6cfd65ab45e3f00c77b68538170f3a9 not found: ID does not exist" containerID="cfb96e2d4f0a467fba4bab2916c838bdf6cfd65ab45e3f00c77b68538170f3a9" Apr 23 16:45:48.880609 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:48.880543 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfb96e2d4f0a467fba4bab2916c838bdf6cfd65ab45e3f00c77b68538170f3a9"} err="failed to get container status \"cfb96e2d4f0a467fba4bab2916c838bdf6cfd65ab45e3f00c77b68538170f3a9\": rpc error: code = NotFound desc = could not find container \"cfb96e2d4f0a467fba4bab2916c838bdf6cfd65ab45e3f00c77b68538170f3a9\": container with ID starting with cfb96e2d4f0a467fba4bab2916c838bdf6cfd65ab45e3f00c77b68538170f3a9 not found: ID does not exist" Apr 23 16:45:48.882141 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:48.882123 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx"] Apr 23 16:45:48.887840 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:48.887822 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/isvc-xgboost-graph-raw-hpa-016c1-predictor-7b6c866ddc-8f9tx"] Apr 23 16:45:49.093080 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:49.093061 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr" Apr 23 16:45:49.232701 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:49.232669 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8df1030-fcb2-4b7c-a9fa-69735a57dc2e-kserve-provision-location\") pod \"a8df1030-fcb2-4b7c-a9fa-69735a57dc2e\" (UID: \"a8df1030-fcb2-4b7c-a9fa-69735a57dc2e\") " Apr 23 16:45:49.232999 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:49.232975 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8df1030-fcb2-4b7c-a9fa-69735a57dc2e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a8df1030-fcb2-4b7c-a9fa-69735a57dc2e" (UID: "a8df1030-fcb2-4b7c-a9fa-69735a57dc2e"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:45:49.333702 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:49.333673 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a8df1030-fcb2-4b7c-a9fa-69735a57dc2e-kserve-provision-location\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:45:49.866933 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:49.866896 2573 generic.go:358] "Generic (PLEG): container finished" podID="a8df1030-fcb2-4b7c-a9fa-69735a57dc2e" containerID="9c83266d7b3202943f5c7d248a644e789d7868f968e31e2938f1d7d3ca1dae67" exitCode=0 Apr 23 16:45:49.867314 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:49.866945 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr" event={"ID":"a8df1030-fcb2-4b7c-a9fa-69735a57dc2e","Type":"ContainerDied","Data":"9c83266d7b3202943f5c7d248a644e789d7868f968e31e2938f1d7d3ca1dae67"} Apr 23 16:45:49.867314 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:49.866966 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr" event={"ID":"a8df1030-fcb2-4b7c-a9fa-69735a57dc2e","Type":"ContainerDied","Data":"5709d3d4e0105ad563bc6280452fef33cda4250f74c327eaa9e5933f596a86fe"} Apr 23 16:45:49.867314 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:49.866981 2573 scope.go:117] "RemoveContainer" containerID="9c83266d7b3202943f5c7d248a644e789d7868f968e31e2938f1d7d3ca1dae67" Apr 23 16:45:49.867314 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:49.866983 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr" Apr 23 16:45:49.875014 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:49.874997 2573 scope.go:117] "RemoveContainer" containerID="9588826c4ed8169b4360ac770a718fe57b546f88db792035ecd1a2eebc69aefe" Apr 23 16:45:49.881415 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:49.881399 2573 scope.go:117] "RemoveContainer" containerID="9c83266d7b3202943f5c7d248a644e789d7868f968e31e2938f1d7d3ca1dae67" Apr 23 16:45:49.881655 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:45:49.881639 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c83266d7b3202943f5c7d248a644e789d7868f968e31e2938f1d7d3ca1dae67\": container with ID starting with 9c83266d7b3202943f5c7d248a644e789d7868f968e31e2938f1d7d3ca1dae67 not found: ID does not exist" containerID="9c83266d7b3202943f5c7d248a644e789d7868f968e31e2938f1d7d3ca1dae67" Apr 23 16:45:49.881778 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:49.881666 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c83266d7b3202943f5c7d248a644e789d7868f968e31e2938f1d7d3ca1dae67"} err="failed to get container status \"9c83266d7b3202943f5c7d248a644e789d7868f968e31e2938f1d7d3ca1dae67\": rpc error: code = NotFound desc = could not find container \"9c83266d7b3202943f5c7d248a644e789d7868f968e31e2938f1d7d3ca1dae67\": container with ID starting with 9c83266d7b3202943f5c7d248a644e789d7868f968e31e2938f1d7d3ca1dae67 not found: ID does not exist" Apr 23 16:45:49.881778 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:49.881709 2573 scope.go:117] "RemoveContainer" containerID="9588826c4ed8169b4360ac770a718fe57b546f88db792035ecd1a2eebc69aefe" Apr 23 16:45:49.881933 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:45:49.881916 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"9588826c4ed8169b4360ac770a718fe57b546f88db792035ecd1a2eebc69aefe\": container with ID starting with 9588826c4ed8169b4360ac770a718fe57b546f88db792035ecd1a2eebc69aefe not found: ID does not exist" containerID="9588826c4ed8169b4360ac770a718fe57b546f88db792035ecd1a2eebc69aefe" Apr 23 16:45:49.881971 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:49.881939 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9588826c4ed8169b4360ac770a718fe57b546f88db792035ecd1a2eebc69aefe"} err="failed to get container status \"9588826c4ed8169b4360ac770a718fe57b546f88db792035ecd1a2eebc69aefe\": rpc error: code = NotFound desc = could not find container \"9588826c4ed8169b4360ac770a718fe57b546f88db792035ecd1a2eebc69aefe\": container with ID starting with 9588826c4ed8169b4360ac770a718fe57b546f88db792035ecd1a2eebc69aefe not found: ID does not exist" Apr 23 16:45:49.887090 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:49.887072 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr"] Apr 23 16:45:49.890096 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:49.890079 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-graph-raw-hpa-016c1-predictor-7787658cd6-t9vzr"] Apr 23 16:45:50.879977 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:50.879943 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="854a4946-6a5a-4130-bcd1-565998483712" path="/var/lib/kubelet/pods/854a4946-6a5a-4130-bcd1-565998483712/volumes" Apr 23 16:45:50.880335 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:50.880269 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8df1030-fcb2-4b7c-a9fa-69735a57dc2e" path="/var/lib/kubelet/pods/a8df1030-fcb2-4b7c-a9fa-69735a57dc2e/volumes" Apr 23 16:45:51.832210 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:51.832174 2573 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" podUID="c54a407c-b99a-49ff-b972-ba01a4c366d3" containerName="model-chainer-raw-hpa-016c1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:45:54.912360 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:54.912320 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22"] Apr 23 16:45:54.912805 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:54.912647 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a8df1030-fcb2-4b7c-a9fa-69735a57dc2e" containerName="kserve-container" Apr 23 16:45:54.917454 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:54.913023 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8df1030-fcb2-4b7c-a9fa-69735a57dc2e" containerName="kserve-container" Apr 23 16:45:54.917454 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:54.913071 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="854a4946-6a5a-4130-bcd1-565998483712" containerName="storage-initializer" Apr 23 16:45:54.917454 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:54.913082 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="854a4946-6a5a-4130-bcd1-565998483712" containerName="storage-initializer" Apr 23 16:45:54.917454 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:54.913109 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="854a4946-6a5a-4130-bcd1-565998483712" containerName="kserve-container" Apr 23 16:45:54.917454 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:54.913117 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="854a4946-6a5a-4130-bcd1-565998483712" containerName="kserve-container" Apr 23 16:45:54.917454 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:54.913211 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="a8df1030-fcb2-4b7c-a9fa-69735a57dc2e" containerName="storage-initializer" Apr 23 16:45:54.917454 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:54.913231 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8df1030-fcb2-4b7c-a9fa-69735a57dc2e" containerName="storage-initializer" Apr 23 16:45:54.917454 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:54.913409 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="854a4946-6a5a-4130-bcd1-565998483712" containerName="kserve-container" Apr 23 16:45:54.917454 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:54.913422 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="a8df1030-fcb2-4b7c-a9fa-69735a57dc2e" containerName="kserve-container" Apr 23 16:45:54.919043 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:54.919018 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" Apr 23 16:45:54.922094 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:54.922070 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22"] Apr 23 16:45:55.073290 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:55.073217 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38654de2-f382-4ee1-b9f1-c05237b9f2da-kserve-provision-location\") pod \"isvc-logger-raw-884e2-predictor-7554bf599d-zvt22\" (UID: \"38654de2-f382-4ee1-b9f1-c05237b9f2da\") " pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" Apr 23 16:45:55.174165 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:55.174133 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38654de2-f382-4ee1-b9f1-c05237b9f2da-kserve-provision-location\") pod 
\"isvc-logger-raw-884e2-predictor-7554bf599d-zvt22\" (UID: \"38654de2-f382-4ee1-b9f1-c05237b9f2da\") " pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" Apr 23 16:45:55.174473 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:55.174455 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38654de2-f382-4ee1-b9f1-c05237b9f2da-kserve-provision-location\") pod \"isvc-logger-raw-884e2-predictor-7554bf599d-zvt22\" (UID: \"38654de2-f382-4ee1-b9f1-c05237b9f2da\") " pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" Apr 23 16:45:55.230002 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:55.229975 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" Apr 23 16:45:55.346791 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:55.346714 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22"] Apr 23 16:45:55.886887 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:55.886845 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" event={"ID":"38654de2-f382-4ee1-b9f1-c05237b9f2da","Type":"ContainerStarted","Data":"1d674261bc90963abc739764de5842eb090a4278ade8cdd29eb31477cbc99eab"} Apr 23 16:45:55.886887 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:55.886885 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" event={"ID":"38654de2-f382-4ee1-b9f1-c05237b9f2da","Type":"ContainerStarted","Data":"9498808daccf50f43241f8283d938257c0f7f24a6875225ddb1c5b1b39237372"} Apr 23 16:45:56.832109 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:56.832073 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" podUID="c54a407c-b99a-49ff-b972-ba01a4c366d3" containerName="model-chainer-raw-hpa-016c1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:45:56.832466 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:56.832184 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" Apr 23 16:45:59.899847 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:59.899814 2573 generic.go:358] "Generic (PLEG): container finished" podID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerID="1d674261bc90963abc739764de5842eb090a4278ade8cdd29eb31477cbc99eab" exitCode=0 Apr 23 16:45:59.900211 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:45:59.899874 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" event={"ID":"38654de2-f382-4ee1-b9f1-c05237b9f2da","Type":"ContainerDied","Data":"1d674261bc90963abc739764de5842eb090a4278ade8cdd29eb31477cbc99eab"} Apr 23 16:46:00.903597 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:00.903563 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" event={"ID":"38654de2-f382-4ee1-b9f1-c05237b9f2da","Type":"ContainerStarted","Data":"03a487100498ba1c437c7cf4d0838e1a6fa79cfd1ca06b83d4c42f146b3624d9"} Apr 23 16:46:00.903597 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:00.903599 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" event={"ID":"38654de2-f382-4ee1-b9f1-c05237b9f2da","Type":"ContainerStarted","Data":"4e322aa589e68014e8fefc14efe0788e83f9f52464db078e648e58837056ba08"} Apr 23 16:46:00.904069 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:00.903955 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" Apr 23 16:46:00.905116 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:00.905091 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 23 16:46:00.920306 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:00.920265 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podStartSLOduration=6.920252695 podStartE2EDuration="6.920252695s" podCreationTimestamp="2026-04-23 16:45:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:46:00.918735397 +0000 UTC m=+666.608521743" watchObservedRunningTime="2026-04-23 16:46:00.920252695 +0000 UTC m=+666.610039041" Apr 23 16:46:01.831720 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:01.831669 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" podUID="c54a407c-b99a-49ff-b972-ba01a4c366d3" containerName="model-chainer-raw-hpa-016c1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:46:01.906281 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:01.906250 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" Apr 23 16:46:01.906685 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:01.906368 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="kserve-container" probeResult="failure" output="dial tcp 
10.134.0.23:8080: connect: connection refused" Apr 23 16:46:01.907376 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:01.907351 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:46:02.908740 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:02.908682 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 23 16:46:02.909123 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:02.909023 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:46:06.832474 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:06.832428 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" podUID="c54a407c-b99a-49ff-b972-ba01a4c366d3" containerName="model-chainer-raw-hpa-016c1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:46:11.832019 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:11.831981 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" podUID="c54a407c-b99a-49ff-b972-ba01a4c366d3" containerName="model-chainer-raw-hpa-016c1" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:46:12.909206 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:12.909162 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 23 16:46:12.909674 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:12.909650 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:46:14.731382 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:46:14.731349 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc54a407c_b99a_49ff_b972_ba01a4c366d3.slice/crio-conmon-7cdf2b63475dde137d69751b8488dcd6e9b6c9e2dada9adcc0a704b742758ff9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc54a407c_b99a_49ff_b972_ba01a4c366d3.slice/crio-7cdf2b63475dde137d69751b8488dcd6e9b6c9e2dada9adcc0a704b742758ff9.scope\": RecentStats: unable to find data in memory cache]" Apr 23 16:46:14.731671 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:46:14.731532 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc54a407c_b99a_49ff_b972_ba01a4c366d3.slice/crio-conmon-7cdf2b63475dde137d69751b8488dcd6e9b6c9e2dada9adcc0a704b742758ff9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc54a407c_b99a_49ff_b972_ba01a4c366d3.slice/crio-7cdf2b63475dde137d69751b8488dcd6e9b6c9e2dada9adcc0a704b742758ff9.scope\": RecentStats: unable to find data in memory cache]" Apr 23 16:46:14.732236 ip-10-0-137-14 
kubenswrapper[2573]: E0423 16:46:14.731960 2573 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc54a407c_b99a_49ff_b972_ba01a4c366d3.slice/crio-7cdf2b63475dde137d69751b8488dcd6e9b6c9e2dada9adcc0a704b742758ff9.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc54a407c_b99a_49ff_b972_ba01a4c366d3.slice/crio-conmon-7cdf2b63475dde137d69751b8488dcd6e9b6c9e2dada9adcc0a704b742758ff9.scope\": RecentStats: unable to find data in memory cache]" Apr 23 16:46:14.854372 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:14.854350 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" Apr 23 16:46:14.940389 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:14.940356 2573 generic.go:358] "Generic (PLEG): container finished" podID="c54a407c-b99a-49ff-b972-ba01a4c366d3" containerID="7cdf2b63475dde137d69751b8488dcd6e9b6c9e2dada9adcc0a704b742758ff9" exitCode=0 Apr 23 16:46:14.940536 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:14.940433 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" Apr 23 16:46:14.940536 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:14.940438 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" event={"ID":"c54a407c-b99a-49ff-b972-ba01a4c366d3","Type":"ContainerDied","Data":"7cdf2b63475dde137d69751b8488dcd6e9b6c9e2dada9adcc0a704b742758ff9"} Apr 23 16:46:14.940536 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:14.940473 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7" event={"ID":"c54a407c-b99a-49ff-b972-ba01a4c366d3","Type":"ContainerDied","Data":"7f4936a6a20b61c9ab53b98a428117f4ca9267089c44ef3f73a32cc757d51699"} Apr 23 16:46:14.940536 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:14.940488 2573 scope.go:117] "RemoveContainer" containerID="7cdf2b63475dde137d69751b8488dcd6e9b6c9e2dada9adcc0a704b742758ff9" Apr 23 16:46:14.948747 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:14.948728 2573 scope.go:117] "RemoveContainer" containerID="7cdf2b63475dde137d69751b8488dcd6e9b6c9e2dada9adcc0a704b742758ff9" Apr 23 16:46:14.949015 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:46:14.948995 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7cdf2b63475dde137d69751b8488dcd6e9b6c9e2dada9adcc0a704b742758ff9\": container with ID starting with 7cdf2b63475dde137d69751b8488dcd6e9b6c9e2dada9adcc0a704b742758ff9 not found: ID does not exist" containerID="7cdf2b63475dde137d69751b8488dcd6e9b6c9e2dada9adcc0a704b742758ff9" Apr 23 16:46:14.949086 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:14.949022 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7cdf2b63475dde137d69751b8488dcd6e9b6c9e2dada9adcc0a704b742758ff9"} err="failed to get container status 
\"7cdf2b63475dde137d69751b8488dcd6e9b6c9e2dada9adcc0a704b742758ff9\": rpc error: code = NotFound desc = could not find container \"7cdf2b63475dde137d69751b8488dcd6e9b6c9e2dada9adcc0a704b742758ff9\": container with ID starting with 7cdf2b63475dde137d69751b8488dcd6e9b6c9e2dada9adcc0a704b742758ff9 not found: ID does not exist" Apr 23 16:46:15.018801 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:15.018743 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c54a407c-b99a-49ff-b972-ba01a4c366d3-openshift-service-ca-bundle\") pod \"c54a407c-b99a-49ff-b972-ba01a4c366d3\" (UID: \"c54a407c-b99a-49ff-b972-ba01a4c366d3\") " Apr 23 16:46:15.018801 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:15.018776 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c54a407c-b99a-49ff-b972-ba01a4c366d3-proxy-tls\") pod \"c54a407c-b99a-49ff-b972-ba01a4c366d3\" (UID: \"c54a407c-b99a-49ff-b972-ba01a4c366d3\") " Apr 23 16:46:15.019078 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:15.019055 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c54a407c-b99a-49ff-b972-ba01a4c366d3-openshift-service-ca-bundle" (OuterVolumeSpecName: "openshift-service-ca-bundle") pod "c54a407c-b99a-49ff-b972-ba01a4c366d3" (UID: "c54a407c-b99a-49ff-b972-ba01a4c366d3"). InnerVolumeSpecName "openshift-service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:46:15.020727 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:15.020705 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c54a407c-b99a-49ff-b972-ba01a4c366d3-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "c54a407c-b99a-49ff-b972-ba01a4c366d3" (UID: "c54a407c-b99a-49ff-b972-ba01a4c366d3"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 16:46:15.119511 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:15.119486 2573 reconciler_common.go:299] "Volume detached for volume \"openshift-service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c54a407c-b99a-49ff-b972-ba01a4c366d3-openshift-service-ca-bundle\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:46:15.119511 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:15.119510 2573 reconciler_common.go:299] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/c54a407c-b99a-49ff-b972-ba01a4c366d3-proxy-tls\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:46:15.259495 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:15.259465 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7"] Apr 23 16:46:15.264024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:15.264004 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/model-chainer-raw-hpa-016c1-68d557966c-8cbj7"] Apr 23 16:46:16.879280 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:16.879247 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c54a407c-b99a-49ff-b972-ba01a4c366d3" path="/var/lib/kubelet/pods/c54a407c-b99a-49ff-b972-ba01a4c366d3/volumes" Apr 23 16:46:22.908680 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:22.908638 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 23 16:46:22.909201 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:22.909182 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" 
podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:46:32.909413 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:32.909371 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 23 16:46:32.909934 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:32.909914 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:46:42.909270 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:42.909164 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 23 16:46:42.909664 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:42.909596 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:46:52.909087 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:52.909035 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 23 16:46:52.909477 
ip-10-0-137-14 kubenswrapper[2573]: I0423 16:46:52.909415 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:47:02.909899 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:02.909863 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" Apr 23 16:47:02.910264 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:02.909977 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" Apr 23 16:47:10.136752 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:10.136719 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22"] Apr 23 16:47:10.137237 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:10.137033 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="kserve-container" containerID="cri-o://4e322aa589e68014e8fefc14efe0788e83f9f52464db078e648e58837056ba08" gracePeriod=30 Apr 23 16:47:10.137237 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:10.137117 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="agent" containerID="cri-o://03a487100498ba1c437c7cf4d0838e1a6fa79cfd1ca06b83d4c42f146b3624d9" gracePeriod=30 Apr 23 16:47:10.145423 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:10.145401 2573 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml"] Apr 23 16:47:10.145739 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:10.145724 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c54a407c-b99a-49ff-b972-ba01a4c366d3" containerName="model-chainer-raw-hpa-016c1" Apr 23 16:47:10.145787 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:10.145742 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54a407c-b99a-49ff-b972-ba01a4c366d3" containerName="model-chainer-raw-hpa-016c1" Apr 23 16:47:10.145822 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:10.145800 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="c54a407c-b99a-49ff-b972-ba01a4c366d3" containerName="model-chainer-raw-hpa-016c1" Apr 23 16:47:10.148556 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:10.148541 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" Apr 23 16:47:10.159345 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:10.159323 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml"] Apr 23 16:47:10.236960 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:10.236927 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b27ec6bc-c9cf-45cb-ae79-ed4801163827-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml\" (UID: \"b27ec6bc-c9cf-45cb-ae79-ed4801163827\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" Apr 23 16:47:10.338260 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:10.338214 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/b27ec6bc-c9cf-45cb-ae79-ed4801163827-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml\" (UID: \"b27ec6bc-c9cf-45cb-ae79-ed4801163827\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" Apr 23 16:47:10.338599 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:10.338576 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b27ec6bc-c9cf-45cb-ae79-ed4801163827-kserve-provision-location\") pod \"isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml\" (UID: \"b27ec6bc-c9cf-45cb-ae79-ed4801163827\") " pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" Apr 23 16:47:10.458096 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:10.458064 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" Apr 23 16:47:10.572241 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:10.572080 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml"] Apr 23 16:47:10.574799 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:47:10.574760 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb27ec6bc_c9cf_45cb_ae79_ed4801163827.slice/crio-b1eedb093551ed625887e0e1ea9a20a6810a75309a2f4d010002f68ec5f06e49 WatchSource:0}: Error finding container b1eedb093551ed625887e0e1ea9a20a6810a75309a2f4d010002f68ec5f06e49: Status 404 returned error can't find the container with id b1eedb093551ed625887e0e1ea9a20a6810a75309a2f4d010002f68ec5f06e49 Apr 23 16:47:11.088081 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:11.088043 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" event={"ID":"b27ec6bc-c9cf-45cb-ae79-ed4801163827","Type":"ContainerStarted","Data":"7456107a478a575de4b123ed9e25f7099f63e8b6d9425f10898272b5cb603bf4"} Apr 23 16:47:11.088081 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:11.088079 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" event={"ID":"b27ec6bc-c9cf-45cb-ae79-ed4801163827","Type":"ContainerStarted","Data":"b1eedb093551ed625887e0e1ea9a20a6810a75309a2f4d010002f68ec5f06e49"} Apr 23 16:47:12.908641 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:12.908600 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 23 16:47:12.909098 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:12.908946 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:47:14.097621 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:14.097592 2573 generic.go:358] "Generic (PLEG): container finished" podID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerID="7456107a478a575de4b123ed9e25f7099f63e8b6d9425f10898272b5cb603bf4" exitCode=0 Apr 23 16:47:14.097621 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:14.097616 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" event={"ID":"b27ec6bc-c9cf-45cb-ae79-ed4801163827","Type":"ContainerDied","Data":"7456107a478a575de4b123ed9e25f7099f63e8b6d9425f10898272b5cb603bf4"} Apr 23 16:47:15.101366 
ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:15.101331 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" event={"ID":"b27ec6bc-c9cf-45cb-ae79-ed4801163827","Type":"ContainerStarted","Data":"2931f316b2d55babb18c33f428c878c86c3f78b4cd41ddc9286b3824bdb687f6"} Apr 23 16:47:15.101799 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:15.101631 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" Apr 23 16:47:15.102854 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:15.102829 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" podUID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 23 16:47:15.103279 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:15.103259 2573 generic.go:358] "Generic (PLEG): container finished" podID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerID="4e322aa589e68014e8fefc14efe0788e83f9f52464db078e648e58837056ba08" exitCode=0 Apr 23 16:47:15.103348 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:15.103327 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" event={"ID":"38654de2-f382-4ee1-b9f1-c05237b9f2da","Type":"ContainerDied","Data":"4e322aa589e68014e8fefc14efe0788e83f9f52464db078e648e58837056ba08"} Apr 23 16:47:15.115825 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:15.115786 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" podStartSLOduration=5.11577395 podStartE2EDuration="5.11577395s" podCreationTimestamp="2026-04-23 16:47:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:47:15.115398133 +0000 UTC m=+740.805184490" watchObservedRunningTime="2026-04-23 16:47:15.11577395 +0000 UTC m=+740.805560296" Apr 23 16:47:16.106178 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:16.106136 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" podUID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 23 16:47:22.909032 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:22.908983 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 23 16:47:22.909418 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:22.909310 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:47:26.106639 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:26.106593 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" podUID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 23 16:47:32.909431 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:32.909389 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" 
containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.23:8080: connect: connection refused" Apr 23 16:47:32.909843 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:32.909535 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" Apr 23 16:47:32.909843 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:32.909721 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="agent" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 16:47:32.909843 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:32.909811 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" Apr 23 16:47:36.106391 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:36.106338 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" podUID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 23 16:47:40.172263 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:40.172222 2573 generic.go:358] "Generic (PLEG): container finished" podID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerID="03a487100498ba1c437c7cf4d0838e1a6fa79cfd1ca06b83d4c42f146b3624d9" exitCode=137 Apr 23 16:47:40.172616 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:40.172297 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" event={"ID":"38654de2-f382-4ee1-b9f1-c05237b9f2da","Type":"ContainerDied","Data":"03a487100498ba1c437c7cf4d0838e1a6fa79cfd1ca06b83d4c42f146b3624d9"} Apr 23 16:47:40.281713 ip-10-0-137-14 
kubenswrapper[2573]: I0423 16:47:40.281670 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" Apr 23 16:47:40.368466 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:40.368424 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38654de2-f382-4ee1-b9f1-c05237b9f2da-kserve-provision-location\") pod \"38654de2-f382-4ee1-b9f1-c05237b9f2da\" (UID: \"38654de2-f382-4ee1-b9f1-c05237b9f2da\") " Apr 23 16:47:40.368824 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:40.368794 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38654de2-f382-4ee1-b9f1-c05237b9f2da-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "38654de2-f382-4ee1-b9f1-c05237b9f2da" (UID: "38654de2-f382-4ee1-b9f1-c05237b9f2da"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:47:40.469011 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:40.468970 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38654de2-f382-4ee1-b9f1-c05237b9f2da-kserve-provision-location\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:47:41.176871 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:41.176829 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" event={"ID":"38654de2-f382-4ee1-b9f1-c05237b9f2da","Type":"ContainerDied","Data":"9498808daccf50f43241f8283d938257c0f7f24a6875225ddb1c5b1b39237372"} Apr 23 16:47:41.177331 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:41.176892 2573 scope.go:117] "RemoveContainer" containerID="03a487100498ba1c437c7cf4d0838e1a6fa79cfd1ca06b83d4c42f146b3624d9" Apr 23 16:47:41.177331 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:41.176953 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22" Apr 23 16:47:41.186790 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:41.186767 2573 scope.go:117] "RemoveContainer" containerID="4e322aa589e68014e8fefc14efe0788e83f9f52464db078e648e58837056ba08" Apr 23 16:47:41.190977 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:41.190952 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22"] Apr 23 16:47:41.194990 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:41.194963 2573 scope.go:117] "RemoveContainer" containerID="1d674261bc90963abc739764de5842eb090a4278ade8cdd29eb31477cbc99eab" Apr 23 16:47:41.196173 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:41.196078 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-logger-raw-884e2-predictor-7554bf599d-zvt22"] Apr 23 16:47:42.880377 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:42.880332 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" path="/var/lib/kubelet/pods/38654de2-f382-4ee1-b9f1-c05237b9f2da/volumes" Apr 23 16:47:46.107061 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:46.107017 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" podUID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 23 16:47:56.106588 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:47:56.106546 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" podUID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 23 16:48:06.107184 ip-10-0-137-14 
kubenswrapper[2573]: I0423 16:48:06.107093 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" podUID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 23 16:48:16.106438 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:48:16.106394 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" podUID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 23 16:48:26.106345 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:48:26.106301 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" podUID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 23 16:48:36.106537 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:48:36.106496 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" podUID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 23 16:48:45.876756 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:48:45.876719 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" podUID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 23 16:48:55.877209 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:48:55.877167 2573 prober.go:120] "Probe failed" 
probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" podUID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 23 16:49:05.877214 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:05.877168 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" podUID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 23 16:49:15.877485 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:15.877439 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" podUID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 23 16:49:25.878073 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:25.878038 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" Apr 23 16:49:30.310481 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:30.310449 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml"] Apr 23 16:49:30.310880 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:30.310750 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" podUID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerName="kserve-container" containerID="cri-o://2931f316b2d55babb18c33f428c878c86c3f78b4cd41ddc9286b3824bdb687f6" gracePeriod=30 Apr 23 16:49:30.403515 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:30.403482 2573 
kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt"] Apr 23 16:49:30.403831 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:30.403818 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="storage-initializer" Apr 23 16:49:30.403882 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:30.403833 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="storage-initializer" Apr 23 16:49:30.403882 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:30.403844 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="kserve-container" Apr 23 16:49:30.403882 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:30.403849 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="kserve-container" Apr 23 16:49:30.403882 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:30.403867 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="agent" Apr 23 16:49:30.403882 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:30.403873 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="agent" Apr 23 16:49:30.404028 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:30.403914 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="kserve-container" Apr 23 16:49:30.404028 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:30.403924 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="38654de2-f382-4ee1-b9f1-c05237b9f2da" containerName="agent" Apr 23 16:49:30.406826 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:30.406811 2573 util.go:30] "No sandbox for 
pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" Apr 23 16:49:30.413648 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:30.413625 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt"] Apr 23 16:49:30.481854 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:30.481833 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2697a8c-e60d-45fb-b91b-c7b9a513dbc3-kserve-provision-location\") pod \"isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt\" (UID: \"b2697a8c-e60d-45fb-b91b-c7b9a513dbc3\") " pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" Apr 23 16:49:30.583180 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:30.583106 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2697a8c-e60d-45fb-b91b-c7b9a513dbc3-kserve-provision-location\") pod \"isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt\" (UID: \"b2697a8c-e60d-45fb-b91b-c7b9a513dbc3\") " pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" Apr 23 16:49:30.583467 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:30.583449 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2697a8c-e60d-45fb-b91b-c7b9a513dbc3-kserve-provision-location\") pod \"isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt\" (UID: \"b2697a8c-e60d-45fb-b91b-c7b9a513dbc3\") " pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" Apr 23 16:49:30.738016 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:30.737988 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" Apr 23 16:49:30.853231 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:30.853154 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt"] Apr 23 16:49:30.856125 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:49:30.856103 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2697a8c_e60d_45fb_b91b_c7b9a513dbc3.slice/crio-e881170b1daae08365d847942c85752498bf0e6eaa585ff35e5f213c634506da WatchSource:0}: Error finding container e881170b1daae08365d847942c85752498bf0e6eaa585ff35e5f213c634506da: Status 404 returned error can't find the container with id e881170b1daae08365d847942c85752498bf0e6eaa585ff35e5f213c634506da Apr 23 16:49:31.479509 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:31.479472 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" event={"ID":"b2697a8c-e60d-45fb-b91b-c7b9a513dbc3","Type":"ContainerStarted","Data":"826c77626e4af399ef91aba65cb2668a807c80ba296b37bc4e00bf8dd0005ab7"} Apr 23 16:49:31.479509 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:31.479511 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" event={"ID":"b2697a8c-e60d-45fb-b91b-c7b9a513dbc3","Type":"ContainerStarted","Data":"e881170b1daae08365d847942c85752498bf0e6eaa585ff35e5f213c634506da"} Apr 23 16:49:34.488855 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:34.488827 2573 generic.go:358] "Generic (PLEG): container finished" podID="b2697a8c-e60d-45fb-b91b-c7b9a513dbc3" containerID="826c77626e4af399ef91aba65cb2668a807c80ba296b37bc4e00bf8dd0005ab7" exitCode=0 Apr 23 16:49:34.489179 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:34.488904 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" event={"ID":"b2697a8c-e60d-45fb-b91b-c7b9a513dbc3","Type":"ContainerDied","Data":"826c77626e4af399ef91aba65cb2668a807c80ba296b37bc4e00bf8dd0005ab7"} Apr 23 16:49:35.493031 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:35.492937 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" event={"ID":"b2697a8c-e60d-45fb-b91b-c7b9a513dbc3","Type":"ContainerStarted","Data":"04ed2e71e7de5180f4cee887fec6d5b72ca96e86ffae1f74c61edf8f2d29494b"} Apr 23 16:49:35.493373 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:35.493265 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" Apr 23 16:49:35.494427 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:35.494401 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" podUID="b2697a8c-e60d-45fb-b91b-c7b9a513dbc3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 23 16:49:35.509510 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:35.509461 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" podStartSLOduration=5.509447746 podStartE2EDuration="5.509447746s" podCreationTimestamp="2026-04-23 16:49:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:49:35.508205369 +0000 UTC m=+881.197991732" watchObservedRunningTime="2026-04-23 16:49:35.509447746 +0000 UTC m=+881.199234097" Apr 23 16:49:35.877320 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:35.877237 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" podUID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.24:8080: connect: connection refused" Apr 23 16:49:36.496351 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:36.496305 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" podUID="b2697a8c-e60d-45fb-b91b-c7b9a513dbc3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 23 16:49:39.054868 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:39.054846 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" Apr 23 16:49:39.146775 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:39.146740 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b27ec6bc-c9cf-45cb-ae79-ed4801163827-kserve-provision-location\") pod \"b27ec6bc-c9cf-45cb-ae79-ed4801163827\" (UID: \"b27ec6bc-c9cf-45cb-ae79-ed4801163827\") " Apr 23 16:49:39.147037 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:39.147012 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b27ec6bc-c9cf-45cb-ae79-ed4801163827-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b27ec6bc-c9cf-45cb-ae79-ed4801163827" (UID: "b27ec6bc-c9cf-45cb-ae79-ed4801163827"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:49:39.247369 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:39.247323 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b27ec6bc-c9cf-45cb-ae79-ed4801163827-kserve-provision-location\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:49:39.505988 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:39.505896 2573 generic.go:358] "Generic (PLEG): container finished" podID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerID="2931f316b2d55babb18c33f428c878c86c3f78b4cd41ddc9286b3824bdb687f6" exitCode=0 Apr 23 16:49:39.505988 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:39.505960 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" event={"ID":"b27ec6bc-c9cf-45cb-ae79-ed4801163827","Type":"ContainerDied","Data":"2931f316b2d55babb18c33f428c878c86c3f78b4cd41ddc9286b3824bdb687f6"} Apr 23 16:49:39.506213 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:39.505989 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" Apr 23 16:49:39.506213 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:39.506007 2573 scope.go:117] "RemoveContainer" containerID="2931f316b2d55babb18c33f428c878c86c3f78b4cd41ddc9286b3824bdb687f6" Apr 23 16:49:39.506213 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:39.505995 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml" event={"ID":"b27ec6bc-c9cf-45cb-ae79-ed4801163827","Type":"ContainerDied","Data":"b1eedb093551ed625887e0e1ea9a20a6810a75309a2f4d010002f68ec5f06e49"} Apr 23 16:49:39.513933 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:39.513761 2573 scope.go:117] "RemoveContainer" containerID="7456107a478a575de4b123ed9e25f7099f63e8b6d9425f10898272b5cb603bf4" Apr 23 16:49:39.520495 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:39.520478 2573 scope.go:117] "RemoveContainer" containerID="2931f316b2d55babb18c33f428c878c86c3f78b4cd41ddc9286b3824bdb687f6" Apr 23 16:49:39.520779 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:49:39.520757 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2931f316b2d55babb18c33f428c878c86c3f78b4cd41ddc9286b3824bdb687f6\": container with ID starting with 2931f316b2d55babb18c33f428c878c86c3f78b4cd41ddc9286b3824bdb687f6 not found: ID does not exist" containerID="2931f316b2d55babb18c33f428c878c86c3f78b4cd41ddc9286b3824bdb687f6" Apr 23 16:49:39.520868 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:39.520783 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2931f316b2d55babb18c33f428c878c86c3f78b4cd41ddc9286b3824bdb687f6"} err="failed to get container status \"2931f316b2d55babb18c33f428c878c86c3f78b4cd41ddc9286b3824bdb687f6\": rpc error: code = NotFound desc = could not find container 
\"2931f316b2d55babb18c33f428c878c86c3f78b4cd41ddc9286b3824bdb687f6\": container with ID starting with 2931f316b2d55babb18c33f428c878c86c3f78b4cd41ddc9286b3824bdb687f6 not found: ID does not exist" Apr 23 16:49:39.520868 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:39.520802 2573 scope.go:117] "RemoveContainer" containerID="7456107a478a575de4b123ed9e25f7099f63e8b6d9425f10898272b5cb603bf4" Apr 23 16:49:39.521043 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:49:39.521027 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7456107a478a575de4b123ed9e25f7099f63e8b6d9425f10898272b5cb603bf4\": container with ID starting with 7456107a478a575de4b123ed9e25f7099f63e8b6d9425f10898272b5cb603bf4 not found: ID does not exist" containerID="7456107a478a575de4b123ed9e25f7099f63e8b6d9425f10898272b5cb603bf4" Apr 23 16:49:39.521083 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:39.521049 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7456107a478a575de4b123ed9e25f7099f63e8b6d9425f10898272b5cb603bf4"} err="failed to get container status \"7456107a478a575de4b123ed9e25f7099f63e8b6d9425f10898272b5cb603bf4\": rpc error: code = NotFound desc = could not find container \"7456107a478a575de4b123ed9e25f7099f63e8b6d9425f10898272b5cb603bf4\": container with ID starting with 7456107a478a575de4b123ed9e25f7099f63e8b6d9425f10898272b5cb603bf4 not found: ID does not exist" Apr 23 16:49:39.525222 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:39.525201 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml"] Apr 23 16:49:39.528565 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:39.528545 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-sklearn-scale-raw-01e93-predictor-67cdd46fcb-ssfml"] Apr 23 16:49:40.879734 ip-10-0-137-14 kubenswrapper[2573]: I0423 
16:49:40.879677 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" path="/var/lib/kubelet/pods/b27ec6bc-c9cf-45cb-ae79-ed4801163827/volumes" Apr 23 16:49:46.496672 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:46.496625 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" podUID="b2697a8c-e60d-45fb-b91b-c7b9a513dbc3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 23 16:49:54.772090 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:54.772060 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wd2cz_52988e90-484a-49cd-98f6-5510a28890d6/ovn-acl-logging/0.log" Apr 23 16:49:54.773250 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:54.773231 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wd2cz_52988e90-484a-49cd-98f6-5510a28890d6/ovn-acl-logging/0.log" Apr 23 16:49:56.496430 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:49:56.496380 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" podUID="b2697a8c-e60d-45fb-b91b-c7b9a513dbc3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 23 16:50:06.497278 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:06.497209 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" podUID="b2697a8c-e60d-45fb-b91b-c7b9a513dbc3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 23 16:50:16.497237 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:16.497199 2573 prober.go:120] "Probe failed" probeType="Readiness" 
pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" podUID="b2697a8c-e60d-45fb-b91b-c7b9a513dbc3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 23 16:50:26.496531 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:26.496490 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" podUID="b2697a8c-e60d-45fb-b91b-c7b9a513dbc3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 23 16:50:36.496927 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:36.496886 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" podUID="b2697a8c-e60d-45fb-b91b-c7b9a513dbc3" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.25:8080: connect: connection refused" Apr 23 16:50:46.497304 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:46.497275 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" Apr 23 16:50:50.535298 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:50.535265 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4"] Apr 23 16:50:50.535637 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:50.535527 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerName="kserve-container" Apr 23 16:50:50.535637 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:50.535537 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerName="kserve-container" Apr 23 16:50:50.535637 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:50.535553 2573 cpu_manager.go:401] "RemoveStaleState: 
containerMap: removing container" podUID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerName="storage-initializer" Apr 23 16:50:50.535637 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:50.535560 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerName="storage-initializer" Apr 23 16:50:50.535637 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:50.535607 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b27ec6bc-c9cf-45cb-ae79-ed4801163827" containerName="kserve-container" Apr 23 16:50:50.538503 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:50.538488 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4" Apr 23 16:50:50.540211 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:50.540181 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"odh-kserve-custom-ca-bundle\"" Apr 23 16:50:50.540346 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:50.540238 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-sa-b6e5e0-dockercfg-ltkrv\"" Apr 23 16:50:50.540346 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:50.540181 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"invalid-s3-secret-b6e5e0\"" Apr 23 16:50:50.547799 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:50.547767 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4"] Apr 23 16:50:50.674513 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:50.674477 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f65714fc-9280-4a9e-bf2d-7d08cf176a77-kserve-provision-location\") pod 
\"isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4\" (UID: \"f65714fc-9280-4a9e-bf2d-7d08cf176a77\") " pod="kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4" Apr 23 16:50:50.674640 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:50.674530 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f65714fc-9280-4a9e-bf2d-7d08cf176a77-cabundle-cert\") pod \"isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4\" (UID: \"f65714fc-9280-4a9e-bf2d-7d08cf176a77\") " pod="kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4" Apr 23 16:50:50.775205 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:50.775178 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f65714fc-9280-4a9e-bf2d-7d08cf176a77-kserve-provision-location\") pod \"isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4\" (UID: \"f65714fc-9280-4a9e-bf2d-7d08cf176a77\") " pod="kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4" Apr 23 16:50:50.775328 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:50.775228 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f65714fc-9280-4a9e-bf2d-7d08cf176a77-cabundle-cert\") pod \"isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4\" (UID: \"f65714fc-9280-4a9e-bf2d-7d08cf176a77\") " pod="kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4" Apr 23 16:50:50.775624 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:50.775604 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f65714fc-9280-4a9e-bf2d-7d08cf176a77-kserve-provision-location\") pod \"isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4\" (UID: \"f65714fc-9280-4a9e-bf2d-7d08cf176a77\") " 
pod="kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4" Apr 23 16:50:50.775882 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:50.775866 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f65714fc-9280-4a9e-bf2d-7d08cf176a77-cabundle-cert\") pod \"isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4\" (UID: \"f65714fc-9280-4a9e-bf2d-7d08cf176a77\") " pod="kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4" Apr 23 16:50:50.848353 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:50.848311 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4" Apr 23 16:50:50.963555 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:50.963531 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4"] Apr 23 16:50:50.966188 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:50:50.966150 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf65714fc_9280_4a9e_bf2d_7d08cf176a77.slice/crio-4f3af6ee8f737e16a1e9e3288145219c6a2d286ebcd8791180c810032cb392f6 WatchSource:0}: Error finding container 4f3af6ee8f737e16a1e9e3288145219c6a2d286ebcd8791180c810032cb392f6: Status 404 returned error can't find the container with id 4f3af6ee8f737e16a1e9e3288145219c6a2d286ebcd8791180c810032cb392f6 Apr 23 16:50:50.967922 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:50.967906 2573 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 16:50:51.703908 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:51.703872 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4" 
event={"ID":"f65714fc-9280-4a9e-bf2d-7d08cf176a77","Type":"ContainerStarted","Data":"cca49a87b8f68211731673e838f473641c6678cbd7bde0159eca72a8757818ec"} Apr 23 16:50:51.703908 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:51.703909 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4" event={"ID":"f65714fc-9280-4a9e-bf2d-7d08cf176a77","Type":"ContainerStarted","Data":"4f3af6ee8f737e16a1e9e3288145219c6a2d286ebcd8791180c810032cb392f6"} Apr 23 16:50:54.713985 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:54.713960 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4_f65714fc-9280-4a9e-bf2d-7d08cf176a77/storage-initializer/0.log" Apr 23 16:50:54.714334 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:54.713996 2573 generic.go:358] "Generic (PLEG): container finished" podID="f65714fc-9280-4a9e-bf2d-7d08cf176a77" containerID="cca49a87b8f68211731673e838f473641c6678cbd7bde0159eca72a8757818ec" exitCode=1 Apr 23 16:50:54.714334 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:54.714048 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4" event={"ID":"f65714fc-9280-4a9e-bf2d-7d08cf176a77","Type":"ContainerDied","Data":"cca49a87b8f68211731673e838f473641c6678cbd7bde0159eca72a8757818ec"} Apr 23 16:50:55.718533 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:55.718506 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4_f65714fc-9280-4a9e-bf2d-7d08cf176a77/storage-initializer/0.log" Apr 23 16:50:55.718939 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:55.718612 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4" 
event={"ID":"f65714fc-9280-4a9e-bf2d-7d08cf176a77","Type":"ContainerStarted","Data":"d44883948ed5552dc379d8cea15a0725df68287b2e19da3e90db0ec44a1e19d3"} Apr 23 16:50:59.730276 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:59.730256 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4_f65714fc-9280-4a9e-bf2d-7d08cf176a77/storage-initializer/1.log" Apr 23 16:50:59.730596 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:59.730580 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4_f65714fc-9280-4a9e-bf2d-7d08cf176a77/storage-initializer/0.log" Apr 23 16:50:59.730643 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:59.730614 2573 generic.go:358] "Generic (PLEG): container finished" podID="f65714fc-9280-4a9e-bf2d-7d08cf176a77" containerID="d44883948ed5552dc379d8cea15a0725df68287b2e19da3e90db0ec44a1e19d3" exitCode=1 Apr 23 16:50:59.730685 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:59.730670 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4" event={"ID":"f65714fc-9280-4a9e-bf2d-7d08cf176a77","Type":"ContainerDied","Data":"d44883948ed5552dc379d8cea15a0725df68287b2e19da3e90db0ec44a1e19d3"} Apr 23 16:50:59.730752 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:59.730717 2573 scope.go:117] "RemoveContainer" containerID="cca49a87b8f68211731673e838f473641c6678cbd7bde0159eca72a8757818ec" Apr 23 16:50:59.731112 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:50:59.731093 2573 scope.go:117] "RemoveContainer" containerID="cca49a87b8f68211731673e838f473641c6678cbd7bde0159eca72a8757818ec" Apr 23 16:50:59.773949 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:50:59.773902 2573 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container 
k8s_storage-initializer_isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4_kserve-ci-e2e-test_f65714fc-9280-4a9e-bf2d-7d08cf176a77_0 in pod sandbox 4f3af6ee8f737e16a1e9e3288145219c6a2d286ebcd8791180c810032cb392f6 from index: no such id: 'cca49a87b8f68211731673e838f473641c6678cbd7bde0159eca72a8757818ec'" containerID="cca49a87b8f68211731673e838f473641c6678cbd7bde0159eca72a8757818ec" Apr 23 16:50:59.774047 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:50:59.773988 2573 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4_kserve-ci-e2e-test_f65714fc-9280-4a9e-bf2d-7d08cf176a77_0 in pod sandbox 4f3af6ee8f737e16a1e9e3288145219c6a2d286ebcd8791180c810032cb392f6 from index: no such id: 'cca49a87b8f68211731673e838f473641c6678cbd7bde0159eca72a8757818ec'; Skipping pod \"isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4_kserve-ci-e2e-test(f65714fc-9280-4a9e-bf2d-7d08cf176a77)\"" logger="UnhandledError" Apr 23 16:50:59.775675 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:50:59.775460 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4_kserve-ci-e2e-test(f65714fc-9280-4a9e-bf2d-7d08cf176a77)\"" pod="kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4" podUID="f65714fc-9280-4a9e-bf2d-7d08cf176a77" Apr 23 16:51:00.734576 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:00.734551 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4_f65714fc-9280-4a9e-bf2d-7d08cf176a77/storage-initializer/1.log" Apr 23 16:51:06.612724 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.612633 2573 kubelet.go:2553] "SyncLoop 
DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4"] Apr 23 16:51:06.656668 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.656618 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt"] Apr 23 16:51:06.657135 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.657085 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" podUID="b2697a8c-e60d-45fb-b91b-c7b9a513dbc3" containerName="kserve-container" containerID="cri-o://04ed2e71e7de5180f4cee887fec6d5b72ca96e86ffae1f74c61edf8f2d29494b" gracePeriod=30 Apr 23 16:51:06.738721 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.738680 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f"] Apr 23 16:51:06.749427 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.749403 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f" Apr 23 16:51:06.751005 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.750980 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f"] Apr 23 16:51:06.751415 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.751395 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-secret-865db3\"" Apr 23 16:51:06.751415 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.751407 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"fail-s3-sa-865db3-dockercfg-jkb2x\"" Apr 23 16:51:06.752332 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.752312 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4_f65714fc-9280-4a9e-bf2d-7d08cf176a77/storage-initializer/1.log" Apr 23 16:51:06.752435 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.752380 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4" Apr 23 16:51:06.753118 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.753097 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4_f65714fc-9280-4a9e-bf2d-7d08cf176a77/storage-initializer/1.log" Apr 23 16:51:06.753222 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.753176 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4" event={"ID":"f65714fc-9280-4a9e-bf2d-7d08cf176a77","Type":"ContainerDied","Data":"4f3af6ee8f737e16a1e9e3288145219c6a2d286ebcd8791180c810032cb392f6"} Apr 23 16:51:06.753222 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.753208 2573 scope.go:117] "RemoveContainer" containerID="d44883948ed5552dc379d8cea15a0725df68287b2e19da3e90db0ec44a1e19d3" Apr 23 16:51:06.890595 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.890543 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f65714fc-9280-4a9e-bf2d-7d08cf176a77-kserve-provision-location\") pod \"f65714fc-9280-4a9e-bf2d-7d08cf176a77\" (UID: \"f65714fc-9280-4a9e-bf2d-7d08cf176a77\") " Apr 23 16:51:06.890708 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.890597 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f65714fc-9280-4a9e-bf2d-7d08cf176a77-cabundle-cert\") pod \"f65714fc-9280-4a9e-bf2d-7d08cf176a77\" (UID: \"f65714fc-9280-4a9e-bf2d-7d08cf176a77\") " Apr 23 16:51:06.890752 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.890740 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/91e77cd7-1db7-4865-8072-6c16017a09d3-kserve-provision-location\") pod \"isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f\" (UID: \"91e77cd7-1db7-4865-8072-6c16017a09d3\") " pod="kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f" Apr 23 16:51:06.890788 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.890771 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/91e77cd7-1db7-4865-8072-6c16017a09d3-cabundle-cert\") pod \"isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f\" (UID: \"91e77cd7-1db7-4865-8072-6c16017a09d3\") " pod="kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f" Apr 23 16:51:06.890844 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.890786 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f65714fc-9280-4a9e-bf2d-7d08cf176a77-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f65714fc-9280-4a9e-bf2d-7d08cf176a77" (UID: "f65714fc-9280-4a9e-bf2d-7d08cf176a77"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:06.890931 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.890915 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f65714fc-9280-4a9e-bf2d-7d08cf176a77-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "f65714fc-9280-4a9e-bf2d-7d08cf176a77" (UID: "f65714fc-9280-4a9e-bf2d-7d08cf176a77"). InnerVolumeSpecName "cabundle-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 16:51:06.991666 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.991635 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91e77cd7-1db7-4865-8072-6c16017a09d3-kserve-provision-location\") pod \"isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f\" (UID: \"91e77cd7-1db7-4865-8072-6c16017a09d3\") " pod="kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f" Apr 23 16:51:06.991798 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.991677 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/91e77cd7-1db7-4865-8072-6c16017a09d3-cabundle-cert\") pod \"isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f\" (UID: \"91e77cd7-1db7-4865-8072-6c16017a09d3\") " pod="kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f" Apr 23 16:51:06.991798 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.991724 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f65714fc-9280-4a9e-bf2d-7d08cf176a77-kserve-provision-location\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:51:06.991798 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.991734 2573 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/f65714fc-9280-4a9e-bf2d-7d08cf176a77-cabundle-cert\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:51:06.991990 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.991972 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91e77cd7-1db7-4865-8072-6c16017a09d3-kserve-provision-location\") pod \"isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f\" (UID: 
\"91e77cd7-1db7-4865-8072-6c16017a09d3\") " pod="kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f" Apr 23 16:51:06.992207 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:06.992192 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/91e77cd7-1db7-4865-8072-6c16017a09d3-cabundle-cert\") pod \"isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f\" (UID: \"91e77cd7-1db7-4865-8072-6c16017a09d3\") " pod="kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f" Apr 23 16:51:07.060884 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:07.060856 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f" Apr 23 16:51:07.173790 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:07.173762 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f"] Apr 23 16:51:07.176643 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:51:07.176605 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91e77cd7_1db7_4865_8072_6c16017a09d3.slice/crio-98dd7c3433c4a37833613e485b2f61747f6c7ea443dea57cadb438a89044de31 WatchSource:0}: Error finding container 98dd7c3433c4a37833613e485b2f61747f6c7ea443dea57cadb438a89044de31: Status 404 returned error can't find the container with id 98dd7c3433c4a37833613e485b2f61747f6c7ea443dea57cadb438a89044de31 Apr 23 16:51:07.756820 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:07.756792 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4" Apr 23 16:51:07.758235 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:07.758204 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f" event={"ID":"91e77cd7-1db7-4865-8072-6c16017a09d3","Type":"ContainerStarted","Data":"67140eba5c15194421b9859eb7f5602a3a898dc5bd9ee960bae062a3497fe94e"} Apr 23 16:51:07.758344 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:07.758243 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f" event={"ID":"91e77cd7-1db7-4865-8072-6c16017a09d3","Type":"ContainerStarted","Data":"98dd7c3433c4a37833613e485b2f61747f6c7ea443dea57cadb438a89044de31"} Apr 23 16:51:07.796920 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:07.796892 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4"] Apr 23 16:51:07.800667 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:07.800644 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-secondary-b6e5e0-predictor-f4bdc6dfd-smvq4"] Apr 23 16:51:08.879360 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:08.879322 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f65714fc-9280-4a9e-bf2d-7d08cf176a77" path="/var/lib/kubelet/pods/f65714fc-9280-4a9e-bf2d-7d08cf176a77/volumes" Apr 23 16:51:10.194095 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:10.194074 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" Apr 23 16:51:10.318233 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:10.318176 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2697a8c-e60d-45fb-b91b-c7b9a513dbc3-kserve-provision-location\") pod \"b2697a8c-e60d-45fb-b91b-c7b9a513dbc3\" (UID: \"b2697a8c-e60d-45fb-b91b-c7b9a513dbc3\") " Apr 23 16:51:10.318480 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:10.318457 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2697a8c-e60d-45fb-b91b-c7b9a513dbc3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b2697a8c-e60d-45fb-b91b-c7b9a513dbc3" (UID: "b2697a8c-e60d-45fb-b91b-c7b9a513dbc3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:51:10.419282 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:10.419260 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2697a8c-e60d-45fb-b91b-c7b9a513dbc3-kserve-provision-location\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:51:10.774733 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:10.774686 2573 generic.go:358] "Generic (PLEG): container finished" podID="b2697a8c-e60d-45fb-b91b-c7b9a513dbc3" containerID="04ed2e71e7de5180f4cee887fec6d5b72ca96e86ffae1f74c61edf8f2d29494b" exitCode=0 Apr 23 16:51:10.774845 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:10.774741 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" event={"ID":"b2697a8c-e60d-45fb-b91b-c7b9a513dbc3","Type":"ContainerDied","Data":"04ed2e71e7de5180f4cee887fec6d5b72ca96e86ffae1f74c61edf8f2d29494b"} Apr 23 16:51:10.774845 ip-10-0-137-14 
kubenswrapper[2573]: I0423 16:51:10.774773 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" event={"ID":"b2697a8c-e60d-45fb-b91b-c7b9a513dbc3","Type":"ContainerDied","Data":"e881170b1daae08365d847942c85752498bf0e6eaa585ff35e5f213c634506da"} Apr 23 16:51:10.774845 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:10.774796 2573 scope.go:117] "RemoveContainer" containerID="04ed2e71e7de5180f4cee887fec6d5b72ca96e86ffae1f74c61edf8f2d29494b" Apr 23 16:51:10.774845 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:10.774798 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt" Apr 23 16:51:10.782347 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:10.782328 2573 scope.go:117] "RemoveContainer" containerID="826c77626e4af399ef91aba65cb2668a807c80ba296b37bc4e00bf8dd0005ab7" Apr 23 16:51:10.789016 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:10.789001 2573 scope.go:117] "RemoveContainer" containerID="04ed2e71e7de5180f4cee887fec6d5b72ca96e86ffae1f74c61edf8f2d29494b" Apr 23 16:51:10.789259 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:51:10.789239 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04ed2e71e7de5180f4cee887fec6d5b72ca96e86ffae1f74c61edf8f2d29494b\": container with ID starting with 04ed2e71e7de5180f4cee887fec6d5b72ca96e86ffae1f74c61edf8f2d29494b not found: ID does not exist" containerID="04ed2e71e7de5180f4cee887fec6d5b72ca96e86ffae1f74c61edf8f2d29494b" Apr 23 16:51:10.789337 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:10.789264 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04ed2e71e7de5180f4cee887fec6d5b72ca96e86ffae1f74c61edf8f2d29494b"} err="failed to get container status \"04ed2e71e7de5180f4cee887fec6d5b72ca96e86ffae1f74c61edf8f2d29494b\": rpc error: code 
= NotFound desc = could not find container \"04ed2e71e7de5180f4cee887fec6d5b72ca96e86ffae1f74c61edf8f2d29494b\": container with ID starting with 04ed2e71e7de5180f4cee887fec6d5b72ca96e86ffae1f74c61edf8f2d29494b not found: ID does not exist" Apr 23 16:51:10.789337 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:10.789279 2573 scope.go:117] "RemoveContainer" containerID="826c77626e4af399ef91aba65cb2668a807c80ba296b37bc4e00bf8dd0005ab7" Apr 23 16:51:10.789506 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:51:10.789491 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"826c77626e4af399ef91aba65cb2668a807c80ba296b37bc4e00bf8dd0005ab7\": container with ID starting with 826c77626e4af399ef91aba65cb2668a807c80ba296b37bc4e00bf8dd0005ab7 not found: ID does not exist" containerID="826c77626e4af399ef91aba65cb2668a807c80ba296b37bc4e00bf8dd0005ab7" Apr 23 16:51:10.789546 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:10.789508 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"826c77626e4af399ef91aba65cb2668a807c80ba296b37bc4e00bf8dd0005ab7"} err="failed to get container status \"826c77626e4af399ef91aba65cb2668a807c80ba296b37bc4e00bf8dd0005ab7\": rpc error: code = NotFound desc = could not find container \"826c77626e4af399ef91aba65cb2668a807c80ba296b37bc4e00bf8dd0005ab7\": container with ID starting with 826c77626e4af399ef91aba65cb2668a807c80ba296b37bc4e00bf8dd0005ab7 not found: ID does not exist" Apr 23 16:51:10.793160 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:10.793138 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt"] Apr 23 16:51:10.796566 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:10.796546 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-primary-b6e5e0-predictor-5d7b47d467-7zxxt"] Apr 23 16:51:10.879208 ip-10-0-137-14 
kubenswrapper[2573]: I0423 16:51:10.879186 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2697a8c-e60d-45fb-b91b-c7b9a513dbc3" path="/var/lib/kubelet/pods/b2697a8c-e60d-45fb-b91b-c7b9a513dbc3/volumes" Apr 23 16:51:12.782046 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:12.781991 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f_91e77cd7-1db7-4865-8072-6c16017a09d3/storage-initializer/0.log" Apr 23 16:51:12.782046 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:12.782024 2573 generic.go:358] "Generic (PLEG): container finished" podID="91e77cd7-1db7-4865-8072-6c16017a09d3" containerID="67140eba5c15194421b9859eb7f5602a3a898dc5bd9ee960bae062a3497fe94e" exitCode=1 Apr 23 16:51:12.782399 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:12.782060 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f" event={"ID":"91e77cd7-1db7-4865-8072-6c16017a09d3","Type":"ContainerDied","Data":"67140eba5c15194421b9859eb7f5602a3a898dc5bd9ee960bae062a3497fe94e"} Apr 23 16:51:13.786095 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:13.786068 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f_91e77cd7-1db7-4865-8072-6c16017a09d3/storage-initializer/0.log" Apr 23 16:51:13.786449 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:13.786150 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f" event={"ID":"91e77cd7-1db7-4865-8072-6c16017a09d3","Type":"ContainerStarted","Data":"a7610c0be147221fa6547db6aca301c7fd446cb4abd0259883d1039aacd9f4b6"} Apr 23 16:51:15.791973 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:15.791951 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f_91e77cd7-1db7-4865-8072-6c16017a09d3/storage-initializer/1.log" Apr 23 16:51:15.792290 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:15.792274 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f_91e77cd7-1db7-4865-8072-6c16017a09d3/storage-initializer/0.log" Apr 23 16:51:15.792330 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:15.792309 2573 generic.go:358] "Generic (PLEG): container finished" podID="91e77cd7-1db7-4865-8072-6c16017a09d3" containerID="a7610c0be147221fa6547db6aca301c7fd446cb4abd0259883d1039aacd9f4b6" exitCode=1 Apr 23 16:51:15.792375 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:15.792361 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f" event={"ID":"91e77cd7-1db7-4865-8072-6c16017a09d3","Type":"ContainerDied","Data":"a7610c0be147221fa6547db6aca301c7fd446cb4abd0259883d1039aacd9f4b6"} Apr 23 16:51:15.792406 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:15.792395 2573 scope.go:117] "RemoveContainer" containerID="67140eba5c15194421b9859eb7f5602a3a898dc5bd9ee960bae062a3497fe94e" Apr 23 16:51:15.792743 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:15.792727 2573 scope.go:117] "RemoveContainer" containerID="67140eba5c15194421b9859eb7f5602a3a898dc5bd9ee960bae062a3497fe94e" Apr 23 16:51:15.801809 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:51:15.801779 2573 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f_kserve-ci-e2e-test_91e77cd7-1db7-4865-8072-6c16017a09d3_0 in pod sandbox 98dd7c3433c4a37833613e485b2f61747f6c7ea443dea57cadb438a89044de31 from index: no such id: '67140eba5c15194421b9859eb7f5602a3a898dc5bd9ee960bae062a3497fe94e'" 
containerID="67140eba5c15194421b9859eb7f5602a3a898dc5bd9ee960bae062a3497fe94e" Apr 23 16:51:15.801872 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:15.801818 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67140eba5c15194421b9859eb7f5602a3a898dc5bd9ee960bae062a3497fe94e"} err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f_kserve-ci-e2e-test_91e77cd7-1db7-4865-8072-6c16017a09d3_0 in pod sandbox 98dd7c3433c4a37833613e485b2f61747f6c7ea443dea57cadb438a89044de31 from index: no such id: '67140eba5c15194421b9859eb7f5602a3a898dc5bd9ee960bae062a3497fe94e'" Apr 23 16:51:15.801975 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:51:15.801959 2573 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f_kserve-ci-e2e-test(91e77cd7-1db7-4865-8072-6c16017a09d3)\"" pod="kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f" podUID="91e77cd7-1db7-4865-8072-6c16017a09d3" Apr 23 16:51:16.752071 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:16.752034 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f"] Apr 23 16:51:16.796026 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:16.796005 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f_91e77cd7-1db7-4865-8072-6c16017a09d3/storage-initializer/1.log" Apr 23 16:51:16.885019 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:16.884992 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9"] Apr 23 16:51:16.885313 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:16.885299 2573 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2697a8c-e60d-45fb-b91b-c7b9a513dbc3" containerName="storage-initializer" Apr 23 16:51:16.885359 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:16.885315 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2697a8c-e60d-45fb-b91b-c7b9a513dbc3" containerName="storage-initializer" Apr 23 16:51:16.885359 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:16.885334 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2697a8c-e60d-45fb-b91b-c7b9a513dbc3" containerName="kserve-container" Apr 23 16:51:16.885359 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:16.885339 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2697a8c-e60d-45fb-b91b-c7b9a513dbc3" containerName="kserve-container" Apr 23 16:51:16.885359 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:16.885348 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f65714fc-9280-4a9e-bf2d-7d08cf176a77" containerName="storage-initializer" Apr 23 16:51:16.885359 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:16.885357 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f65714fc-9280-4a9e-bf2d-7d08cf176a77" containerName="storage-initializer" Apr 23 16:51:16.885504 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:16.885366 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f65714fc-9280-4a9e-bf2d-7d08cf176a77" containerName="storage-initializer" Apr 23 16:51:16.885504 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:16.885372 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="f65714fc-9280-4a9e-bf2d-7d08cf176a77" containerName="storage-initializer" Apr 23 16:51:16.885504 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:16.885409 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f65714fc-9280-4a9e-bf2d-7d08cf176a77" containerName="storage-initializer" Apr 23 16:51:16.885504 ip-10-0-137-14 
kubenswrapper[2573]: I0423 16:51:16.885419 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="b2697a8c-e60d-45fb-b91b-c7b9a513dbc3" containerName="kserve-container"
Apr 23 16:51:16.885504 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:16.885502 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="f65714fc-9280-4a9e-bf2d-7d08cf176a77" containerName="storage-initializer"
Apr 23 16:51:16.889385 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:16.889370 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9"
Apr 23 16:51:16.891163 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:16.891145 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-p7xp4\""
Apr 23 16:51:16.894492 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:16.894470 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9"]
Apr 23 16:51:16.920166 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:16.920149 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f_91e77cd7-1db7-4865-8072-6c16017a09d3/storage-initializer/1.log"
Apr 23 16:51:16.920235 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:16.920199 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f"
Apr 23 16:51:17.069876 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:17.069813 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/91e77cd7-1db7-4865-8072-6c16017a09d3-cabundle-cert\") pod \"91e77cd7-1db7-4865-8072-6c16017a09d3\" (UID: \"91e77cd7-1db7-4865-8072-6c16017a09d3\") "
Apr 23 16:51:17.069876 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:17.069860 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91e77cd7-1db7-4865-8072-6c16017a09d3-kserve-provision-location\") pod \"91e77cd7-1db7-4865-8072-6c16017a09d3\" (UID: \"91e77cd7-1db7-4865-8072-6c16017a09d3\") "
Apr 23 16:51:17.070029 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:17.069979 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84fb6b2b-4b51-4318-a257-c9de319652b1-kserve-provision-location\") pod \"raw-sklearn-5a19e-predictor-565b5c779d-9njq9\" (UID: \"84fb6b2b-4b51-4318-a257-c9de319652b1\") " pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9"
Apr 23 16:51:17.070131 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:17.070111 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/91e77cd7-1db7-4865-8072-6c16017a09d3-cabundle-cert" (OuterVolumeSpecName: "cabundle-cert") pod "91e77cd7-1db7-4865-8072-6c16017a09d3" (UID: "91e77cd7-1db7-4865-8072-6c16017a09d3"). InnerVolumeSpecName "cabundle-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 23 16:51:17.070175 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:17.070150 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/91e77cd7-1db7-4865-8072-6c16017a09d3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "91e77cd7-1db7-4865-8072-6c16017a09d3" (UID: "91e77cd7-1db7-4865-8072-6c16017a09d3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:51:17.170617 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:17.170583 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84fb6b2b-4b51-4318-a257-c9de319652b1-kserve-provision-location\") pod \"raw-sklearn-5a19e-predictor-565b5c779d-9njq9\" (UID: \"84fb6b2b-4b51-4318-a257-c9de319652b1\") " pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9"
Apr 23 16:51:17.170753 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:17.170724 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/91e77cd7-1db7-4865-8072-6c16017a09d3-kserve-provision-location\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\""
Apr 23 16:51:17.170753 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:17.170744 2573 reconciler_common.go:299] "Volume detached for volume \"cabundle-cert\" (UniqueName: \"kubernetes.io/configmap/91e77cd7-1db7-4865-8072-6c16017a09d3-cabundle-cert\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\""
Apr 23 16:51:17.170932 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:17.170915 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84fb6b2b-4b51-4318-a257-c9de319652b1-kserve-provision-location\") pod \"raw-sklearn-5a19e-predictor-565b5c779d-9njq9\" (UID: \"84fb6b2b-4b51-4318-a257-c9de319652b1\") " pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9"
Apr 23 16:51:17.201375 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:17.201350 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9"
Apr 23 16:51:17.318204 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:17.318100 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9"]
Apr 23 16:51:17.320204 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:51:17.320137 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84fb6b2b_4b51_4318_a257_c9de319652b1.slice/crio-ce6b7c39afaeb26cc98ac389f4e11fa3ff14d5590b271ffd46e588e3aad08b58 WatchSource:0}: Error finding container ce6b7c39afaeb26cc98ac389f4e11fa3ff14d5590b271ffd46e588e3aad08b58: Status 404 returned error can't find the container with id ce6b7c39afaeb26cc98ac389f4e11fa3ff14d5590b271ffd46e588e3aad08b58
Apr 23 16:51:17.800711 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:17.800666 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9" event={"ID":"84fb6b2b-4b51-4318-a257-c9de319652b1","Type":"ContainerStarted","Data":"a732f7b9dda8939916d42aed80a80f4a4d087313392fa48246791d14138d4412"}
Apr 23 16:51:17.801167 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:17.800722 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9" event={"ID":"84fb6b2b-4b51-4318-a257-c9de319652b1","Type":"ContainerStarted","Data":"ce6b7c39afaeb26cc98ac389f4e11fa3ff14d5590b271ffd46e588e3aad08b58"}
Apr 23 16:51:17.801878 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:17.801852 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f_91e77cd7-1db7-4865-8072-6c16017a09d3/storage-initializer/1.log"
Apr 23 16:51:17.802003 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:17.801919 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f" event={"ID":"91e77cd7-1db7-4865-8072-6c16017a09d3","Type":"ContainerDied","Data":"98dd7c3433c4a37833613e485b2f61747f6c7ea443dea57cadb438a89044de31"}
Apr 23 16:51:17.802003 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:17.801956 2573 scope.go:117] "RemoveContainer" containerID="a7610c0be147221fa6547db6aca301c7fd446cb4abd0259883d1039aacd9f4b6"
Apr 23 16:51:17.802003 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:17.801962 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f"
Apr 23 16:51:17.838007 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:17.837982 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f"]
Apr 23 16:51:17.841543 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:17.841524 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/isvc-init-fail-865db3-predictor-6d9b8cf984-sf96f"]
Apr 23 16:51:18.879676 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:18.879646 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="91e77cd7-1db7-4865-8072-6c16017a09d3" path="/var/lib/kubelet/pods/91e77cd7-1db7-4865-8072-6c16017a09d3/volumes"
Apr 23 16:51:21.816284 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:21.816251 2573 generic.go:358] "Generic (PLEG): container finished" podID="84fb6b2b-4b51-4318-a257-c9de319652b1" containerID="a732f7b9dda8939916d42aed80a80f4a4d087313392fa48246791d14138d4412" exitCode=0
Apr 23 16:51:21.816640 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:21.816310 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9" event={"ID":"84fb6b2b-4b51-4318-a257-c9de319652b1","Type":"ContainerDied","Data":"a732f7b9dda8939916d42aed80a80f4a4d087313392fa48246791d14138d4412"}
Apr 23 16:51:22.820584 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:22.820549 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9" event={"ID":"84fb6b2b-4b51-4318-a257-c9de319652b1","Type":"ContainerStarted","Data":"0afbc2b4c11e2dfb62c95d2af605c130d57986338e71a103da5eb076f7ba6cc2"}
Apr 23 16:51:22.821006 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:22.820832 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9"
Apr 23 16:51:22.821911 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:22.821888 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9" podUID="84fb6b2b-4b51-4318-a257-c9de319652b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 23 16:51:22.835564 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:22.835527 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9" podStartSLOduration=6.835513062 podStartE2EDuration="6.835513062s" podCreationTimestamp="2026-04-23 16:51:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:51:22.834975456 +0000 UTC m=+988.524761803" watchObservedRunningTime="2026-04-23 16:51:22.835513062 +0000 UTC m=+988.525299408"
Apr 23 16:51:23.823906 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:23.823870 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9" podUID="84fb6b2b-4b51-4318-a257-c9de319652b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 23 16:51:33.824672 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:33.824634 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9" podUID="84fb6b2b-4b51-4318-a257-c9de319652b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 23 16:51:43.824769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:43.824726 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9" podUID="84fb6b2b-4b51-4318-a257-c9de319652b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 23 16:51:53.824810 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:51:53.824764 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9" podUID="84fb6b2b-4b51-4318-a257-c9de319652b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 23 16:52:03.824440 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:03.824403 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9" podUID="84fb6b2b-4b51-4318-a257-c9de319652b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 23 16:52:13.824213 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:13.824170 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9" podUID="84fb6b2b-4b51-4318-a257-c9de319652b1" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.28:8080: connect: connection refused"
Apr 23 16:52:23.824876 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:23.824843 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9"
Apr 23 16:52:26.995835 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:26.995794 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9"]
Apr 23 16:52:26.996407 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:26.996017 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9" podUID="84fb6b2b-4b51-4318-a257-c9de319652b1" containerName="kserve-container" containerID="cri-o://0afbc2b4c11e2dfb62c95d2af605c130d57986338e71a103da5eb076f7ba6cc2" gracePeriod=30
Apr 23 16:52:27.041512 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:27.041466 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk"]
Apr 23 16:52:27.041776 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:27.041764 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91e77cd7-1db7-4865-8072-6c16017a09d3" containerName="storage-initializer"
Apr 23 16:52:27.041823 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:27.041778 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e77cd7-1db7-4865-8072-6c16017a09d3" containerName="storage-initializer"
Apr 23 16:52:27.041823 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:27.041789 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="91e77cd7-1db7-4865-8072-6c16017a09d3" containerName="storage-initializer"
Apr 23 16:52:27.041823 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:27.041794 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="91e77cd7-1db7-4865-8072-6c16017a09d3" containerName="storage-initializer"
Apr 23 16:52:27.041915 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:27.041840 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="91e77cd7-1db7-4865-8072-6c16017a09d3" containerName="storage-initializer"
Apr 23 16:52:27.041947 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:27.041932 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="91e77cd7-1db7-4865-8072-6c16017a09d3" containerName="storage-initializer"
Apr 23 16:52:27.044804 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:27.044790 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk"
Apr 23 16:52:27.052012 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:27.051990 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk"]
Apr 23 16:52:27.151228 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:27.151199 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ab9610f-f734-4328-b7d1-be10fe184a7b-kserve-provision-location\") pod \"raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk\" (UID: \"5ab9610f-f734-4328-b7d1-be10fe184a7b\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk"
Apr 23 16:52:27.252483 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:27.252423 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ab9610f-f734-4328-b7d1-be10fe184a7b-kserve-provision-location\") pod \"raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk\" (UID: \"5ab9610f-f734-4328-b7d1-be10fe184a7b\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk"
Apr 23 16:52:27.252771 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:27.252755 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ab9610f-f734-4328-b7d1-be10fe184a7b-kserve-provision-location\") pod \"raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk\" (UID: \"5ab9610f-f734-4328-b7d1-be10fe184a7b\") " pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk"
Apr 23 16:52:27.354867 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:27.354844 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk"
Apr 23 16:52:27.469357 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:27.469326 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk"]
Apr 23 16:52:27.473175 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:52:27.473149 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ab9610f_f734_4328_b7d1_be10fe184a7b.slice/crio-9ae47749c91f3c557b11ff527a32d6a1053eab54c7aa6687b7504436f3f7e0d3 WatchSource:0}: Error finding container 9ae47749c91f3c557b11ff527a32d6a1053eab54c7aa6687b7504436f3f7e0d3: Status 404 returned error can't find the container with id 9ae47749c91f3c557b11ff527a32d6a1053eab54c7aa6687b7504436f3f7e0d3
Apr 23 16:52:27.996920 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:27.996884 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk" event={"ID":"5ab9610f-f734-4328-b7d1-be10fe184a7b","Type":"ContainerStarted","Data":"afdf245da4d671857ef9a6f0dc221964a23e099ffc3033baca0af80b245a921d"}
Apr 23 16:52:27.996920 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:27.996920 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk" event={"ID":"5ab9610f-f734-4328-b7d1-be10fe184a7b","Type":"ContainerStarted","Data":"9ae47749c91f3c557b11ff527a32d6a1053eab54c7aa6687b7504436f3f7e0d3"}
Apr 23 16:52:30.530662 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:30.530639 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9"
Apr 23 16:52:30.676762 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:30.676737 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84fb6b2b-4b51-4318-a257-c9de319652b1-kserve-provision-location\") pod \"84fb6b2b-4b51-4318-a257-c9de319652b1\" (UID: \"84fb6b2b-4b51-4318-a257-c9de319652b1\") "
Apr 23 16:52:30.677039 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:30.677018 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/84fb6b2b-4b51-4318-a257-c9de319652b1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "84fb6b2b-4b51-4318-a257-c9de319652b1" (UID: "84fb6b2b-4b51-4318-a257-c9de319652b1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:52:30.777810 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:30.777787 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/84fb6b2b-4b51-4318-a257-c9de319652b1-kserve-provision-location\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\""
Apr 23 16:52:31.008048 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:31.007971 2573 generic.go:358] "Generic (PLEG): container finished" podID="84fb6b2b-4b51-4318-a257-c9de319652b1" containerID="0afbc2b4c11e2dfb62c95d2af605c130d57986338e71a103da5eb076f7ba6cc2" exitCode=0
Apr 23 16:52:31.008048 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:31.008044 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9"
Apr 23 16:52:31.008201 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:31.008057 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9" event={"ID":"84fb6b2b-4b51-4318-a257-c9de319652b1","Type":"ContainerDied","Data":"0afbc2b4c11e2dfb62c95d2af605c130d57986338e71a103da5eb076f7ba6cc2"}
Apr 23 16:52:31.008201 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:31.008112 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9" event={"ID":"84fb6b2b-4b51-4318-a257-c9de319652b1","Type":"ContainerDied","Data":"ce6b7c39afaeb26cc98ac389f4e11fa3ff14d5590b271ffd46e588e3aad08b58"}
Apr 23 16:52:31.008201 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:31.008136 2573 scope.go:117] "RemoveContainer" containerID="0afbc2b4c11e2dfb62c95d2af605c130d57986338e71a103da5eb076f7ba6cc2"
Apr 23 16:52:31.017368 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:31.017349 2573 scope.go:117] "RemoveContainer" containerID="a732f7b9dda8939916d42aed80a80f4a4d087313392fa48246791d14138d4412"
Apr 23 16:52:31.022348 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:31.022320 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9"]
Apr 23 16:52:31.024976 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:31.024937 2573 scope.go:117] "RemoveContainer" containerID="0afbc2b4c11e2dfb62c95d2af605c130d57986338e71a103da5eb076f7ba6cc2"
Apr 23 16:52:31.025262 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:52:31.025238 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0afbc2b4c11e2dfb62c95d2af605c130d57986338e71a103da5eb076f7ba6cc2\": container with ID starting with 0afbc2b4c11e2dfb62c95d2af605c130d57986338e71a103da5eb076f7ba6cc2 not found: ID does not exist" containerID="0afbc2b4c11e2dfb62c95d2af605c130d57986338e71a103da5eb076f7ba6cc2"
Apr 23 16:52:31.025321 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:31.025270 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0afbc2b4c11e2dfb62c95d2af605c130d57986338e71a103da5eb076f7ba6cc2"} err="failed to get container status \"0afbc2b4c11e2dfb62c95d2af605c130d57986338e71a103da5eb076f7ba6cc2\": rpc error: code = NotFound desc = could not find container \"0afbc2b4c11e2dfb62c95d2af605c130d57986338e71a103da5eb076f7ba6cc2\": container with ID starting with 0afbc2b4c11e2dfb62c95d2af605c130d57986338e71a103da5eb076f7ba6cc2 not found: ID does not exist"
Apr 23 16:52:31.025321 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:31.025288 2573 scope.go:117] "RemoveContainer" containerID="a732f7b9dda8939916d42aed80a80f4a4d087313392fa48246791d14138d4412"
Apr 23 16:52:31.025514 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:52:31.025496 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a732f7b9dda8939916d42aed80a80f4a4d087313392fa48246791d14138d4412\": container with ID starting with a732f7b9dda8939916d42aed80a80f4a4d087313392fa48246791d14138d4412 not found: ID does not exist" containerID="a732f7b9dda8939916d42aed80a80f4a4d087313392fa48246791d14138d4412"
Apr 23 16:52:31.025569 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:31.025526 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a732f7b9dda8939916d42aed80a80f4a4d087313392fa48246791d14138d4412"} err="failed to get container status \"a732f7b9dda8939916d42aed80a80f4a4d087313392fa48246791d14138d4412\": rpc error: code = NotFound desc = could not find container \"a732f7b9dda8939916d42aed80a80f4a4d087313392fa48246791d14138d4412\": container with ID starting with a732f7b9dda8939916d42aed80a80f4a4d087313392fa48246791d14138d4412 not found: ID does not exist"
Apr 23 16:52:31.028177 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:31.028156 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-5a19e-predictor-565b5c779d-9njq9"]
Apr 23 16:52:32.013847 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:32.013806 2573 generic.go:358] "Generic (PLEG): container finished" podID="5ab9610f-f734-4328-b7d1-be10fe184a7b" containerID="afdf245da4d671857ef9a6f0dc221964a23e099ffc3033baca0af80b245a921d" exitCode=0
Apr 23 16:52:32.014248 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:32.013875 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk" event={"ID":"5ab9610f-f734-4328-b7d1-be10fe184a7b","Type":"ContainerDied","Data":"afdf245da4d671857ef9a6f0dc221964a23e099ffc3033baca0af80b245a921d"}
Apr 23 16:52:32.879089 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:32.879060 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84fb6b2b-4b51-4318-a257-c9de319652b1" path="/var/lib/kubelet/pods/84fb6b2b-4b51-4318-a257-c9de319652b1/volumes"
Apr 23 16:52:33.017470 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:33.017445 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk" event={"ID":"5ab9610f-f734-4328-b7d1-be10fe184a7b","Type":"ContainerStarted","Data":"85421d85838048b084e58f9994b1c2190697ab139f05428f0d9386003adfa250"}
Apr 23 16:52:33.017854 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:33.017722 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk"
Apr 23 16:52:33.018996 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:33.018972 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk" podUID="5ab9610f-f734-4328-b7d1-be10fe184a7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 23 16:52:33.032418 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:33.032378 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk" podStartSLOduration=6.03236558 podStartE2EDuration="6.03236558s" podCreationTimestamp="2026-04-23 16:52:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:52:33.031033174 +0000 UTC m=+1058.720819520" watchObservedRunningTime="2026-04-23 16:52:33.03236558 +0000 UTC m=+1058.722151926"
Apr 23 16:52:34.020538 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:34.020500 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk" podUID="5ab9610f-f734-4328-b7d1-be10fe184a7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 23 16:52:44.021119 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:44.021077 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk" podUID="5ab9610f-f734-4328-b7d1-be10fe184a7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 23 16:52:54.021553 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:52:54.021513 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk" podUID="5ab9610f-f734-4328-b7d1-be10fe184a7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 23 16:53:04.021265 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:04.021222 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk" podUID="5ab9610f-f734-4328-b7d1-be10fe184a7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 23 16:53:14.021024 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:14.020966 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk" podUID="5ab9610f-f734-4328-b7d1-be10fe184a7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 23 16:53:24.021103 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:24.021058 2573 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk" podUID="5ab9610f-f734-4328-b7d1-be10fe184a7b" containerName="kserve-container" probeResult="failure" output="dial tcp 10.134.0.29:8080: connect: connection refused"
Apr 23 16:53:34.021836 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:34.021808 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk"
Apr 23 16:53:37.184351 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:37.184319 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk"]
Apr 23 16:53:37.184788 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:37.184590 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk" podUID="5ab9610f-f734-4328-b7d1-be10fe184a7b" containerName="kserve-container" containerID="cri-o://85421d85838048b084e58f9994b1c2190697ab139f05428f0d9386003adfa250" gracePeriod=30
Apr 23 16:53:38.169419 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:38.169389 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-s92tk/must-gather-r6wjp"]
Apr 23 16:53:38.169703 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:38.169681 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84fb6b2b-4b51-4318-a257-c9de319652b1" containerName="kserve-container"
Apr 23 16:53:38.169753 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:38.169709 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="84fb6b2b-4b51-4318-a257-c9de319652b1" containerName="kserve-container"
Apr 23 16:53:38.169753 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:38.169732 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="84fb6b2b-4b51-4318-a257-c9de319652b1" containerName="storage-initializer"
Apr 23 16:53:38.169753 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:38.169738 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="84fb6b2b-4b51-4318-a257-c9de319652b1" containerName="storage-initializer"
Apr 23 16:53:38.169843 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:38.169776 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="84fb6b2b-4b51-4318-a257-c9de319652b1" containerName="kserve-container"
Apr 23 16:53:38.172680 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:38.172665 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s92tk/must-gather-r6wjp"
Apr 23 16:53:38.174762 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:38.174735 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-s92tk\"/\"openshift-service-ca.crt\""
Apr 23 16:53:38.174958 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:38.174944 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-s92tk\"/\"default-dockercfg-c2z7f\""
Apr 23 16:53:38.175180 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:38.175162 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-s92tk\"/\"kube-root-ca.crt\""
Apr 23 16:53:38.181407 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:38.181388 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s92tk/must-gather-r6wjp"]
Apr 23 16:53:38.224666 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:38.224638 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsvd4\" (UniqueName: \"kubernetes.io/projected/013703ce-4894-44f8-b1f5-3fba2dfae5d4-kube-api-access-bsvd4\") pod \"must-gather-r6wjp\" (UID: \"013703ce-4894-44f8-b1f5-3fba2dfae5d4\") " pod="openshift-must-gather-s92tk/must-gather-r6wjp"
Apr 23 16:53:38.224955 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:38.224668 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/013703ce-4894-44f8-b1f5-3fba2dfae5d4-must-gather-output\") pod \"must-gather-r6wjp\" (UID: \"013703ce-4894-44f8-b1f5-3fba2dfae5d4\") " pod="openshift-must-gather-s92tk/must-gather-r6wjp"
Apr 23 16:53:38.325510 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:38.325488 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bsvd4\" (UniqueName: \"kubernetes.io/projected/013703ce-4894-44f8-b1f5-3fba2dfae5d4-kube-api-access-bsvd4\") pod \"must-gather-r6wjp\" (UID: \"013703ce-4894-44f8-b1f5-3fba2dfae5d4\") " pod="openshift-must-gather-s92tk/must-gather-r6wjp"
Apr 23 16:53:38.325582 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:38.325518 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/013703ce-4894-44f8-b1f5-3fba2dfae5d4-must-gather-output\") pod \"must-gather-r6wjp\" (UID: \"013703ce-4894-44f8-b1f5-3fba2dfae5d4\") " pod="openshift-must-gather-s92tk/must-gather-r6wjp"
Apr 23 16:53:38.325840 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:38.325826 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/013703ce-4894-44f8-b1f5-3fba2dfae5d4-must-gather-output\") pod \"must-gather-r6wjp\" (UID: \"013703ce-4894-44f8-b1f5-3fba2dfae5d4\") " pod="openshift-must-gather-s92tk/must-gather-r6wjp"
Apr 23 16:53:38.332182 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:38.332155 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsvd4\" (UniqueName: \"kubernetes.io/projected/013703ce-4894-44f8-b1f5-3fba2dfae5d4-kube-api-access-bsvd4\") pod \"must-gather-r6wjp\" (UID: \"013703ce-4894-44f8-b1f5-3fba2dfae5d4\") " pod="openshift-must-gather-s92tk/must-gather-r6wjp"
Apr 23 16:53:38.493940 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:38.493889 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s92tk/must-gather-r6wjp"
Apr 23 16:53:38.605969 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:38.605935 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-s92tk/must-gather-r6wjp"]
Apr 23 16:53:38.609768 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:53:38.609743 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod013703ce_4894_44f8_b1f5_3fba2dfae5d4.slice/crio-1793721a40475a6cd0f1e8e5db366ed0d8223fbd2e7342fbc1f683f6a84bb146 WatchSource:0}: Error finding container 1793721a40475a6cd0f1e8e5db366ed0d8223fbd2e7342fbc1f683f6a84bb146: Status 404 returned error can't find the container with id 1793721a40475a6cd0f1e8e5db366ed0d8223fbd2e7342fbc1f683f6a84bb146
Apr 23 16:53:39.178885 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:39.178856 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s92tk/must-gather-r6wjp" event={"ID":"013703ce-4894-44f8-b1f5-3fba2dfae5d4","Type":"ContainerStarted","Data":"1793721a40475a6cd0f1e8e5db366ed0d8223fbd2e7342fbc1f683f6a84bb146"}
Apr 23 16:53:41.022149 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:41.022125 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk"
Apr 23 16:53:41.044672 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:41.044647 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ab9610f-f734-4328-b7d1-be10fe184a7b-kserve-provision-location\") pod \"5ab9610f-f734-4328-b7d1-be10fe184a7b\" (UID: \"5ab9610f-f734-4328-b7d1-be10fe184a7b\") "
Apr 23 16:53:41.045092 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:41.045061 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5ab9610f-f734-4328-b7d1-be10fe184a7b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5ab9610f-f734-4328-b7d1-be10fe184a7b" (UID: "5ab9610f-f734-4328-b7d1-be10fe184a7b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 16:53:41.145992 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:41.145966 2573 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5ab9610f-f734-4328-b7d1-be10fe184a7b-kserve-provision-location\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\""
Apr 23 16:53:41.190161 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:41.189980 2573 generic.go:358] "Generic (PLEG): container finished" podID="5ab9610f-f734-4328-b7d1-be10fe184a7b" containerID="85421d85838048b084e58f9994b1c2190697ab139f05428f0d9386003adfa250" exitCode=0
Apr 23 16:53:41.190161 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:41.190019 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk" event={"ID":"5ab9610f-f734-4328-b7d1-be10fe184a7b","Type":"ContainerDied","Data":"85421d85838048b084e58f9994b1c2190697ab139f05428f0d9386003adfa250"}
Apr 23 16:53:41.190161 ip-10-0-137-14
kubenswrapper[2573]: I0423 16:53:41.190054 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk" event={"ID":"5ab9610f-f734-4328-b7d1-be10fe184a7b","Type":"ContainerDied","Data":"9ae47749c91f3c557b11ff527a32d6a1053eab54c7aa6687b7504436f3f7e0d3"} Apr 23 16:53:41.190161 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:41.190057 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk" Apr 23 16:53:41.190161 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:41.190073 2573 scope.go:117] "RemoveContainer" containerID="85421d85838048b084e58f9994b1c2190697ab139f05428f0d9386003adfa250" Apr 23 16:53:41.201546 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:41.201506 2573 scope.go:117] "RemoveContainer" containerID="afdf245da4d671857ef9a6f0dc221964a23e099ffc3033baca0af80b245a921d" Apr 23 16:53:41.213311 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:41.213288 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk"] Apr 23 16:53:41.215277 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:41.215164 2573 scope.go:117] "RemoveContainer" containerID="85421d85838048b084e58f9994b1c2190697ab139f05428f0d9386003adfa250" Apr 23 16:53:41.215566 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:53:41.215524 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85421d85838048b084e58f9994b1c2190697ab139f05428f0d9386003adfa250\": container with ID starting with 85421d85838048b084e58f9994b1c2190697ab139f05428f0d9386003adfa250 not found: ID does not exist" containerID="85421d85838048b084e58f9994b1c2190697ab139f05428f0d9386003adfa250" Apr 23 16:53:41.215745 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:41.215720 2573 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"85421d85838048b084e58f9994b1c2190697ab139f05428f0d9386003adfa250"} err="failed to get container status \"85421d85838048b084e58f9994b1c2190697ab139f05428f0d9386003adfa250\": rpc error: code = NotFound desc = could not find container \"85421d85838048b084e58f9994b1c2190697ab139f05428f0d9386003adfa250\": container with ID starting with 85421d85838048b084e58f9994b1c2190697ab139f05428f0d9386003adfa250 not found: ID does not exist" Apr 23 16:53:41.215890 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:41.215878 2573 scope.go:117] "RemoveContainer" containerID="afdf245da4d671857ef9a6f0dc221964a23e099ffc3033baca0af80b245a921d" Apr 23 16:53:41.216355 ip-10-0-137-14 kubenswrapper[2573]: E0423 16:53:41.216274 2573 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"afdf245da4d671857ef9a6f0dc221964a23e099ffc3033baca0af80b245a921d\": container with ID starting with afdf245da4d671857ef9a6f0dc221964a23e099ffc3033baca0af80b245a921d not found: ID does not exist" containerID="afdf245da4d671857ef9a6f0dc221964a23e099ffc3033baca0af80b245a921d" Apr 23 16:53:41.216355 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:41.216309 2573 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"afdf245da4d671857ef9a6f0dc221964a23e099ffc3033baca0af80b245a921d"} err="failed to get container status \"afdf245da4d671857ef9a6f0dc221964a23e099ffc3033baca0af80b245a921d\": rpc error: code = NotFound desc = could not find container \"afdf245da4d671857ef9a6f0dc221964a23e099ffc3033baca0af80b245a921d\": container with ID starting with afdf245da4d671857ef9a6f0dc221964a23e099ffc3033baca0af80b245a921d not found: ID does not exist" Apr 23 16:53:41.216877 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:41.216847 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/raw-sklearn-runtime-ac859-predictor-5b64df665d-pcnbk"] Apr 23 16:53:42.881068 
ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:42.881035 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5ab9610f-f734-4328-b7d1-be10fe184a7b" path="/var/lib/kubelet/pods/5ab9610f-f734-4328-b7d1-be10fe184a7b/volumes" Apr 23 16:53:44.202051 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:44.202014 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s92tk/must-gather-r6wjp" event={"ID":"013703ce-4894-44f8-b1f5-3fba2dfae5d4","Type":"ContainerStarted","Data":"57da90dd5e267c089824fc7b27745370870e0f13c10092bc8a0d8e9bdfd5405f"} Apr 23 16:53:44.202051 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:44.202051 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-s92tk/must-gather-r6wjp" event={"ID":"013703ce-4894-44f8-b1f5-3fba2dfae5d4","Type":"ContainerStarted","Data":"04397ad0ccfb76b06c06f6838a601f7ab2f8096a7ba82f8d62daf27086de83aa"} Apr 23 16:53:44.217153 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:53:44.217104 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-s92tk/must-gather-r6wjp" podStartSLOduration=1.258204501 podStartE2EDuration="6.217089573s" podCreationTimestamp="2026-04-23 16:53:38 +0000 UTC" firstStartedPulling="2026-04-23 16:53:38.611436742 +0000 UTC m=+1124.301223071" lastFinishedPulling="2026-04-23 16:53:43.570321809 +0000 UTC m=+1129.260108143" observedRunningTime="2026-04-23 16:53:44.215675546 +0000 UTC m=+1129.905461889" watchObservedRunningTime="2026-04-23 16:53:44.217089573 +0000 UTC m=+1129.906875960" Apr 23 16:54:01.248811 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:01.248783 2573 generic.go:358] "Generic (PLEG): container finished" podID="013703ce-4894-44f8-b1f5-3fba2dfae5d4" containerID="04397ad0ccfb76b06c06f6838a601f7ab2f8096a7ba82f8d62daf27086de83aa" exitCode=0 Apr 23 16:54:01.249162 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:01.248861 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-s92tk/must-gather-r6wjp" event={"ID":"013703ce-4894-44f8-b1f5-3fba2dfae5d4","Type":"ContainerDied","Data":"04397ad0ccfb76b06c06f6838a601f7ab2f8096a7ba82f8d62daf27086de83aa"} Apr 23 16:54:01.249202 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:01.249176 2573 scope.go:117] "RemoveContainer" containerID="04397ad0ccfb76b06c06f6838a601f7ab2f8096a7ba82f8d62daf27086de83aa" Apr 23 16:54:01.624921 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:01.624876 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-s92tk_must-gather-r6wjp_013703ce-4894-44f8-b1f5-3fba2dfae5d4/gather/0.log" Apr 23 16:54:02.159958 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:02.159929 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gwvnc/must-gather-xnmh7"] Apr 23 16:54:02.160202 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:02.160190 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ab9610f-f734-4328-b7d1-be10fe184a7b" containerName="storage-initializer" Apr 23 16:54:02.160249 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:02.160203 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab9610f-f734-4328-b7d1-be10fe184a7b" containerName="storage-initializer" Apr 23 16:54:02.160249 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:02.160243 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5ab9610f-f734-4328-b7d1-be10fe184a7b" containerName="kserve-container" Apr 23 16:54:02.160249 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:02.160249 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="5ab9610f-f734-4328-b7d1-be10fe184a7b" containerName="kserve-container" Apr 23 16:54:02.160356 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:02.160293 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="5ab9610f-f734-4328-b7d1-be10fe184a7b" containerName="kserve-container" Apr 23 16:54:02.163285 ip-10-0-137-14 
kubenswrapper[2573]: I0423 16:54:02.163271 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gwvnc/must-gather-xnmh7" Apr 23 16:54:02.164878 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:02.164854 2573 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-gwvnc\"/\"default-dockercfg-kgzpm\"" Apr 23 16:54:02.164992 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:02.164927 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gwvnc\"/\"openshift-service-ca.crt\"" Apr 23 16:54:02.165072 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:02.165045 2573 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-gwvnc\"/\"kube-root-ca.crt\"" Apr 23 16:54:02.171557 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:02.171539 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gwvnc/must-gather-xnmh7"] Apr 23 16:54:02.211282 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:02.211259 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a432adb-48f0-4c8d-b575-8d6823b357e7-must-gather-output\") pod \"must-gather-xnmh7\" (UID: \"7a432adb-48f0-4c8d-b575-8d6823b357e7\") " pod="openshift-must-gather-gwvnc/must-gather-xnmh7" Apr 23 16:54:02.211363 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:02.211295 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ssg9f\" (UniqueName: \"kubernetes.io/projected/7a432adb-48f0-4c8d-b575-8d6823b357e7-kube-api-access-ssg9f\") pod \"must-gather-xnmh7\" (UID: \"7a432adb-48f0-4c8d-b575-8d6823b357e7\") " pod="openshift-must-gather-gwvnc/must-gather-xnmh7" Apr 23 16:54:02.312112 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:02.312087 2573 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ssg9f\" (UniqueName: \"kubernetes.io/projected/7a432adb-48f0-4c8d-b575-8d6823b357e7-kube-api-access-ssg9f\") pod \"must-gather-xnmh7\" (UID: \"7a432adb-48f0-4c8d-b575-8d6823b357e7\") " pod="openshift-must-gather-gwvnc/must-gather-xnmh7" Apr 23 16:54:02.312440 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:02.312156 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a432adb-48f0-4c8d-b575-8d6823b357e7-must-gather-output\") pod \"must-gather-xnmh7\" (UID: \"7a432adb-48f0-4c8d-b575-8d6823b357e7\") " pod="openshift-must-gather-gwvnc/must-gather-xnmh7" Apr 23 16:54:02.312497 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:02.312453 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7a432adb-48f0-4c8d-b575-8d6823b357e7-must-gather-output\") pod \"must-gather-xnmh7\" (UID: \"7a432adb-48f0-4c8d-b575-8d6823b357e7\") " pod="openshift-must-gather-gwvnc/must-gather-xnmh7" Apr 23 16:54:02.319209 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:02.319189 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ssg9f\" (UniqueName: \"kubernetes.io/projected/7a432adb-48f0-4c8d-b575-8d6823b357e7-kube-api-access-ssg9f\") pod \"must-gather-xnmh7\" (UID: \"7a432adb-48f0-4c8d-b575-8d6823b357e7\") " pod="openshift-must-gather-gwvnc/must-gather-xnmh7" Apr 23 16:54:02.472480 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:02.472433 2573 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-gwvnc/must-gather-xnmh7" Apr 23 16:54:02.584193 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:02.584165 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gwvnc/must-gather-xnmh7"] Apr 23 16:54:02.587581 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:54:02.587552 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7a432adb_48f0_4c8d_b575_8d6823b357e7.slice/crio-582a8304560fe4a4d1f71c5c47287ab8f96b2dfb98be231a0a8b3ff533636d1c WatchSource:0}: Error finding container 582a8304560fe4a4d1f71c5c47287ab8f96b2dfb98be231a0a8b3ff533636d1c: Status 404 returned error can't find the container with id 582a8304560fe4a4d1f71c5c47287ab8f96b2dfb98be231a0a8b3ff533636d1c Apr 23 16:54:03.255514 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:03.255475 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gwvnc/must-gather-xnmh7" event={"ID":"7a432adb-48f0-4c8d-b575-8d6823b357e7","Type":"ContainerStarted","Data":"582a8304560fe4a4d1f71c5c47287ab8f96b2dfb98be231a0a8b3ff533636d1c"} Apr 23 16:54:04.261039 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:04.260984 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gwvnc/must-gather-xnmh7" event={"ID":"7a432adb-48f0-4c8d-b575-8d6823b357e7","Type":"ContainerStarted","Data":"5804a4c20f2d61f6d188b8605fcb9d8756b9b6ee68225a7ec840752945c6fb6d"} Apr 23 16:54:04.261039 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:04.261044 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gwvnc/must-gather-xnmh7" event={"ID":"7a432adb-48f0-4c8d-b575-8d6823b357e7","Type":"ContainerStarted","Data":"6f25e97c5a8d1a425d595493c34df065af35cf893b88318f790421ab1e2e97db"} Apr 23 16:54:04.276472 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:04.276413 2573 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-gwvnc/must-gather-xnmh7" podStartSLOduration=1.5009891899999999 podStartE2EDuration="2.27639566s" podCreationTimestamp="2026-04-23 16:54:02 +0000 UTC" firstStartedPulling="2026-04-23 16:54:02.589359885 +0000 UTC m=+1148.279146208" lastFinishedPulling="2026-04-23 16:54:03.364766343 +0000 UTC m=+1149.054552678" observedRunningTime="2026-04-23 16:54:04.274383098 +0000 UTC m=+1149.964169443" watchObservedRunningTime="2026-04-23 16:54:04.27639566 +0000 UTC m=+1149.966182006" Apr 23 16:54:04.835430 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:04.835396 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-xv7s5_3a80e579-6e61-4497-9477-5154f3af3b17/global-pull-secret-syncer/0.log" Apr 23 16:54:04.939769 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:04.939742 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-rq49t_d28475ec-752d-4947-9e85-f35681ad68ab/konnectivity-agent/0.log" Apr 23 16:54:04.991686 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:04.991662 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-137-14.ec2.internal_c67cb434c3c0499bd3e70b96d76e7361/haproxy/0.log" Apr 23 16:54:06.998909 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:06.998814 2573 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-s92tk/must-gather-r6wjp"] Apr 23 16:54:06.999891 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:06.999822 2573 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-s92tk/must-gather-r6wjp" podUID="013703ce-4894-44f8-b1f5-3fba2dfae5d4" containerName="copy" containerID="cri-o://57da90dd5e267c089824fc7b27745370870e0f13c10092bc8a0d8e9bdfd5405f" gracePeriod=2 Apr 23 16:54:07.001841 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:07.001815 2573 status_manager.go:895] "Failed to get status for pod" 
podUID="013703ce-4894-44f8-b1f5-3fba2dfae5d4" pod="openshift-must-gather-s92tk/must-gather-r6wjp" err="pods \"must-gather-r6wjp\" is forbidden: User \"system:node:ip-10-0-137-14.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-s92tk\": no relationship found between node 'ip-10-0-137-14.ec2.internal' and this object" Apr 23 16:54:07.004305 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:07.004287 2573 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-s92tk/must-gather-r6wjp"] Apr 23 16:54:07.284304 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:07.279816 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-s92tk_must-gather-r6wjp_013703ce-4894-44f8-b1f5-3fba2dfae5d4/copy/0.log" Apr 23 16:54:07.284304 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:07.280171 2573 generic.go:358] "Generic (PLEG): container finished" podID="013703ce-4894-44f8-b1f5-3fba2dfae5d4" containerID="57da90dd5e267c089824fc7b27745370870e0f13c10092bc8a0d8e9bdfd5405f" exitCode=143 Apr 23 16:54:07.344731 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:07.343637 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-s92tk_must-gather-r6wjp_013703ce-4894-44f8-b1f5-3fba2dfae5d4/copy/0.log" Apr 23 16:54:07.344731 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:07.344022 2573 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-s92tk/must-gather-r6wjp" Apr 23 16:54:07.351766 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:07.347849 2573 status_manager.go:895] "Failed to get status for pod" podUID="013703ce-4894-44f8-b1f5-3fba2dfae5d4" pod="openshift-must-gather-s92tk/must-gather-r6wjp" err="pods \"must-gather-r6wjp\" is forbidden: User \"system:node:ip-10-0-137-14.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-s92tk\": no relationship found between node 'ip-10-0-137-14.ec2.internal' and this object" Apr 23 16:54:07.458719 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:07.457903 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsvd4\" (UniqueName: \"kubernetes.io/projected/013703ce-4894-44f8-b1f5-3fba2dfae5d4-kube-api-access-bsvd4\") pod \"013703ce-4894-44f8-b1f5-3fba2dfae5d4\" (UID: \"013703ce-4894-44f8-b1f5-3fba2dfae5d4\") " Apr 23 16:54:07.458719 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:07.458005 2573 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/013703ce-4894-44f8-b1f5-3fba2dfae5d4-must-gather-output\") pod \"013703ce-4894-44f8-b1f5-3fba2dfae5d4\" (UID: \"013703ce-4894-44f8-b1f5-3fba2dfae5d4\") " Apr 23 16:54:07.459521 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:07.459253 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/013703ce-4894-44f8-b1f5-3fba2dfae5d4-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "013703ce-4894-44f8-b1f5-3fba2dfae5d4" (UID: "013703ce-4894-44f8-b1f5-3fba2dfae5d4"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 16:54:07.461827 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:07.461802 2573 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/013703ce-4894-44f8-b1f5-3fba2dfae5d4-kube-api-access-bsvd4" (OuterVolumeSpecName: "kube-api-access-bsvd4") pod "013703ce-4894-44f8-b1f5-3fba2dfae5d4" (UID: "013703ce-4894-44f8-b1f5-3fba2dfae5d4"). InnerVolumeSpecName "kube-api-access-bsvd4". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 16:54:07.558677 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:07.558591 2573 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bsvd4\" (UniqueName: \"kubernetes.io/projected/013703ce-4894-44f8-b1f5-3fba2dfae5d4-kube-api-access-bsvd4\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:54:07.558677 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:07.558633 2573 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/013703ce-4894-44f8-b1f5-3fba2dfae5d4-must-gather-output\") on node \"ip-10-0-137-14.ec2.internal\" DevicePath \"\"" Apr 23 16:54:08.209157 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.209111 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f1188671-4e6b-4695-8519-525d5a1559ed/alertmanager/0.log" Apr 23 16:54:08.236043 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.236014 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f1188671-4e6b-4695-8519-525d5a1559ed/config-reloader/0.log" Apr 23 16:54:08.261479 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.261457 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f1188671-4e6b-4695-8519-525d5a1559ed/kube-rbac-proxy-web/0.log" Apr 23 16:54:08.284281 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.284251 
2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-s92tk_must-gather-r6wjp_013703ce-4894-44f8-b1f5-3fba2dfae5d4/copy/0.log" Apr 23 16:54:08.284681 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.284657 2573 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-s92tk/must-gather-r6wjp" Apr 23 16:54:08.284819 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.284711 2573 scope.go:117] "RemoveContainer" containerID="57da90dd5e267c089824fc7b27745370870e0f13c10092bc8a0d8e9bdfd5405f" Apr 23 16:54:08.286309 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.286279 2573 status_manager.go:895] "Failed to get status for pod" podUID="013703ce-4894-44f8-b1f5-3fba2dfae5d4" pod="openshift-must-gather-s92tk/must-gather-r6wjp" err="pods \"must-gather-r6wjp\" is forbidden: User \"system:node:ip-10-0-137-14.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-s92tk\": no relationship found between node 'ip-10-0-137-14.ec2.internal' and this object" Apr 23 16:54:08.286603 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.286555 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f1188671-4e6b-4695-8519-525d5a1559ed/kube-rbac-proxy/0.log" Apr 23 16:54:08.300016 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.299848 2573 status_manager.go:895] "Failed to get status for pod" podUID="013703ce-4894-44f8-b1f5-3fba2dfae5d4" pod="openshift-must-gather-s92tk/must-gather-r6wjp" err="pods \"must-gather-r6wjp\" is forbidden: User \"system:node:ip-10-0-137-14.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-s92tk\": no relationship found between node 'ip-10-0-137-14.ec2.internal' and this object" Apr 23 16:54:08.305348 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.303060 2573 scope.go:117] "RemoveContainer" 
containerID="04397ad0ccfb76b06c06f6838a601f7ab2f8096a7ba82f8d62daf27086de83aa" Apr 23 16:54:08.313627 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.313605 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f1188671-4e6b-4695-8519-525d5a1559ed/kube-rbac-proxy-metric/0.log" Apr 23 16:54:08.335825 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.335793 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f1188671-4e6b-4695-8519-525d5a1559ed/prom-label-proxy/0.log" Apr 23 16:54:08.358398 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.358371 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_f1188671-4e6b-4695-8519-525d5a1559ed/init-config-reloader/0.log" Apr 23 16:54:08.744779 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.744749 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-z88zz_4953b4c2-f7bb-425b-a8af-7cff76f0e8c8/node-exporter/0.log" Apr 23 16:54:08.765606 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.765576 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-z88zz_4953b4c2-f7bb-425b-a8af-7cff76f0e8c8/kube-rbac-proxy/0.log" Apr 23 16:54:08.788279 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.788252 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-z88zz_4953b4c2-f7bb-425b-a8af-7cff76f0e8c8/init-textfile/0.log" Apr 23 16:54:08.818203 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.818172 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-7mt24_046d19c7-3894-44f1-bc2d-035f0169e8d2/kube-rbac-proxy-main/0.log" Apr 23 16:54:08.838989 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.838957 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-7mt24_046d19c7-3894-44f1-bc2d-035f0169e8d2/kube-rbac-proxy-self/0.log" Apr 23 16:54:08.862869 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.862838 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-7mt24_046d19c7-3894-44f1-bc2d-035f0169e8d2/openshift-state-metrics/0.log" Apr 23 16:54:08.881140 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.881105 2573 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="013703ce-4894-44f8-b1f5-3fba2dfae5d4" path="/var/lib/kubelet/pods/013703ce-4894-44f8-b1f5-3fba2dfae5d4/volumes" Apr 23 16:54:08.907795 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.907769 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1bfd3bfb-a0c6-4571-a32d-702092eaf236/prometheus/0.log" Apr 23 16:54:08.929257 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.929136 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1bfd3bfb-a0c6-4571-a32d-702092eaf236/config-reloader/0.log" Apr 23 16:54:08.967425 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.967396 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1bfd3bfb-a0c6-4571-a32d-702092eaf236/thanos-sidecar/0.log" Apr 23 16:54:08.990066 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:08.990043 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1bfd3bfb-a0c6-4571-a32d-702092eaf236/kube-rbac-proxy-web/0.log" Apr 23 16:54:09.012364 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:09.012336 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1bfd3bfb-a0c6-4571-a32d-702092eaf236/kube-rbac-proxy/0.log" Apr 23 16:54:09.035458 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:09.035408 2573 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1bfd3bfb-a0c6-4571-a32d-702092eaf236/kube-rbac-proxy-thanos/0.log" Apr 23 16:54:09.057841 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:09.057811 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_1bfd3bfb-a0c6-4571-a32d-702092eaf236/init-config-reloader/0.log" Apr 23 16:54:11.894359 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:11.894317 2573 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd"] Apr 23 16:54:11.894933 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:11.894717 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="013703ce-4894-44f8-b1f5-3fba2dfae5d4" containerName="copy" Apr 23 16:54:11.894933 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:11.894736 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="013703ce-4894-44f8-b1f5-3fba2dfae5d4" containerName="copy" Apr 23 16:54:11.894933 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:11.894779 2573 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="013703ce-4894-44f8-b1f5-3fba2dfae5d4" containerName="gather" Apr 23 16:54:11.894933 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:11.894788 2573 state_mem.go:107] "Deleted CPUSet assignment" podUID="013703ce-4894-44f8-b1f5-3fba2dfae5d4" containerName="gather" Apr 23 16:54:11.894933 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:11.894852 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="013703ce-4894-44f8-b1f5-3fba2dfae5d4" containerName="copy" Apr 23 16:54:11.894933 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:11.894871 2573 memory_manager.go:356] "RemoveStaleState removing state" podUID="013703ce-4894-44f8-b1f5-3fba2dfae5d4" containerName="gather" Apr 23 16:54:11.899030 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:11.899002 2573 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd" Apr 23 16:54:11.906366 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:11.906341 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd"] Apr 23 16:54:11.996347 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:11.996311 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3164d510-358f-4720-967a-3fbd87835b1b-lib-modules\") pod \"perf-node-gather-daemonset-dwfhd\" (UID: \"3164d510-358f-4720-967a-3fbd87835b1b\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd" Apr 23 16:54:11.996792 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:11.996746 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttrwn\" (UniqueName: \"kubernetes.io/projected/3164d510-358f-4720-967a-3fbd87835b1b-kube-api-access-ttrwn\") pod \"perf-node-gather-daemonset-dwfhd\" (UID: \"3164d510-358f-4720-967a-3fbd87835b1b\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd" Apr 23 16:54:11.996932 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:11.996816 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3164d510-358f-4720-967a-3fbd87835b1b-proc\") pod \"perf-node-gather-daemonset-dwfhd\" (UID: \"3164d510-358f-4720-967a-3fbd87835b1b\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd" Apr 23 16:54:11.996932 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:11.996862 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3164d510-358f-4720-967a-3fbd87835b1b-podres\") pod \"perf-node-gather-daemonset-dwfhd\" (UID: 
\"3164d510-358f-4720-967a-3fbd87835b1b\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd" Apr 23 16:54:11.996932 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:11.996889 2573 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3164d510-358f-4720-967a-3fbd87835b1b-sys\") pod \"perf-node-gather-daemonset-dwfhd\" (UID: \"3164d510-358f-4720-967a-3fbd87835b1b\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd" Apr 23 16:54:12.097990 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:12.097955 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3164d510-358f-4720-967a-3fbd87835b1b-lib-modules\") pod \"perf-node-gather-daemonset-dwfhd\" (UID: \"3164d510-358f-4720-967a-3fbd87835b1b\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd" Apr 23 16:54:12.098146 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:12.098010 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ttrwn\" (UniqueName: \"kubernetes.io/projected/3164d510-358f-4720-967a-3fbd87835b1b-kube-api-access-ttrwn\") pod \"perf-node-gather-daemonset-dwfhd\" (UID: \"3164d510-358f-4720-967a-3fbd87835b1b\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd" Apr 23 16:54:12.098146 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:12.098057 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3164d510-358f-4720-967a-3fbd87835b1b-proc\") pod \"perf-node-gather-daemonset-dwfhd\" (UID: \"3164d510-358f-4720-967a-3fbd87835b1b\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd" Apr 23 16:54:12.098146 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:12.098104 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"podres\" (UniqueName: \"kubernetes.io/host-path/3164d510-358f-4720-967a-3fbd87835b1b-podres\") pod \"perf-node-gather-daemonset-dwfhd\" (UID: \"3164d510-358f-4720-967a-3fbd87835b1b\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd" Apr 23 16:54:12.098146 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:12.098131 2573 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3164d510-358f-4720-967a-3fbd87835b1b-sys\") pod \"perf-node-gather-daemonset-dwfhd\" (UID: \"3164d510-358f-4720-967a-3fbd87835b1b\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd" Apr 23 16:54:12.098399 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:12.098282 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/3164d510-358f-4720-967a-3fbd87835b1b-podres\") pod \"perf-node-gather-daemonset-dwfhd\" (UID: \"3164d510-358f-4720-967a-3fbd87835b1b\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd" Apr 23 16:54:12.098455 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:12.098413 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3164d510-358f-4720-967a-3fbd87835b1b-sys\") pod \"perf-node-gather-daemonset-dwfhd\" (UID: \"3164d510-358f-4720-967a-3fbd87835b1b\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd" Apr 23 16:54:12.098617 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:12.098593 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3164d510-358f-4720-967a-3fbd87835b1b-lib-modules\") pod \"perf-node-gather-daemonset-dwfhd\" (UID: \"3164d510-358f-4720-967a-3fbd87835b1b\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd" Apr 23 16:54:12.098709 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:12.098533 2573 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/3164d510-358f-4720-967a-3fbd87835b1b-proc\") pod \"perf-node-gather-daemonset-dwfhd\" (UID: \"3164d510-358f-4720-967a-3fbd87835b1b\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd" Apr 23 16:54:12.106454 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:12.106428 2573 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttrwn\" (UniqueName: \"kubernetes.io/projected/3164d510-358f-4720-967a-3fbd87835b1b-kube-api-access-ttrwn\") pod \"perf-node-gather-daemonset-dwfhd\" (UID: \"3164d510-358f-4720-967a-3fbd87835b1b\") " pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd" Apr 23 16:54:12.210799 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:12.210749 2573 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd" Apr 23 16:54:12.382392 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:12.382372 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-c9zd9_1c58deba-82d5-4b30-a362-237c21e8311b/dns/0.log" Apr 23 16:54:12.404919 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:12.404891 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-c9zd9_1c58deba-82d5-4b30-a362-237c21e8311b/kube-rbac-proxy/0.log" Apr 23 16:54:12.549232 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:12.549211 2573 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd"] Apr 23 16:54:12.551062 ip-10-0-137-14 kubenswrapper[2573]: W0423 16:54:12.551038 2573 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod3164d510_358f_4720_967a_3fbd87835b1b.slice/crio-3f9ada13b2473f65fdda6e431ca45d0fd35df59b210eabb0509e7590bf867b08 WatchSource:0}: Error finding container 
3f9ada13b2473f65fdda6e431ca45d0fd35df59b210eabb0509e7590bf867b08: Status 404 returned error can't find the container with id 3f9ada13b2473f65fdda6e431ca45d0fd35df59b210eabb0509e7590bf867b08 Apr 23 16:54:12.554455 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:12.554439 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-b8tpj_56124f3f-030d-47d4-99f9-65b3011d5573/dns-node-resolver/0.log" Apr 23 16:54:12.967229 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:12.967205 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-6d44bcc48b-45nq9_e39f91f1-ba00-471a-9d79-c0574e83f873/registry/0.log" Apr 23 16:54:13.035904 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:13.035876 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-tk6kx_0a0f1a6d-d0aa-4632-8dd9-0adbe8707e89/node-ca/0.log" Apr 23 16:54:13.306246 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:13.306144 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd" event={"ID":"3164d510-358f-4720-967a-3fbd87835b1b","Type":"ContainerStarted","Data":"7e852ec2e743d71e527b1ee40891d9c0f5d2a417bb405df4a2c1ffe10ad84774"} Apr 23 16:54:13.306246 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:13.306186 2573 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd" event={"ID":"3164d510-358f-4720-967a-3fbd87835b1b","Type":"ContainerStarted","Data":"3f9ada13b2473f65fdda6e431ca45d0fd35df59b210eabb0509e7590bf867b08"} Apr 23 16:54:13.306246 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:13.306244 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd" Apr 23 16:54:13.322547 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:13.322502 2573 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd" podStartSLOduration=2.322486053 podStartE2EDuration="2.322486053s" podCreationTimestamp="2026-04-23 16:54:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 16:54:13.321856179 +0000 UTC m=+1159.011642525" watchObservedRunningTime="2026-04-23 16:54:13.322486053 +0000 UTC m=+1159.012272398" Apr 23 16:54:14.085626 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:14.085574 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-g4m2z_a0c13d03-c009-47c2-b8ff-f968c81cb35c/serve-healthcheck-canary/0.log" Apr 23 16:54:14.479474 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:14.479447 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8nvz6_a887758a-12b6-4011-819b-3e2204768863/kube-rbac-proxy/0.log" Apr 23 16:54:14.499711 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:14.499674 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8nvz6_a887758a-12b6-4011-819b-3e2204768863/exporter/0.log" Apr 23 16:54:14.520166 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:14.520142 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-8nvz6_a887758a-12b6-4011-819b-3e2204768863/extractor/0.log" Apr 23 16:54:16.750231 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:16.750189 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-q2rt4_901d2ca7-8019-4f76-af7c-f4abd010f487/s3-init/0.log" Apr 23 16:54:19.319224 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:19.319197 2573 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-gwvnc/perf-node-gather-daemonset-dwfhd" Apr 23 16:54:21.953320 ip-10-0-137-14 kubenswrapper[2573]: I0423 
16:54:21.953290 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8zwmw_0f6f780e-a2ae-473d-ad75-c644275b6cdb/kube-multus-additional-cni-plugins/0.log" Apr 23 16:54:21.974351 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:21.974326 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8zwmw_0f6f780e-a2ae-473d-ad75-c644275b6cdb/egress-router-binary-copy/0.log" Apr 23 16:54:21.993572 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:21.993550 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8zwmw_0f6f780e-a2ae-473d-ad75-c644275b6cdb/cni-plugins/0.log" Apr 23 16:54:22.012961 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:22.012942 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8zwmw_0f6f780e-a2ae-473d-ad75-c644275b6cdb/bond-cni-plugin/0.log" Apr 23 16:54:22.035002 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:22.034979 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8zwmw_0f6f780e-a2ae-473d-ad75-c644275b6cdb/routeoverride-cni/0.log" Apr 23 16:54:22.054796 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:22.054774 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8zwmw_0f6f780e-a2ae-473d-ad75-c644275b6cdb/whereabouts-cni-bincopy/0.log" Apr 23 16:54:22.074121 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:22.074099 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-8zwmw_0f6f780e-a2ae-473d-ad75-c644275b6cdb/whereabouts-cni/0.log" Apr 23 16:54:22.261169 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:22.261094 2573 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-b7t9t_0056f1d2-57d7-40d1-9290-31c514f0d40e/kube-multus/0.log" Apr 23 16:54:22.392009 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:22.391971 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jpzq7_8306d95a-dbae-4dd7-bf93-637a12f98c59/network-metrics-daemon/0.log" Apr 23 16:54:22.411391 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:22.411364 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-jpzq7_8306d95a-dbae-4dd7-bf93-637a12f98c59/kube-rbac-proxy/0.log" Apr 23 16:54:23.460715 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:23.460637 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wd2cz_52988e90-484a-49cd-98f6-5510a28890d6/ovn-controller/0.log" Apr 23 16:54:23.478131 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:23.478111 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wd2cz_52988e90-484a-49cd-98f6-5510a28890d6/ovn-acl-logging/0.log" Apr 23 16:54:23.483179 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:23.483162 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wd2cz_52988e90-484a-49cd-98f6-5510a28890d6/ovn-acl-logging/1.log" Apr 23 16:54:23.498468 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:23.498454 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wd2cz_52988e90-484a-49cd-98f6-5510a28890d6/kube-rbac-proxy-node/0.log" Apr 23 16:54:23.519686 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:23.519667 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wd2cz_52988e90-484a-49cd-98f6-5510a28890d6/kube-rbac-proxy-ovn-metrics/0.log" Apr 23 16:54:23.542534 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:23.542513 2573 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wd2cz_52988e90-484a-49cd-98f6-5510a28890d6/northd/0.log" Apr 23 16:54:23.562542 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:23.562508 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wd2cz_52988e90-484a-49cd-98f6-5510a28890d6/nbdb/0.log" Apr 23 16:54:23.585843 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:23.585822 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wd2cz_52988e90-484a-49cd-98f6-5510a28890d6/sbdb/0.log" Apr 23 16:54:23.699990 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:23.699966 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-wd2cz_52988e90-484a-49cd-98f6-5510a28890d6/ovnkube-controller/0.log" Apr 23 16:54:24.972630 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:24.972596 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-ch99b_493d9466-44b1-4315-9f1b-a60f6bb428c1/network-check-target-container/0.log" Apr 23 16:54:25.874162 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:25.874136 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_iptables-alerter-xrqlx_a4c2c1ca-68ee-40d1-8110-1c24a086d157/iptables-alerter/0.log" Apr 23 16:54:26.486979 ip-10-0-137-14 kubenswrapper[2573]: I0423 16:54:26.486957 2573 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_tuned-xjm86_cb4c9ed2-c60a-4c94-9f67-e156f422d6a0/tuned/0.log"