Apr 16 13:59:33.686895 ip-10-0-133-133 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 13:59:34.077247 ip-10-0-133-133 kubenswrapper[2570]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:34.077247 ip-10-0-133-133 kubenswrapper[2570]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 13:59:34.077247 ip-10-0-133-133 kubenswrapper[2570]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:34.077247 ip-10-0-133-133 kubenswrapper[2570]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 13:59:34.077247 ip-10-0-133-133 kubenswrapper[2570]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:34.078517 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.078419 2570 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 13:59:34.082509 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082482 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:34.082509 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082504 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:34.082509 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082508 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:34.082509 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082511 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:34.082509 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082514 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:34.082509 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082518 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:34.082754 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082521 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:34.082754 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082524 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:34.082754 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082540 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:34.082754 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082545 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:34.082754 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082550 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:34.082754 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082552 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:34.082754 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082555 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:34.082754 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082558 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:34.082754 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082560 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:34.082754 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082563 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:34.082754 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082566 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:34.082754 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082569 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:34.082754 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082572 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:34.082754 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082578 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:34.082754 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082581 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:34.082754 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082583 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:34.082754 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082586 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:34.082754 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082589 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:34.082754 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082591 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:34.082754 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082594 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:34.083233 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082596 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:34.083233 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082599 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:34.083233 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082602 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:34.083233 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082605 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:34.083233 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082607 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:34.083233 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082610 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:34.083233 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082613 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:34.083233 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082615 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:34.083233 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082618 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:34.083233 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082620 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:34.083233 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082623 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:34.083233 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082625 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:34.083233 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082628 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:34.083233 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082631 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:34.083233 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082634 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:34.083233 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082637 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:34.083233 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082640 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:34.083233 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082643 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:34.083233 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082645 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:34.083233 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082649 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:34.083776 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082652 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:34.083776 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082654 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:34.083776 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082657 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:34.083776 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082659 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:34.083776 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082665 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:34.083776 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082668 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:34.083776 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082671 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:34.083776 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082674 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:34.083776 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082677 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:34.083776 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082680 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:34.083776 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082683 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:34.083776 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082686 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:34.083776 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082688 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:34.083776 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082691 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:34.083776 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082693 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:34.083776 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082696 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:34.083776 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082699 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:34.083776 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082701 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:34.083776 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082704 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:34.084247 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082707 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:34.084247 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082709 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:34.084247 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082712 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:34.084247 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082714 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:34.084247 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082717 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:34.084247 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082719 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:34.084247 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082724 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:34.084247 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082727 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:34.084247 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082729 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:34.084247 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082732 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:34.084247 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082735 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:34.084247 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082739 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:34.084247 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082741 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:34.084247 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082744 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:34.084247 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082747 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:34.084247 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082749 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:34.084247 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082752 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:34.084247 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082754 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:34.084247 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082757 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:34.084731 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082760 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:34.084731 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.082763 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:34.084731 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083177 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:34.084731 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083182 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:34.084731 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083185 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:34.084731 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083187 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:34.084731 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083190 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:34.084731 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083193 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:34.084731 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083197 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:34.084731 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083201 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:34.084731 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083204 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:34.084731 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083207 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:34.084731 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083209 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:34.084731 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083212 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:34.084731 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083215 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:34.084731 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083217 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:34.084731 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083220 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:34.084731 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083223 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:34.084731 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083225 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:34.084731 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083230 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:34.085201 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083233 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:34.085201 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083235 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:34.085201 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083238 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:34.085201 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083241 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:34.085201 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083244 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:34.085201 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083247 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:34.085201 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083250 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:34.085201 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083253 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:34.085201 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083255 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:34.085201 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083258 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:34.085201 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083261 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:34.085201 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083263 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:34.085201 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083266 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:34.085201 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083268 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:34.085201 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083271 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:34.085201 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083273 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:34.085201 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083276 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:34.085201 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083279 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:34.085201 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083281 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:34.085201 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083284 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:34.085718 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083286 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:34.085718 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083289 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:34.085718 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083291 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:34.085718 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083294 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:34.085718 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083298 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:34.085718 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083301 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:34.085718 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083304 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:34.085718 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083308 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:34.085718 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083311 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:34.085718 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083314 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:34.085718 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083317 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:34.085718 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083321 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:34.085718 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083324 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:34.085718 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083327 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:34.085718 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083330 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:34.085718 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083332 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:34.085718 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083335 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:34.085718 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083337 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:34.085718 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083340 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:34.085718 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083343 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:34.086225 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083345 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:34.086225 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083348 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:34.086225 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083351 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:34.086225 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083354 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:34.086225 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083357 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:34.086225 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083359 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:34.086225 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083362 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:34.086225 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083365 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:34.086225 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083367 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:34.086225 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083370 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:34.086225 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083372 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:34.086225 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083375 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:34.086225 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083377 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:34.086225 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083380 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:34.086225 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083382 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:34.086225 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083385 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:34.086225 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083388 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:34.086225 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083391 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:34.086225 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083393 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:34.086708 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083396 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:34.086708 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083398 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:34.086708 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083401 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:34.086708 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083404 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:34.086708 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083406 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:34.086708 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083409 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:34.086708 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083411 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:34.086708 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083414 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:34.086708 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.083416 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:34.086708 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084039 2570 flags.go:64] FLAG: --address="0.0.0.0"
Apr 16 13:59:34.086708 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084049 2570 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 16 13:59:34.086708 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084062 2570 flags.go:64] FLAG: --anonymous-auth="true"
Apr 16 13:59:34.086708 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084067 2570 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 16 13:59:34.086708 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084072 2570 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 16 13:59:34.086708 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084076 2570 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 16 13:59:34.086708 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084080 2570 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 16 13:59:34.086708 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084084 2570 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 16 13:59:34.086708 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084088 2570 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 16 13:59:34.086708 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084091 2570 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 16 13:59:34.086708 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084094 2570 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 16 13:59:34.086708 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084097 2570 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 16 13:59:34.086708 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084100 2570 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084104 2570 flags.go:64] FLAG: --cgroup-root=""
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084106 2570 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084109 2570 flags.go:64] FLAG: --client-ca-file=""
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084112 2570 flags.go:64] FLAG: --cloud-config=""
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084115 2570 flags.go:64] FLAG: --cloud-provider="external"
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084118 2570 flags.go:64] FLAG: --cluster-dns="[]"
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084123 2570 flags.go:64] FLAG: --cluster-domain=""
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084126 2570 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084130 2570 flags.go:64] FLAG: --config-dir=""
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084133 2570 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084136 2570 flags.go:64] FLAG: --container-log-max-files="5"
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084141 2570 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084144 2570 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084147 2570 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084150 2570 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084153 2570 flags.go:64] FLAG: --contention-profiling="false"
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084157 2570 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084160 2570 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084163 2570 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084166 2570 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084170 2570 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084173 2570 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084176 2570 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084179 2570 flags.go:64] FLAG: --enable-load-reader="false"
Apr 16 13:59:34.087271 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084183 2570 flags.go:64] FLAG: --enable-server="true"
Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084186 2570 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084191 2570 flags.go:64] FLAG: --event-burst="100"
Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084195 2570 flags.go:64] FLAG: --event-qps="50"
Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084198 2570 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084202 2570 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084205 2570 flags.go:64] FLAG: --eviction-hard=""
Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084209 2570 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084212 2570 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084215 2570 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084218 2570 flags.go:64] FLAG: --eviction-soft=""
Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084221 2570 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084224 2570 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084227 2570 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 16 13:59:34.087898 ip-10-0-133-133
kubenswrapper[2570]: I0416 13:59:34.084230 2570 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084233 2570 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084236 2570 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084239 2570 flags.go:64] FLAG: --feature-gates="" Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084243 2570 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084246 2570 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084250 2570 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084253 2570 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084257 2570 flags.go:64] FLAG: --healthz-port="10248" Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084260 2570 flags.go:64] FLAG: --help="false" Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084263 2570 flags.go:64] FLAG: --hostname-override="ip-10-0-133-133.ec2.internal" Apr 16 13:59:34.087898 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084266 2570 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084269 2570 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084272 2570 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 
13:59:34.084276 2570 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084279 2570 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084282 2570 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084285 2570 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084288 2570 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084292 2570 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084295 2570 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084298 2570 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084301 2570 flags.go:64] FLAG: --kube-reserved="" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084304 2570 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084307 2570 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084310 2570 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084313 2570 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084316 2570 flags.go:64] FLAG: --lock-file="" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084319 
2570 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084323 2570 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084326 2570 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084331 2570 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084337 2570 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084340 2570 flags.go:64] FLAG: --log-text-split-stream="false" Apr 16 13:59:34.088502 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084343 2570 flags.go:64] FLAG: --logging-format="text" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084346 2570 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084350 2570 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084353 2570 flags.go:64] FLAG: --manifest-url="" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084356 2570 flags.go:64] FLAG: --manifest-url-header="" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084360 2570 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084363 2570 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084368 2570 flags.go:64] FLAG: --max-pods="110" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084371 2570 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 13:59:34.089120 
ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084374 2570 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084377 2570 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084380 2570 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084383 2570 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084386 2570 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084389 2570 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084397 2570 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084403 2570 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084406 2570 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084409 2570 flags.go:64] FLAG: --pod-cidr="" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084413 2570 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084419 2570 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084422 2570 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084426 2570 flags.go:64] 
FLAG: --pods-per-core="0" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084429 2570 flags.go:64] FLAG: --port="10250" Apr 16 13:59:34.089120 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084432 2570 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084435 2570 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-07b156a1227da3543" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084438 2570 flags.go:64] FLAG: --qos-reserved="" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084441 2570 flags.go:64] FLAG: --read-only-port="10255" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084444 2570 flags.go:64] FLAG: --register-node="true" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084447 2570 flags.go:64] FLAG: --register-schedulable="true" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084450 2570 flags.go:64] FLAG: --register-with-taints="" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084454 2570 flags.go:64] FLAG: --registry-burst="10" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084457 2570 flags.go:64] FLAG: --registry-qps="5" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084460 2570 flags.go:64] FLAG: --reserved-cpus="" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084463 2570 flags.go:64] FLAG: --reserved-memory="" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084466 2570 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084470 2570 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084473 2570 flags.go:64] FLAG: 
--rotate-certificates="false" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084475 2570 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084478 2570 flags.go:64] FLAG: --runonce="false" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084481 2570 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084484 2570 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084487 2570 flags.go:64] FLAG: --seccomp-default="false" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084490 2570 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084493 2570 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084496 2570 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084499 2570 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084502 2570 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084505 2570 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084509 2570 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 13:59:34.089705 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084512 2570 flags.go:64] FLAG: --storage-driver-user="root" Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084517 2570 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 
13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084520 2570 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084523 2570 flags.go:64] FLAG: --system-cgroups="" Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084526 2570 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084545 2570 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084548 2570 flags.go:64] FLAG: --tls-cert-file="" Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084551 2570 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084555 2570 flags.go:64] FLAG: --tls-min-version="" Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084558 2570 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084561 2570 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084564 2570 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084572 2570 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084575 2570 flags.go:64] FLAG: --v="2" Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084579 2570 flags.go:64] FLAG: --version="false" Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084583 2570 flags.go:64] FLAG: --vmodule="" Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084587 2570 flags.go:64] FLAG: 
--volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084591 2570 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084708 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084712 2570 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084715 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084718 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084721 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084724 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:59:34.090344 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084727 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:59:34.090932 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084730 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:59:34.090932 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084734 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:59:34.090932 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084737 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:59:34.090932 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084740 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:59:34.090932 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084743 2570 
feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:59:34.090932 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084746 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:59:34.090932 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084750 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:59:34.090932 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084753 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:59:34.090932 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084756 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:59:34.090932 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084762 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:59:34.090932 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084765 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:59:34.090932 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084767 2570 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:59:34.090932 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084771 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 13:59:34.090932 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084775 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:59:34.090932 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084777 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:59:34.090932 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084780 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:59:34.090932 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084784 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 13:59:34.090932 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084788 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:59:34.090932 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084791 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:59:34.091415 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084794 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:59:34.091415 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084797 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:59:34.091415 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084799 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:59:34.091415 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084802 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:59:34.091415 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084805 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:59:34.091415 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084807 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:59:34.091415 ip-10-0-133-133 kubenswrapper[2570]: W0416 
13:59:34.084810 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:59:34.091415 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084813 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:59:34.091415 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084815 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:59:34.091415 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084818 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:59:34.091415 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084821 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:59:34.091415 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084823 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:59:34.091415 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084826 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:59:34.091415 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084828 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:59:34.091415 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084831 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:59:34.091415 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084834 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:59:34.091415 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084836 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:59:34.091415 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084839 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:59:34.091415 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084842 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:59:34.091415 ip-10-0-133-133 
kubenswrapper[2570]: W0416 13:59:34.084846 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:59:34.091945 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084849 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:59:34.091945 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084852 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:59:34.091945 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084855 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:59:34.091945 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084858 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:59:34.091945 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084860 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:59:34.091945 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084863 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:59:34.091945 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084865 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:59:34.091945 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084868 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:59:34.091945 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084870 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:59:34.091945 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084873 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:59:34.091945 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084876 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:59:34.091945 
ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084878 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:59:34.091945 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084881 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:59:34.091945 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084884 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:59:34.091945 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084887 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:59:34.091945 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084889 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:59:34.091945 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084892 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:59:34.091945 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084894 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:59:34.091945 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084897 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:59:34.092454 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084900 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:59:34.092454 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084902 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:59:34.092454 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084905 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:59:34.092454 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084907 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:59:34.092454 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084910 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity 
Apr 16 13:59:34.092454 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084912 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:34.092454 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084915 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:34.092454 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084917 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:34.092454 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084920 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:34.092454 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084922 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:34.092454 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084925 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:34.092454 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084927 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:34.092454 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084931 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:34.092454 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084934 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:34.092454 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084937 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:34.092454 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084939 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:34.092454 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084942 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:34.092454 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084945 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:34.092454 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084948 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:34.092454 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084950 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:34.092967 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.084953 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:34.092967 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.084958 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:34.092967 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.091502 2570 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 16 13:59:34.092967 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.091522 2570 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 16 13:59:34.092967 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091598 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:34.092967 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091604 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:34.092967 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091607 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:34.092967 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091611 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:34.092967 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091615 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:34.092967 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091639 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:34.092967 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091643 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:34.092967 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091646 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:34.092967 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091649 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:34.092967 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091652 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:34.092967 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091656 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:34.093350 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091661 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:34.093350 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091664 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:34.093350 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091667 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:34.093350 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091670 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:34.093350 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091673 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:34.093350 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091676 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:34.093350 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091679 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:34.093350 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091681 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:34.093350 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091684 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:34.093350 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091688 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:34.093350 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091691 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:34.093350 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091693 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:34.093350 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091696 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:34.093350 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091699 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:34.093350 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091702 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:34.093350 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091704 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:34.093350 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091707 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:34.093350 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091709 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:34.093350 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091712 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:34.093350 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091715 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:34.093862 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091719 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:34.093862 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091722 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:34.093862 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091725 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:34.093862 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091728 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:34.093862 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091731 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:34.093862 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091733 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:34.093862 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091736 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:34.093862 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091739 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:34.093862 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091741 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:34.093862 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091744 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:34.093862 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091746 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:34.093862 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091749 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:34.093862 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091751 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:34.093862 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091754 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:34.093862 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091765 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:34.093862 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091768 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:34.093862 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091771 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:34.093862 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091774 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:34.093862 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091777 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:34.094333 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091779 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:34.094333 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091782 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:34.094333 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091785 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:34.094333 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091788 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:34.094333 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091790 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:34.094333 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091793 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:34.094333 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091795 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:34.094333 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091798 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:34.094333 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091801 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:34.094333 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091813 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:34.094333 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091816 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:34.094333 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091819 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:34.094333 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091822 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:34.094333 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091825 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:34.094333 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091829 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:34.094333 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091832 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:34.094333 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091835 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:34.094333 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091837 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:34.094333 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091840 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:34.094333 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091843 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:34.094845 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091845 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:34.094845 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091848 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:34.094845 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091850 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:34.094845 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091853 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:34.094845 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091855 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:34.094845 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091858 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:34.094845 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091861 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:34.094845 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091863 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:34.094845 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091866 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:34.094845 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091869 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:34.094845 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091872 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:34.094845 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091874 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:34.094845 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091877 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:34.094845 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091880 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:34.094845 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091882 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:34.094845 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091885 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:34.095237 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.091890 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:34.095237 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.091995 2570 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 16 13:59:34.095237 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092001 2570 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 16 13:59:34.095237 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092004 2570 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 16 13:59:34.095237 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092007 2570 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 16 13:59:34.095237 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092010 2570 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 16 13:59:34.095237 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092013 2570 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 16 13:59:34.095237 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092016 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 16 13:59:34.095237 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092019 2570 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 16 13:59:34.095237 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092022 2570 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 16 13:59:34.095237 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092025 2570 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 16 13:59:34.095237 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092029 2570 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 16 13:59:34.095237 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092032 2570 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 16 13:59:34.095237 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092034 2570 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 16 13:59:34.095237 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092037 2570 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 16 13:59:34.095636 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092040 2570 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 16 13:59:34.095636 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092042 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 16 13:59:34.095636 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092045 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 16 13:59:34.095636 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092047 2570 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 16 13:59:34.095636 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092050 2570 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 16 13:59:34.095636 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092053 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:34.095636 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092055 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 16 13:59:34.095636 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092058 2570 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 16 13:59:34.095636 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092061 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 16 13:59:34.095636 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092064 2570 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 16 13:59:34.095636 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092067 2570 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 16 13:59:34.095636 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092069 2570 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 16 13:59:34.095636 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092072 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 16 13:59:34.095636 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092075 2570 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 16 13:59:34.095636 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092078 2570 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 16 13:59:34.095636 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092080 2570 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 16 13:59:34.095636 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092083 2570 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 16 13:59:34.095636 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092085 2570 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 16 13:59:34.095636 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092089 2570 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 16 13:59:34.096169 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092093 2570 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 16 13:59:34.096169 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092097 2570 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 16 13:59:34.096169 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092099 2570 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 16 13:59:34.096169 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092102 2570 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 16 13:59:34.096169 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092105 2570 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 16 13:59:34.096169 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092108 2570 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 16 13:59:34.096169 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092111 2570 feature_gate.go:328] unrecognized feature gate: Example
Apr 16 13:59:34.096169 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092114 2570 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 16 13:59:34.096169 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092117 2570 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 16 13:59:34.096169 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092120 2570 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 16 13:59:34.096169 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092123 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 16 13:59:34.096169 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092126 2570 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 16 13:59:34.096169 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092129 2570 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 16 13:59:34.096169 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092132 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 16 13:59:34.096169 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092135 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 16 13:59:34.096169 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092138 2570 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 16 13:59:34.096169 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092140 2570 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 16 13:59:34.096169 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092143 2570 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 16 13:59:34.096169 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092145 2570 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 16 13:59:34.096169 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092148 2570 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 16 13:59:34.096702 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092151 2570 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 16 13:59:34.096702 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092153 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 16 13:59:34.096702 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092156 2570 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 16 13:59:34.096702 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092159 2570 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 16 13:59:34.096702 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092161 2570 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 16 13:59:34.096702 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092164 2570 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 16 13:59:34.096702 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092167 2570 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 16 13:59:34.096702 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092169 2570 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 16 13:59:34.096702 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092172 2570 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 16 13:59:34.096702 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092175 2570 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 16 13:59:34.096702 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092177 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 16 13:59:34.096702 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092180 2570 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 16 13:59:34.096702 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092182 2570 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 16 13:59:34.096702 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092186 2570 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 16 13:59:34.096702 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092190 2570 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 16 13:59:34.096702 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092192 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 16 13:59:34.096702 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092195 2570 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 16 13:59:34.096702 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092198 2570 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 16 13:59:34.096702 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092200 2570 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 16 13:59:34.096702 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092203 2570 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 16 13:59:34.097195 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092206 2570 feature_gate.go:328] unrecognized feature gate: Example2
Apr 16 13:59:34.097195 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092209 2570 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 16 13:59:34.097195 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092211 2570 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 16 13:59:34.097195 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092214 2570 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 16 13:59:34.097195 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092217 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 16 13:59:34.097195 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092219 2570 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 16 13:59:34.097195 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092222 2570 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 16 13:59:34.097195 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092225 2570 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 16 13:59:34.097195 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092227 2570 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 16 13:59:34.097195 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092230 2570 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 16 13:59:34.097195 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092232 2570 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 16 13:59:34.097195 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092235 2570 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 16 13:59:34.097195 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:34.092237 2570 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 16 13:59:34.097195 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.092242 2570 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 16 13:59:34.097195 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.092912 2570 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 16 13:59:34.097195 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.095419 2570 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 16 13:59:34.097613 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.096297 2570 server.go:1019] "Starting client certificate rotation"
Apr 16 13:59:34.097613 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.096396 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:59:34.097613 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.096440 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 16 13:59:34.117423 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.117397 2570 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:59:34.120584 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.120467 2570 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 16 13:59:34.131915 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.131889 2570 log.go:25] "Validated CRI v1 runtime API"
Apr 16 13:59:34.137291 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.137272 2570 log.go:25] "Validated CRI v1 image API"
Apr 16 13:59:34.138613 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.138589 2570 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 16 13:59:34.142344 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.142322 2570 fs.go:135] Filesystem UUIDs: map[57691e02-e928-41f8-bc85-479817f97a3b:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 f04e133f-ac7c-442f-b175-f13a373cc97e:/dev/nvme0n1p3]
Apr 16 13:59:34.142410 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.142344 2570 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 16 13:59:34.148370 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.148212 2570 manager.go:217] Machine: {Timestamp:2026-04-16 13:59:34.146307613 +0000 UTC m=+0.351950768 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3088291 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2b38bf20d867ab2ee5c7a91dee99f6 SystemUUID:ec2b38bf-20d8-67ab-2ee5-c7a91dee99f6 BootID:cbbf6c3e-00d2-433f-94dd-cde929321983 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:89:e8:2a:c6:d7 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:89:e8:2a:c6:d7 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:32:17:92:58:98:4d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 16 13:59:34.148370 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.148356 2570 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 16 13:59:34.148515 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.148451 2570 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 16 13:59:34.149618 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.149589 2570 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 16 13:59:34.149788 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.149621 2570 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-133.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 16 13:59:34.149861 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.149797 2570 topology_manager.go:138] "Creating topology manager with none policy"
Apr 16 13:59:34.149861 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.149806 2570 container_manager_linux.go:306] "Creating device plugin manager"
Apr 16 13:59:34.149861 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.149828
2570 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:59:34.150506 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.150494 2570 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:59:34.152344 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.152324 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 13:59:34.152422 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.152407 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:59:34.152805 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.152794 2570 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 13:59:34.154916 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.154905 2570 kubelet.go:491] "Attempting to sync node with API server" Apr 16 13:59:34.154984 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.154927 2570 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 16 13:59:34.154984 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.154941 2570 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 13:59:34.154984 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.154951 2570 kubelet.go:397] "Adding apiserver pod source" Apr 16 13:59:34.154984 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.154962 2570 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 13:59:34.155949 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.155933 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:59:34.155949 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.155952 2570 state_mem.go:40] "Initialized new in-memory state store for pod resource information 
tracking" Apr 16 13:59:34.158824 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.158799 2570 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 13:59:34.163211 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.163182 2570 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 13:59:34.165382 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.165360 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 13:59:34.165466 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.165390 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 13:59:34.165466 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.165397 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 13:59:34.165466 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.165403 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 13:59:34.165466 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.165409 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 13:59:34.165466 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.165415 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 13:59:34.165466 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.165420 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 13:59:34.165466 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.165426 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 13:59:34.165466 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.165433 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 13:59:34.165466 ip-10-0-133-133 
kubenswrapper[2570]: I0416 13:59:34.165440 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 13:59:34.165466 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.165449 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 13:59:34.165466 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.165457 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 13:59:34.165767 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.165484 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 13:59:34.165767 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.165492 2570 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 16 13:59:34.167146 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:34.167120 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-133.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 13:59:34.167224 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:34.167162 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 13:59:34.168345 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.168329 2570 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-133.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 13:59:34.169628 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.169614 2570 watchdog_linux.go:99] 
"Systemd watchdog is not enabled" Apr 16 13:59:34.169704 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.169657 2570 server.go:1295] "Started kubelet" Apr 16 13:59:34.169775 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.169748 2570 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 13:59:34.169851 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.169806 2570 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 13:59:34.169906 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.169878 2570 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 13:59:34.170616 ip-10-0-133-133 systemd[1]: Started Kubernetes Kubelet. Apr 16 13:59:34.171208 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.171107 2570 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 13:59:34.172264 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.172250 2570 server.go:317] "Adding debug handlers to kubelet server" Apr 16 13:59:34.174855 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.174837 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5n92m" Apr 16 13:59:34.177754 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.177733 2570 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 13:59:34.178085 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.178065 2570 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 13:59:34.178580 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:34.177625 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-133.ec2.internal.18a6db122ddd87fa default 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-133.ec2.internal,UID:ip-10-0-133-133.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-133.ec2.internal,},FirstTimestamp:2026-04-16 13:59:34.169626618 +0000 UTC m=+0.375269773,LastTimestamp:2026-04-16 13:59:34.169626618 +0000 UTC m=+0.375269773,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-133.ec2.internal,}" Apr 16 13:59:34.178849 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.178825 2570 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 13:59:34.178970 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.178854 2570 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 13:59:34.178970 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:34.178955 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-133.ec2.internal\" not found" Apr 16 13:59:34.179303 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.179225 2570 reconstruct.go:97] "Volume reconstruction finished" Apr 16 13:59:34.179303 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.179235 2570 reconciler.go:26] "Reconciler: start to sync state" Apr 16 13:59:34.179393 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.178990 2570 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 13:59:34.180742 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.180710 2570 factory.go:55] Registering systemd factory Apr 16 13:59:34.180742 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.180730 2570 factory.go:223] Registration of the systemd container factory successfully Apr 16 13:59:34.181076 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.181056 2570 factory.go:153] 
Registering CRI-O factory Apr 16 13:59:34.181175 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.181082 2570 factory.go:223] Registration of the crio container factory successfully Apr 16 13:59:34.181175 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:34.181061 2570 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 16 13:59:34.181175 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.181142 2570 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 16 13:59:34.181318 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.181188 2570 factory.go:103] Registering Raw factory Apr 16 13:59:34.181318 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.181206 2570 manager.go:1196] Started watching for new ooms in manager Apr 16 13:59:34.181555 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.181526 2570 manager.go:319] Starting recovery of all containers Apr 16 13:59:34.182897 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:34.182868 2570 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 16 13:59:34.183020 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:34.182910 2570 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-133.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 16 13:59:34.183350 
ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.183330 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-5n92m" Apr 16 13:59:34.190726 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.190498 2570 manager.go:324] Recovery completed Apr 16 13:59:34.191293 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.190483 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 16 13:59:34.196367 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.196349 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 16 13:59:34.199072 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.199056 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-133.ec2.internal" event="NodeHasSufficientMemory" Apr 16 13:59:34.199133 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.199086 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-133.ec2.internal" event="NodeHasNoDiskPressure" Apr 16 13:59:34.199133 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.199097 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-133.ec2.internal" event="NodeHasSufficientPID" Apr 16 13:59:34.199642 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.199630 2570 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 16 13:59:34.199642 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.199642 2570 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 16 13:59:34.199715 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.199657 2570 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:59:34.201021 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:34.200948 2570 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{ip-10-0-133-133.ec2.internal.18a6db122f9ed711 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-133.ec2.internal,UID:ip-10-0-133-133.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-133-133.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-133-133.ec2.internal,},FirstTimestamp:2026-04-16 13:59:34.199072529 +0000 UTC m=+0.404715684,LastTimestamp:2026-04-16 13:59:34.199072529 +0000 UTC m=+0.404715684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-133.ec2.internal,}" Apr 16 13:59:34.203096 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.203083 2570 policy_none.go:49] "None policy: Start" Apr 16 13:59:34.203096 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.203099 2570 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 16 13:59:34.203187 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.203109 2570 state_mem.go:35] "Initializing new in-memory state store" Apr 16 13:59:34.240408 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.240369 2570 manager.go:341] "Starting Device Plugin manager" Apr 16 13:59:34.266846 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:34.240430 2570 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 16 13:59:34.266846 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.240444 2570 server.go:85] "Starting device plugin registration server" Apr 16 13:59:34.266846 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.240746 2570 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 16 13:59:34.266846 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.240759 2570 container_log_manager.go:189] 
"Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 16 13:59:34.266846 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.240858 2570 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 16 13:59:34.266846 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.240957 2570 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 16 13:59:34.266846 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.240967 2570 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 16 13:59:34.266846 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:34.241558 2570 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 16 13:59:34.266846 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:34.241596 2570 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-133.ec2.internal\" not found" Apr 16 13:59:34.304086 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.304052 2570 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 16 13:59:34.304086 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.304087 2570 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 16 13:59:34.304285 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.304106 2570 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 16 13:59:34.304285 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.304113 2570 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 13:59:34.304285 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:34.304147 2570 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 13:59:34.308699 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.308677 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:34.340954 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.340882 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:34.342180 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.342164 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-133.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:34.342257 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.342197 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-133.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:34.342257 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.342214 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-133.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:34.342257 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.342239 2570 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-133.ec2.internal"
Apr 16 13:59:34.349082 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.349066 2570 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-133.ec2.internal"
Apr 16 13:59:34.349133 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:34.349091 2570 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-133.ec2.internal\": node \"ip-10-0-133-133.ec2.internal\" not found"
Apr 16 13:59:34.378236 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:34.378200 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-133.ec2.internal\" not found"
Apr 16 13:59:34.404517 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.404482 2570 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-133.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-133.ec2.internal"]
Apr 16 13:59:34.404668 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.404585 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:34.406097 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.406081 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-133.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:34.406197 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.406110 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-133.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:34.406197 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.406125 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-133.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:34.408479 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.408465 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:34.408648 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.408634 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-133.ec2.internal"
Apr 16 13:59:34.408704 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.408663 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:34.409272 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.409251 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-133.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:34.409272 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.409269 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-133.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:34.409395 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.409280 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-133.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:34.409395 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.409290 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-133.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:34.409395 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.409292 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-133.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:34.409395 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.409305 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-133.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:34.412082 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.412066 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-133.ec2.internal"
Apr 16 13:59:34.412142 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.412094 2570 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:34.412974 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.412960 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-133.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:34.413036 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.412986 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-133.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:34.413036 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.412997 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-133.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:34.436338 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:34.436313 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-133.ec2.internal\" not found" node="ip-10-0-133-133.ec2.internal"
Apr 16 13:59:34.440658 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:34.440639 2570 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-133.ec2.internal\" not found" node="ip-10-0-133-133.ec2.internal"
Apr 16 13:59:34.478708 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:34.478667 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-133.ec2.internal\" not found"
Apr 16 13:59:34.480983 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.480961 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/39d671c484f3397b8f32690faf7d7a58-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-133.ec2.internal\" (UID: \"39d671c484f3397b8f32690faf7d7a58\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-133.ec2.internal"
Apr 16 13:59:34.481084 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.480994 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/39d671c484f3397b8f32690faf7d7a58-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-133.ec2.internal\" (UID: \"39d671c484f3397b8f32690faf7d7a58\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-133.ec2.internal"
Apr 16 13:59:34.481084 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.481013 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/02f569f2777966beb695e394b803ecc2-config\") pod \"kube-apiserver-proxy-ip-10-0-133-133.ec2.internal\" (UID: \"02f569f2777966beb695e394b803ecc2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-133.ec2.internal"
Apr 16 13:59:34.579171 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:34.579136 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-133.ec2.internal\" not found"
Apr 16 13:59:34.581366 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.581346 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/02f569f2777966beb695e394b803ecc2-config\") pod \"kube-apiserver-proxy-ip-10-0-133-133.ec2.internal\" (UID: \"02f569f2777966beb695e394b803ecc2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-133.ec2.internal"
Apr 16 13:59:34.581425 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.581375 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/39d671c484f3397b8f32690faf7d7a58-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-133.ec2.internal\" (UID: \"39d671c484f3397b8f32690faf7d7a58\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-133.ec2.internal"
Apr 16 13:59:34.581425 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.581392 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/39d671c484f3397b8f32690faf7d7a58-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-133.ec2.internal\" (UID: \"39d671c484f3397b8f32690faf7d7a58\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-133.ec2.internal"
Apr 16 13:59:34.581487 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.581424 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/39d671c484f3397b8f32690faf7d7a58-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-133.ec2.internal\" (UID: \"39d671c484f3397b8f32690faf7d7a58\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-133.ec2.internal"
Apr 16 13:59:34.581487 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.581433 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/02f569f2777966beb695e394b803ecc2-config\") pod \"kube-apiserver-proxy-ip-10-0-133-133.ec2.internal\" (UID: \"02f569f2777966beb695e394b803ecc2\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-133.ec2.internal"
Apr 16 13:59:34.581487 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.581461 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/39d671c484f3397b8f32690faf7d7a58-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-133.ec2.internal\" (UID: \"39d671c484f3397b8f32690faf7d7a58\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-133.ec2.internal"
Apr 16 13:59:34.680281 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:34.680206 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-133.ec2.internal\" not found"
Apr 16 13:59:34.740601 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.740567 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-133.ec2.internal"
Apr 16 13:59:34.743826 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:34.743806 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-133.ec2.internal"
Apr 16 13:59:34.780421 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:34.780384 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-133.ec2.internal\" not found"
Apr 16 13:59:34.881002 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:34.880969 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-133.ec2.internal\" not found"
Apr 16 13:59:34.981636 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:34.981562 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-133.ec2.internal\" not found"
Apr 16 13:59:35.004824 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:35.004805 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:35.082633 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:35.082601 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-133.ec2.internal\" not found"
Apr 16 13:59:35.095941 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:35.095918 2570 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 16 13:59:35.096125 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:35.096101 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:59:35.096167 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:35.096101 2570 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 16 13:59:35.178395 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:35.178364 2570 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 16 13:59:35.183622 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:35.183598 2570 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-133.ec2.internal\" not found"
Apr 16 13:59:35.185800 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:35.185761 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 13:54:34 +0000 UTC" deadline="2028-02-01 01:35:21.568122984 +0000 UTC"
Apr 16 13:59:35.185860 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:35.185801 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="15731h35m46.382325552s"
Apr 16 13:59:35.202061 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:35.202028 2570 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 16 13:59:35.218254
ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:35.218229 2570 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:35.254955 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:35.254880 2570 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-r9shd" Apr 16 13:59:35.263726 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:35.263698 2570 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-r9shd" Apr 16 13:59:35.279683 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:35.279650 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-133.ec2.internal" Apr 16 13:59:35.290927 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:35.290901 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 13:59:35.293316 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:35.293300 2570 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-133.ec2.internal" Apr 16 13:59:35.296790 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:35.296771 2570 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:35.302179 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:35.302151 2570 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 13:59:35.487188 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:35.487152 2570 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod02f569f2777966beb695e394b803ecc2.slice/crio-45acb78ed5d4aa99836ea3f9034dd39966e5640333a4ce9e922d2ef641c57525 WatchSource:0}: Error finding container 45acb78ed5d4aa99836ea3f9034dd39966e5640333a4ce9e922d2ef641c57525: Status 404 returned error can't find the container with id 45acb78ed5d4aa99836ea3f9034dd39966e5640333a4ce9e922d2ef641c57525 Apr 16 13:59:35.489056 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:35.489027 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39d671c484f3397b8f32690faf7d7a58.slice/crio-2b2c8cffa5d578cd5e4a5708c58fb1a27237af120cfa5a9dd9b0cbcff83bc6ad WatchSource:0}: Error finding container 2b2c8cffa5d578cd5e4a5708c58fb1a27237af120cfa5a9dd9b0cbcff83bc6ad: Status 404 returned error can't find the container with id 2b2c8cffa5d578cd5e4a5708c58fb1a27237af120cfa5a9dd9b0cbcff83bc6ad Apr 16 13:59:35.493106 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:35.493091 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 13:59:36.155753 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.155718 2570 apiserver.go:52] "Watching apiserver" Apr 16 13:59:36.162613 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.162586 2570 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 13:59:36.162947 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.162923 2570 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-dns/node-resolver-kqv5v","openshift-image-registry/node-ca-p9j4h","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-133.ec2.internal","openshift-multus/multus-vp4gn","openshift-multus/network-metrics-daemon-29pd4","openshift-ovn-kubernetes/ovnkube-node-p2d74","kube-system/konnectivity-agent-wkbwm","kube-system/kube-apiserver-proxy-ip-10-0-133-133.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8","openshift-cluster-node-tuning-operator/tuned-brkpj","openshift-multus/multus-additional-cni-plugins-bg5bb","openshift-network-diagnostics/network-check-target-88c5t","openshift-network-operator/iptables-alerter-rtqtn"] Apr 16 13:59:36.169262 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.169239 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wkbwm" Apr 16 13:59:36.171332 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.171303 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-p9j4h" Apr 16 13:59:36.171748 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.171730 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 13:59:36.171871 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.171815 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-hbb7x\"" Apr 16 13:59:36.172268 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.172254 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 13:59:36.173000 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.172982 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 13:59:36.173226 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.173199 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 13:59:36.173397 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.173380 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 13:59:36.173521 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.173505 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.173812 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.173639 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 13:59:36.173812 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.173669 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-d49wl\"" Apr 16 13:59:36.173812 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:36.173767 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-29pd4" podUID="e6284f77-08e3-4846-904d-6a21f10707ae" Apr 16 13:59:36.175084 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.175069 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 13:59:36.175556 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.175487 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 13:59:36.175700 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.175683 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 13:59:36.175776 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.175731 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 13:59:36.175882 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.175868 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-7c6ww\"" Apr 16 13:59:36.178063 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.178045 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.178134 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.178073 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kqv5v" Apr 16 13:59:36.180095 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.180079 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 13:59:36.180197 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.180080 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 13:59:36.180197 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.180111 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 13:59:36.180333 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.180309 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8" Apr 16 13:59:36.180441 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.180395 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 13:59:36.180491 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.180482 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 13:59:36.180885 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.180865 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 13:59:36.180986 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.180917 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 13:59:36.180986 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.180917 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-6q6qc\"" Apr 16 13:59:36.180986 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.180976 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 13:59:36.181106 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.181014 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-g9hw5\"" Apr 16 13:59:36.182477 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.182440 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.182610 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.182498 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 13:59:36.182679 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.182612 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 13:59:36.182942 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.182924 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 13:59:36.182942 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.182938 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-jt6pm\"" Apr 16 13:59:36.184287 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.184272 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:36.184401 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.184385 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-27xcp\"" Apr 16 13:59:36.184451 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.184421 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:36.184931 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.184917 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bg5bb" Apr 16 13:59:36.186668 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.186648 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 13:59:36.186742 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.186674 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-g24vs\"" Apr 16 13:59:36.186830 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.186812 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 13:59:36.188108 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.187367 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 13:59:36.188108 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:36.187476 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-88c5t" podUID="5a916223-1676-42c3-a13e-815b7355eb26" Apr 16 13:59:36.190083 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.190057 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-host-var-lib-cni-multus\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.190189 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.190139 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-multus-socket-dir-parent\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.190255 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.190196 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3d42db8b-b5be-43e6-bad1-6040da8c586f-sys-fs\") pod \"aws-ebs-csi-driver-node-mh9s8\" (UID: \"3d42db8b-b5be-43e6-bad1-6040da8c586f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8" Apr 16 13:59:36.190255 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.190233 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/346c3280-2b45-4be3-8629-46903ecfe4b8-ovnkube-script-lib\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.190353 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.190277 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-n9npd\" (UniqueName: \"kubernetes.io/projected/e6284f77-08e3-4846-904d-6a21f10707ae-kube-api-access-n9npd\") pod \"network-metrics-daemon-29pd4\" (UID: \"e6284f77-08e3-4846-904d-6a21f10707ae\") " pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 13:59:36.190353 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.190311 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3d42db8b-b5be-43e6-bad1-6040da8c586f-device-dir\") pod \"aws-ebs-csi-driver-node-mh9s8\" (UID: \"3d42db8b-b5be-43e6-bad1-6040da8c586f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8" Apr 16 13:59:36.190448 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.190352 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/29d8d95e-1f57-49fb-9896-340b389f0eea-serviceca\") pod \"node-ca-p9j4h\" (UID: \"29d8d95e-1f57-49fb-9896-340b389f0eea\") " pod="openshift-image-registry/node-ca-p9j4h" Apr 16 13:59:36.190448 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.190393 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-log-socket\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.190448 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.190426 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-run-systemd\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.190623 ip-10-0-133-133 kubenswrapper[2570]: 
I0416 13:59:36.190461 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-run-ovn\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.190623 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.190494 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-host-cni-netd\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.190623 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.190520 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-host-var-lib-cni-bin\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.190623 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.190572 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-host-run-multus-certs\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.190623 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.190613 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m792q\" (UniqueName: \"kubernetes.io/projected/258d5bb3-0083-4fff-96dd-2c9e007b3c05-kube-api-access-m792q\") pod \"node-resolver-kqv5v\" (UID: \"258d5bb3-0083-4fff-96dd-2c9e007b3c05\") " 
pod="openshift-dns/node-resolver-kqv5v" Apr 16 13:59:36.190841 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.190645 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-host-run-netns\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.190841 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.190678 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-host-run-ovn-kubernetes\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.190841 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.190710 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-system-cni-dir\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.190841 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.190770 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-host-slash\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.190841 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.190828 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/3d42db8b-b5be-43e6-bad1-6040da8c586f-socket-dir\") pod \"aws-ebs-csi-driver-node-mh9s8\" (UID: \"3d42db8b-b5be-43e6-bad1-6040da8c586f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8" Apr 16 13:59:36.191045 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.190879 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-os-release\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.191045 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.190911 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/346c3280-2b45-4be3-8629-46903ecfe4b8-env-overrides\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.191045 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.190924 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-rtqtn" Apr 16 13:59:36.191045 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.190982 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-host-run-netns\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.191045 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.191021 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1edb190f-96e8-4548-8b55-97073b01a7ed-multus-daemon-config\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.191245 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.191047 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d42db8b-b5be-43e6-bad1-6040da8c586f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mh9s8\" (UID: \"3d42db8b-b5be-43e6-bad1-6040da8c586f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8" Apr 16 13:59:36.191245 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.191095 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-445th\" (UniqueName: \"kubernetes.io/projected/29d8d95e-1f57-49fb-9896-340b389f0eea-kube-api-access-445th\") pod \"node-ca-p9j4h\" (UID: \"29d8d95e-1f57-49fb-9896-340b389f0eea\") " pod="openshift-image-registry/node-ca-p9j4h" Apr 16 13:59:36.191245 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.191135 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-var-lib-openvswitch\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.191245 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.191233 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkmc5\" (UniqueName: \"kubernetes.io/projected/346c3280-2b45-4be3-8629-46903ecfe4b8-kube-api-access-qkmc5\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.191429 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.191278 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-host-kubelet\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.191477 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.191464 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-node-log\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.191570 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.191549 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-cnibin\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.191868 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.191849 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-host-run-k8s-cni-cncf-io\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.191943 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.191880 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-hostroot\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.191943 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.191902 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29d8d95e-1f57-49fb-9896-340b389f0eea-host\") pod \"node-ca-p9j4h\" (UID: \"29d8d95e-1f57-49fb-9896-340b389f0eea\") " pod="openshift-image-registry/node-ca-p9j4h" Apr 16 13:59:36.192043 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.191941 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-slnwb\" (UniqueName: \"kubernetes.io/projected/3d42db8b-b5be-43e6-bad1-6040da8c586f-kube-api-access-slnwb\") pod \"aws-ebs-csi-driver-node-mh9s8\" (UID: \"3d42db8b-b5be-43e6-bad1-6040da8c586f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8" Apr 16 13:59:36.192043 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.191964 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/529a37ce-5549-43e3-bcab-ff0f9a6e46d6-konnectivity-ca\") pod \"konnectivity-agent-wkbwm\" (UID: \"529a37ce-5549-43e3-bcab-ff0f9a6e46d6\") " pod="kube-system/konnectivity-agent-wkbwm" Apr 16 13:59:36.192043 
ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.191985 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-systemd-units\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.192043 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.192026 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-run-openvswitch\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.192238 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.192054 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-host-cni-bin\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.192238 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.192075 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.192238 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.192103 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/346c3280-2b45-4be3-8629-46903ecfe4b8-ovn-node-metrics-cert\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.192238 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.192142 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-host-var-lib-kubelet\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.192238 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.192172 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-etc-openvswitch\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.192238 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.192197 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgjsd\" (UniqueName: \"kubernetes.io/projected/1edb190f-96e8-4548-8b55-97073b01a7ed-kube-api-access-xgjsd\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.192238 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.192229 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs\") pod \"network-metrics-daemon-29pd4\" (UID: \"e6284f77-08e3-4846-904d-6a21f10707ae\") " pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 13:59:36.192485 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.192274 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-etc-kubernetes\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.192485 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.192325 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/258d5bb3-0083-4fff-96dd-2c9e007b3c05-tmp-dir\") pod \"node-resolver-kqv5v\" (UID: \"258d5bb3-0083-4fff-96dd-2c9e007b3c05\") " pod="openshift-dns/node-resolver-kqv5v" Apr 16 13:59:36.192485 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.192349 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/258d5bb3-0083-4fff-96dd-2c9e007b3c05-hosts-file\") pod \"node-resolver-kqv5v\" (UID: \"258d5bb3-0083-4fff-96dd-2c9e007b3c05\") " pod="openshift-dns/node-resolver-kqv5v" Apr 16 13:59:36.192485 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.192372 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1edb190f-96e8-4548-8b55-97073b01a7ed-cni-binary-copy\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.192485 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.192404 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/346c3280-2b45-4be3-8629-46903ecfe4b8-ovnkube-config\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.192485 ip-10-0-133-133 
kubenswrapper[2570]: I0416 13:59:36.192452 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-multus-conf-dir\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.192485 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.192482 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3d42db8b-b5be-43e6-bad1-6040da8c586f-registration-dir\") pod \"aws-ebs-csi-driver-node-mh9s8\" (UID: \"3d42db8b-b5be-43e6-bad1-6040da8c586f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8" Apr 16 13:59:36.192737 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.192516 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3d42db8b-b5be-43e6-bad1-6040da8c586f-etc-selinux\") pod \"aws-ebs-csi-driver-node-mh9s8\" (UID: \"3d42db8b-b5be-43e6-bad1-6040da8c586f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8" Apr 16 13:59:36.192737 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.192559 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/529a37ce-5549-43e3-bcab-ff0f9a6e46d6-agent-certs\") pod \"konnectivity-agent-wkbwm\" (UID: \"529a37ce-5549-43e3-bcab-ff0f9a6e46d6\") " pod="kube-system/konnectivity-agent-wkbwm" Apr 16 13:59:36.192737 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.192618 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-multus-cni-dir\") pod \"multus-vp4gn\" (UID: 
\"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.192828 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.192773 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:36.192944 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.192924 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 13:59:36.192998 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.192968 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-9knzn\"" Apr 16 13:59:36.193048 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.192998 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:36.266399 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.266357 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:35 +0000 UTC" deadline="2027-10-22 15:52:26.114451138 +0000 UTC" Apr 16 13:59:36.266399 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.266392 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13297h52m49.848061177s" Apr 16 13:59:36.280730 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.280706 2570 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 13:59:36.293465 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293439 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-host-kubelet\") pod \"ovnkube-node-p2d74\" (UID: 
\"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.293569 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293468 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-node-log\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.293569 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293493 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-cnibin\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.293569 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293553 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-node-log\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.293569 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293564 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-host-kubelet\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.293746 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293599 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-host-run-k8s-cni-cncf-io\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " 
pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.293746 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293608 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-cnibin\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.293746 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293618 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-hostroot\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.293746 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293633 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-host-run-k8s-cni-cncf-io\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.293746 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293634 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29d8d95e-1f57-49fb-9896-340b389f0eea-host\") pod \"node-ca-p9j4h\" (UID: \"29d8d95e-1f57-49fb-9896-340b389f0eea\") " pod="openshift-image-registry/node-ca-p9j4h" Apr 16 13:59:36.293746 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293653 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/29d8d95e-1f57-49fb-9896-340b389f0eea-host\") pod \"node-ca-p9j4h\" (UID: \"29d8d95e-1f57-49fb-9896-340b389f0eea\") " pod="openshift-image-registry/node-ca-p9j4h" Apr 16 13:59:36.293746 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293665 2570 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0acc482-de34-4188-ba44-20d609da46d0-system-cni-dir\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb" Apr 16 13:59:36.293746 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293685 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-lib-modules\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.293746 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293699 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-hostroot\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.293746 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293703 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7mgh\" (UniqueName: \"kubernetes.io/projected/d0acc482-de34-4188-ba44-20d609da46d0-kube-api-access-g7mgh\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb" Apr 16 13:59:36.294167 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293771 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ltp4b\" (UniqueName: \"kubernetes.io/projected/e5c96c36-5440-4e0b-b632-7b2269bd2b80-kube-api-access-ltp4b\") pod \"iptables-alerter-rtqtn\" (UID: \"e5c96c36-5440-4e0b-b632-7b2269bd2b80\") " 
pod="openshift-network-operator/iptables-alerter-rtqtn" Apr 16 13:59:36.294167 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293793 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-slnwb\" (UniqueName: \"kubernetes.io/projected/3d42db8b-b5be-43e6-bad1-6040da8c586f-kube-api-access-slnwb\") pod \"aws-ebs-csi-driver-node-mh9s8\" (UID: \"3d42db8b-b5be-43e6-bad1-6040da8c586f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8" Apr 16 13:59:36.294167 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293810 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/529a37ce-5549-43e3-bcab-ff0f9a6e46d6-konnectivity-ca\") pod \"konnectivity-agent-wkbwm\" (UID: \"529a37ce-5549-43e3-bcab-ff0f9a6e46d6\") " pod="kube-system/konnectivity-agent-wkbwm" Apr 16 13:59:36.294167 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293829 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-systemd-units\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.294167 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293853 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-run-openvswitch\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.294167 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293890 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-run-openvswitch\") pod 
\"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.294167 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293889 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-systemd-units\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.294167 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293922 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-host-cni-bin\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.294167 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293950 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.294167 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.293975 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/346c3280-2b45-4be3-8629-46903ecfe4b8-ovn-node-metrics-cert\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.294167 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294024 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-host-cni-bin\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.294167 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294029 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.294167 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294062 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-host-var-lib-kubelet\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.294167 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294086 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-etc-openvswitch\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.294167 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294107 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xgjsd\" (UniqueName: \"kubernetes.io/projected/1edb190f-96e8-4548-8b55-97073b01a7ed-kube-api-access-xgjsd\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.294167 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294133 2570 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs\") pod \"network-metrics-daemon-29pd4\" (UID: \"e6284f77-08e3-4846-904d-6a21f10707ae\") " pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 13:59:36.294167 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294155 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-host-var-lib-kubelet\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.294942 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294163 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-etc-modprobe-d\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.294942 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294159 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-etc-openvswitch\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.294942 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294189 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-sys\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.294942 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294217 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4df2426e-59b0-4dd7-ac07-90d478ff86c2-tmp\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.294942 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294242 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d0acc482-de34-4188-ba44-20d609da46d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb" Apr 16 13:59:36.294942 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:36.294247 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:36.294942 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294298 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d0acc482-de34-4188-ba44-20d609da46d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb" Apr 16 13:59:36.294942 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294307 2570 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 13:59:36.294942 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:36.294310 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs podName:e6284f77-08e3-4846-904d-6a21f10707ae nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:36.794286961 +0000 UTC m=+2.999930134 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs") pod "network-metrics-daemon-29pd4" (UID: "e6284f77-08e3-4846-904d-6a21f10707ae") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:36.294942 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294369 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-etc-kubernetes\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.294942 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294409 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/258d5bb3-0083-4fff-96dd-2c9e007b3c05-tmp-dir\") pod \"node-resolver-kqv5v\" (UID: \"258d5bb3-0083-4fff-96dd-2c9e007b3c05\") " pod="openshift-dns/node-resolver-kqv5v" Apr 16 13:59:36.294942 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294437 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-etc-kubernetes\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.294942 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294438 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-var-lib-kubelet\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.294942 
ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294473 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/258d5bb3-0083-4fff-96dd-2c9e007b3c05-hosts-file\") pod \"node-resolver-kqv5v\" (UID: \"258d5bb3-0083-4fff-96dd-2c9e007b3c05\") " pod="openshift-dns/node-resolver-kqv5v" Apr 16 13:59:36.294942 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294498 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-run\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.294942 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294525 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d0acc482-de34-4188-ba44-20d609da46d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb" Apr 16 13:59:36.294942 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294563 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/258d5bb3-0083-4fff-96dd-2c9e007b3c05-hosts-file\") pod \"node-resolver-kqv5v\" (UID: \"258d5bb3-0083-4fff-96dd-2c9e007b3c05\") " pod="openshift-dns/node-resolver-kqv5v" Apr 16 13:59:36.295780 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294572 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1edb190f-96e8-4548-8b55-97073b01a7ed-cni-binary-copy\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn" Apr 16 
13:59:36.295780 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294619 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/346c3280-2b45-4be3-8629-46903ecfe4b8-ovnkube-config\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.295780 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294647 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-multus-conf-dir\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn"
Apr 16 13:59:36.295780 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294673 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3d42db8b-b5be-43e6-bad1-6040da8c586f-registration-dir\") pod \"aws-ebs-csi-driver-node-mh9s8\" (UID: \"3d42db8b-b5be-43e6-bad1-6040da8c586f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8"
Apr 16 13:59:36.295780 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294690 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/258d5bb3-0083-4fff-96dd-2c9e007b3c05-tmp-dir\") pod \"node-resolver-kqv5v\" (UID: \"258d5bb3-0083-4fff-96dd-2c9e007b3c05\") " pod="openshift-dns/node-resolver-kqv5v"
Apr 16 13:59:36.295780 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294698 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3d42db8b-b5be-43e6-bad1-6040da8c586f-etc-selinux\") pod \"aws-ebs-csi-driver-node-mh9s8\" (UID: \"3d42db8b-b5be-43e6-bad1-6040da8c586f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8"
Apr 16 13:59:36.295780 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294751 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/529a37ce-5549-43e3-bcab-ff0f9a6e46d6-agent-certs\") pod \"konnectivity-agent-wkbwm\" (UID: \"529a37ce-5549-43e3-bcab-ff0f9a6e46d6\") " pod="kube-system/konnectivity-agent-wkbwm"
Apr 16 13:59:36.295780 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294764 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3d42db8b-b5be-43e6-bad1-6040da8c586f-registration-dir\") pod \"aws-ebs-csi-driver-node-mh9s8\" (UID: \"3d42db8b-b5be-43e6-bad1-6040da8c586f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8"
Apr 16 13:59:36.295780 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294781 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e5c96c36-5440-4e0b-b632-7b2269bd2b80-iptables-alerter-script\") pod \"iptables-alerter-rtqtn\" (UID: \"e5c96c36-5440-4e0b-b632-7b2269bd2b80\") " pod="openshift-network-operator/iptables-alerter-rtqtn"
Apr 16 13:59:36.295780 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294802 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-multus-conf-dir\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn"
Apr 16 13:59:36.295780 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294808 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-multus-cni-dir\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn"
Apr 16 13:59:36.295780 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294835 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-host-var-lib-cni-multus\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn"
Apr 16 13:59:36.295780 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294890 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d0acc482-de34-4188-ba44-20d609da46d0-cnibin\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb"
Apr 16 13:59:36.295780 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294922 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3d42db8b-b5be-43e6-bad1-6040da8c586f-etc-selinux\") pod \"aws-ebs-csi-driver-node-mh9s8\" (UID: \"3d42db8b-b5be-43e6-bad1-6040da8c586f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8"
Apr 16 13:59:36.295780 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294923 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-multus-socket-dir-parent\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn"
Apr 16 13:59:36.295780 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294963 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3d42db8b-b5be-43e6-bad1-6040da8c586f-sys-fs\") pod \"aws-ebs-csi-driver-node-mh9s8\" (UID: \"3d42db8b-b5be-43e6-bad1-6040da8c586f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8"
Apr 16 13:59:36.295780 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294970 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/529a37ce-5549-43e3-bcab-ff0f9a6e46d6-konnectivity-ca\") pod \"konnectivity-agent-wkbwm\" (UID: \"529a37ce-5549-43e3-bcab-ff0f9a6e46d6\") " pod="kube-system/konnectivity-agent-wkbwm"
Apr 16 13:59:36.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.294984 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-etc-sysctl-conf\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj"
Apr 16 13:59:36.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295008 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-multus-socket-dir-parent\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn"
Apr 16 13:59:36.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295015 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n62vn\" (UniqueName: \"kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn\") pod \"network-check-target-88c5t\" (UID: \"5a916223-1676-42c3-a13e-815b7355eb26\") " pod="openshift-network-diagnostics/network-check-target-88c5t"
Apr 16 13:59:36.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295033 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/346c3280-2b45-4be3-8629-46903ecfe4b8-ovnkube-script-lib\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295049 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n9npd\" (UniqueName: \"kubernetes.io/projected/e6284f77-08e3-4846-904d-6a21f10707ae-kube-api-access-n9npd\") pod \"network-metrics-daemon-29pd4\" (UID: \"e6284f77-08e3-4846-904d-6a21f10707ae\") " pod="openshift-multus/network-metrics-daemon-29pd4"
Apr 16 13:59:36.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295066 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3d42db8b-b5be-43e6-bad1-6040da8c586f-device-dir\") pod \"aws-ebs-csi-driver-node-mh9s8\" (UID: \"3d42db8b-b5be-43e6-bad1-6040da8c586f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8"
Apr 16 13:59:36.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295071 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-host-var-lib-cni-multus\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn"
Apr 16 13:59:36.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295074 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-multus-cni-dir\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn"
Apr 16 13:59:36.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295090 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/29d8d95e-1f57-49fb-9896-340b389f0eea-serviceca\") pod \"node-ca-p9j4h\" (UID: \"29d8d95e-1f57-49fb-9896-340b389f0eea\") " pod="openshift-image-registry/node-ca-p9j4h"
Apr 16 13:59:36.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295137 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3d42db8b-b5be-43e6-bad1-6040da8c586f-sys-fs\") pod \"aws-ebs-csi-driver-node-mh9s8\" (UID: \"3d42db8b-b5be-43e6-bad1-6040da8c586f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8"
Apr 16 13:59:36.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295140 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-log-socket\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295169 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-run-systemd\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295216 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-log-socket\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295232 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/346c3280-2b45-4be3-8629-46903ecfe4b8-ovnkube-config\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295275 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-run-ovn\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295335 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3d42db8b-b5be-43e6-bad1-6040da8c586f-device-dir\") pod \"aws-ebs-csi-driver-node-mh9s8\" (UID: \"3d42db8b-b5be-43e6-bad1-6040da8c586f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8"
Apr 16 13:59:36.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295413 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-host-cni-netd\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.297044 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295445 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-host-var-lib-cni-bin\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn"
Apr 16 13:59:36.297044 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295419 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-run-ovn\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.297044 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295490 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-host-var-lib-cni-bin\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn"
Apr 16 13:59:36.297044 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295492 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-run-systemd\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.297044 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295447 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-host-cni-netd\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.297044 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295497 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-host-run-multus-certs\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn"
Apr 16 13:59:36.297044 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295556 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-host-run-multus-certs\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn"
Apr 16 13:59:36.297044 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295580 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m792q\" (UniqueName: \"kubernetes.io/projected/258d5bb3-0083-4fff-96dd-2c9e007b3c05-kube-api-access-m792q\") pod \"node-resolver-kqv5v\" (UID: \"258d5bb3-0083-4fff-96dd-2c9e007b3c05\") " pod="openshift-dns/node-resolver-kqv5v"
Apr 16 13:59:36.297044 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295609 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-etc-sysconfig\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj"
Apr 16 13:59:36.297044 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295632 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-etc-kubernetes\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj"
Apr 16 13:59:36.297044 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295655 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-etc-sysctl-d\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj"
Apr 16 13:59:36.297044 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295682 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-etc-systemd\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj"
Apr 16 13:59:36.297044 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295706 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/346c3280-2b45-4be3-8629-46903ecfe4b8-ovnkube-script-lib\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.297044 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295729 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d0acc482-de34-4188-ba44-20d609da46d0-os-release\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb"
Apr 16 13:59:36.297044 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295765 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d0acc482-de34-4188-ba44-20d609da46d0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb"
Apr 16 13:59:36.297044 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295798 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-host-run-netns\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.297044 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295827 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-host-run-ovn-kubernetes\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.297631 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295853 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/29d8d95e-1f57-49fb-9896-340b389f0eea-serviceca\") pod \"node-ca-p9j4h\" (UID: \"29d8d95e-1f57-49fb-9896-340b389f0eea\") " pod="openshift-image-registry/node-ca-p9j4h"
Apr 16 13:59:36.297631 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295860 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-system-cni-dir\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn"
Apr 16 13:59:36.297631 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295882 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-host-run-netns\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.297631 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295889 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkrcm\" (UniqueName: \"kubernetes.io/projected/4df2426e-59b0-4dd7-ac07-90d478ff86c2-kube-api-access-lkrcm\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj"
Apr 16 13:59:36.297631 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295905 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-host-run-ovn-kubernetes\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.297631 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295920 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-host-slash\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.297631 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295938 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-system-cni-dir\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn"
Apr 16 13:59:36.297631 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295948 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3d42db8b-b5be-43e6-bad1-6040da8c586f-socket-dir\") pod \"aws-ebs-csi-driver-node-mh9s8\" (UID: \"3d42db8b-b5be-43e6-bad1-6040da8c586f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8"
Apr 16 13:59:36.297631 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295978 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-os-release\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn"
Apr 16 13:59:36.297631 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.296000 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1edb190f-96e8-4548-8b55-97073b01a7ed-cni-binary-copy\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn"
Apr 16 13:59:36.297631 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.296027 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/346c3280-2b45-4be3-8629-46903ecfe4b8-env-overrides\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.297631 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.296051 2570 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:36.297631 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.296058 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-os-release\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn"
Apr 16 13:59:36.297631 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.295997 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-host-slash\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.297631 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.296054 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-host-run-netns\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn"
Apr 16 13:59:36.297631 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.296100 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1edb190f-96e8-4548-8b55-97073b01a7ed-host-run-netns\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn"
Apr 16 13:59:36.297631 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.296128 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3d42db8b-b5be-43e6-bad1-6040da8c586f-socket-dir\") pod \"aws-ebs-csi-driver-node-mh9s8\" (UID: \"3d42db8b-b5be-43e6-bad1-6040da8c586f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8"
Apr 16 13:59:36.297631 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.296138 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1edb190f-96e8-4548-8b55-97073b01a7ed-multus-daemon-config\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn"
Apr 16 13:59:36.298184 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.296197 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d42db8b-b5be-43e6-bad1-6040da8c586f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mh9s8\" (UID: \"3d42db8b-b5be-43e6-bad1-6040da8c586f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8"
Apr 16 13:59:36.298184 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.296240 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-445th\" (UniqueName: \"kubernetes.io/projected/29d8d95e-1f57-49fb-9896-340b389f0eea-kube-api-access-445th\") pod \"node-ca-p9j4h\" (UID: \"29d8d95e-1f57-49fb-9896-340b389f0eea\") " pod="openshift-image-registry/node-ca-p9j4h"
Apr 16 13:59:36.298184 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.296260 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3d42db8b-b5be-43e6-bad1-6040da8c586f-kubelet-dir\") pod \"aws-ebs-csi-driver-node-mh9s8\" (UID: \"3d42db8b-b5be-43e6-bad1-6040da8c586f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8"
Apr 16 13:59:36.298184 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.296272 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4df2426e-59b0-4dd7-ac07-90d478ff86c2-etc-tuned\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj"
Apr 16 13:59:36.298184 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.296303 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-var-lib-openvswitch\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.298184 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.296341 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkmc5\" (UniqueName: \"kubernetes.io/projected/346c3280-2b45-4be3-8629-46903ecfe4b8-kube-api-access-qkmc5\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.298184 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.296357 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/346c3280-2b45-4be3-8629-46903ecfe4b8-var-lib-openvswitch\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.298184 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.296387 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/346c3280-2b45-4be3-8629-46903ecfe4b8-env-overrides\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.298184 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.296372 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-host\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj"
Apr 16 13:59:36.298184 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.296440 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5c96c36-5440-4e0b-b632-7b2269bd2b80-host-slash\") pod \"iptables-alerter-rtqtn\" (UID: \"e5c96c36-5440-4e0b-b632-7b2269bd2b80\") " pod="openshift-network-operator/iptables-alerter-rtqtn"
Apr 16 13:59:36.298184 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.297137 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1edb190f-96e8-4548-8b55-97073b01a7ed-multus-daemon-config\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn"
Apr 16 13:59:36.298184 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.297953 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/529a37ce-5549-43e3-bcab-ff0f9a6e46d6-agent-certs\") pod \"konnectivity-agent-wkbwm\" (UID: \"529a37ce-5549-43e3-bcab-ff0f9a6e46d6\") " pod="kube-system/konnectivity-agent-wkbwm"
Apr 16 13:59:36.298565 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.298237 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/346c3280-2b45-4be3-8629-46903ecfe4b8-ovn-node-metrics-cert\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.302444 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.302419 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xgjsd\" (UniqueName: \"kubernetes.io/projected/1edb190f-96e8-4548-8b55-97073b01a7ed-kube-api-access-xgjsd\") pod \"multus-vp4gn\" (UID: \"1edb190f-96e8-4548-8b55-97073b01a7ed\") " pod="openshift-multus/multus-vp4gn"
Apr 16 13:59:36.302927 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.302903 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-slnwb\" (UniqueName: \"kubernetes.io/projected/3d42db8b-b5be-43e6-bad1-6040da8c586f-kube-api-access-slnwb\") pod \"aws-ebs-csi-driver-node-mh9s8\" (UID: \"3d42db8b-b5be-43e6-bad1-6040da8c586f\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8"
Apr 16 13:59:36.303140 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.303124 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m792q\" (UniqueName: \"kubernetes.io/projected/258d5bb3-0083-4fff-96dd-2c9e007b3c05-kube-api-access-m792q\") pod \"node-resolver-kqv5v\" (UID: \"258d5bb3-0083-4fff-96dd-2c9e007b3c05\") " pod="openshift-dns/node-resolver-kqv5v"
Apr 16 13:59:36.303744 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.303721 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-445th\" (UniqueName: \"kubernetes.io/projected/29d8d95e-1f57-49fb-9896-340b389f0eea-kube-api-access-445th\") pod \"node-ca-p9j4h\" (UID: \"29d8d95e-1f57-49fb-9896-340b389f0eea\") " pod="openshift-image-registry/node-ca-p9j4h"
Apr 16 13:59:36.303969 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.303953 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9npd\" (UniqueName: \"kubernetes.io/projected/e6284f77-08e3-4846-904d-6a21f10707ae-kube-api-access-n9npd\") pod \"network-metrics-daemon-29pd4\" (UID: \"e6284f77-08e3-4846-904d-6a21f10707ae\") " pod="openshift-multus/network-metrics-daemon-29pd4"
Apr 16 13:59:36.304027 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.304010 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkmc5\" (UniqueName: \"kubernetes.io/projected/346c3280-2b45-4be3-8629-46903ecfe4b8-kube-api-access-qkmc5\") pod \"ovnkube-node-p2d74\" (UID: \"346c3280-2b45-4be3-8629-46903ecfe4b8\") " pod="openshift-ovn-kubernetes/ovnkube-node-p2d74"
Apr 16 13:59:36.310807 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.310759 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-133.ec2.internal" event={"ID":"39d671c484f3397b8f32690faf7d7a58","Type":"ContainerStarted","Data":"2b2c8cffa5d578cd5e4a5708c58fb1a27237af120cfa5a9dd9b0cbcff83bc6ad"}
Apr 16 13:59:36.311668 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.311651 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-133.ec2.internal" event={"ID":"02f569f2777966beb695e394b803ecc2","Type":"ContainerStarted","Data":"45acb78ed5d4aa99836ea3f9034dd39966e5640333a4ce9e922d2ef641c57525"}
Apr 16 13:59:36.396755 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.396717 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-etc-sysctl-conf\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj"
Apr 16 13:59:36.396755 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.396754 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n62vn\" (UniqueName: \"kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn\") pod \"network-check-target-88c5t\" (UID: \"5a916223-1676-42c3-a13e-815b7355eb26\") " pod="openshift-network-diagnostics/network-check-target-88c5t"
Apr 16 13:59:36.396970 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.396777 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-etc-sysconfig\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj"
Apr 16 13:59:36.396970 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.396828 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-etc-sysconfig\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj"
Apr 16 13:59:36.396970 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.396893 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-etc-kubernetes\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj"
Apr 16 13:59:36.396970 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.396928 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-etc-sysctl-d\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj"
Apr 16 13:59:36.396970 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.396937 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-etc-kubernetes\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj"
Apr 16 13:59:36.396970 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.396945 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-etc-systemd\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj"
Apr 16 13:59:36.396970 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.396925 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-etc-sysctl-conf\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj"
Apr 16 13:59:36.397191 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.396977 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-etc-systemd\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj"
Apr 16 13:59:36.397191 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.396998 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d0acc482-de34-4188-ba44-20d609da46d0-os-release\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb"
Apr 16 13:59:36.397191 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397036 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d0acc482-de34-4188-ba44-20d609da46d0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb"
Apr 16 13:59:36.397191 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397065 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-etc-sysctl-d\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj"
Apr 16 13:59:36.397191 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397096 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lkrcm\" (UniqueName: \"kubernetes.io/projected/4df2426e-59b0-4dd7-ac07-90d478ff86c2-kube-api-access-lkrcm\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj"
Apr 16 13:59:36.397191 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397114 2570
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d0acc482-de34-4188-ba44-20d609da46d0-os-release\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb" Apr 16 13:59:36.397191 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397123 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4df2426e-59b0-4dd7-ac07-90d478ff86c2-etc-tuned\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.397191 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397157 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-host\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.397191 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397183 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5c96c36-5440-4e0b-b632-7b2269bd2b80-host-slash\") pod \"iptables-alerter-rtqtn\" (UID: \"e5c96c36-5440-4e0b-b632-7b2269bd2b80\") " pod="openshift-network-operator/iptables-alerter-rtqtn" Apr 16 13:59:36.397550 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397215 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0acc482-de34-4188-ba44-20d609da46d0-system-cni-dir\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb" Apr 16 13:59:36.397550 ip-10-0-133-133 kubenswrapper[2570]: 
I0416 13:59:36.397245 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-lib-modules\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.397550 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397253 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-host\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.397550 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397261 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e5c96c36-5440-4e0b-b632-7b2269bd2b80-host-slash\") pod \"iptables-alerter-rtqtn\" (UID: \"e5c96c36-5440-4e0b-b632-7b2269bd2b80\") " pod="openshift-network-operator/iptables-alerter-rtqtn" Apr 16 13:59:36.397550 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397274 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g7mgh\" (UniqueName: \"kubernetes.io/projected/d0acc482-de34-4188-ba44-20d609da46d0-kube-api-access-g7mgh\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb" Apr 16 13:59:36.397550 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397299 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d0acc482-de34-4188-ba44-20d609da46d0-system-cni-dir\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb" Apr 16 13:59:36.397550 
ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397331 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ltp4b\" (UniqueName: \"kubernetes.io/projected/e5c96c36-5440-4e0b-b632-7b2269bd2b80-kube-api-access-ltp4b\") pod \"iptables-alerter-rtqtn\" (UID: \"e5c96c36-5440-4e0b-b632-7b2269bd2b80\") " pod="openshift-network-operator/iptables-alerter-rtqtn" Apr 16 13:59:36.397550 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397370 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-etc-modprobe-d\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.397550 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397386 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-lib-modules\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.397550 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397390 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-sys\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.397550 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397421 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4df2426e-59b0-4dd7-ac07-90d478ff86c2-tmp\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.397550 ip-10-0-133-133 
kubenswrapper[2570]: I0416 13:59:36.397432 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-sys\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.397550 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397445 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d0acc482-de34-4188-ba44-20d609da46d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb" Apr 16 13:59:36.397550 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397468 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d0acc482-de34-4188-ba44-20d609da46d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb" Apr 16 13:59:36.397550 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397492 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-var-lib-kubelet\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.397550 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397508 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-run\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 
13:59:36.397550 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397517 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-etc-modprobe-d\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.398284 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397525 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/d0acc482-de34-4188-ba44-20d609da46d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb" Apr 16 13:59:36.398284 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397575 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e5c96c36-5440-4e0b-b632-7b2269bd2b80-iptables-alerter-script\") pod \"iptables-alerter-rtqtn\" (UID: \"e5c96c36-5440-4e0b-b632-7b2269bd2b80\") " pod="openshift-network-operator/iptables-alerter-rtqtn" Apr 16 13:59:36.398284 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397601 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d0acc482-de34-4188-ba44-20d609da46d0-cnibin\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb" Apr 16 13:59:36.398284 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397657 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d0acc482-de34-4188-ba44-20d609da46d0-cnibin\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: 
\"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb" Apr 16 13:59:36.398284 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397681 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/d0acc482-de34-4188-ba44-20d609da46d0-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb" Apr 16 13:59:36.398284 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397734 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-run\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.398284 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397839 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/4df2426e-59b0-4dd7-ac07-90d478ff86c2-var-lib-kubelet\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.398284 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.397962 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d0acc482-de34-4188-ba44-20d609da46d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb" Apr 16 13:59:36.398284 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.398188 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/d0acc482-de34-4188-ba44-20d609da46d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb" Apr 16 13:59:36.398697 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.398358 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e5c96c36-5440-4e0b-b632-7b2269bd2b80-iptables-alerter-script\") pod \"iptables-alerter-rtqtn\" (UID: \"e5c96c36-5440-4e0b-b632-7b2269bd2b80\") " pod="openshift-network-operator/iptables-alerter-rtqtn" Apr 16 13:59:36.398697 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.398418 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d0acc482-de34-4188-ba44-20d609da46d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb" Apr 16 13:59:36.399768 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.399741 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/4df2426e-59b0-4dd7-ac07-90d478ff86c2-etc-tuned\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.399860 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.399839 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/4df2426e-59b0-4dd7-ac07-90d478ff86c2-tmp\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.405948 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:36.405893 2570 projected.go:289] Couldn't get configMap 
openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:36.405948 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:36.405914 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:36.405948 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:36.405924 2570 projected.go:194] Error preparing data for projected volume kube-api-access-n62vn for pod openshift-network-diagnostics/network-check-target-88c5t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:36.406135 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:36.405997 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn podName:5a916223-1676-42c3-a13e-815b7355eb26 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:36.905983014 +0000 UTC m=+3.111626156 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-n62vn" (UniqueName: "kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn") pod "network-check-target-88c5t" (UID: "5a916223-1676-42c3-a13e-815b7355eb26") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:36.408029 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.408010 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ltp4b\" (UniqueName: \"kubernetes.io/projected/e5c96c36-5440-4e0b-b632-7b2269bd2b80-kube-api-access-ltp4b\") pod \"iptables-alerter-rtqtn\" (UID: \"e5c96c36-5440-4e0b-b632-7b2269bd2b80\") " pod="openshift-network-operator/iptables-alerter-rtqtn" Apr 16 13:59:36.408167 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.408150 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkrcm\" (UniqueName: \"kubernetes.io/projected/4df2426e-59b0-4dd7-ac07-90d478ff86c2-kube-api-access-lkrcm\") pod \"tuned-brkpj\" (UID: \"4df2426e-59b0-4dd7-ac07-90d478ff86c2\") " pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.408324 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.408306 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7mgh\" (UniqueName: \"kubernetes.io/projected/d0acc482-de34-4188-ba44-20d609da46d0-kube-api-access-g7mgh\") pod \"multus-additional-cni-plugins-bg5bb\" (UID: \"d0acc482-de34-4188-ba44-20d609da46d0\") " pod="openshift-multus/multus-additional-cni-plugins-bg5bb" Apr 16 13:59:36.479373 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.479326 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-wkbwm" Apr 16 13:59:36.485401 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.485381 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-p9j4h" Apr 16 13:59:36.486642 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:36.486616 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod529a37ce_5549_43e3_bcab_ff0f9a6e46d6.slice/crio-d921989a3a41a378aa48d7201eb76f4fde012ee779f9ebf3a01249ff2846d928 WatchSource:0}: Error finding container d921989a3a41a378aa48d7201eb76f4fde012ee779f9ebf3a01249ff2846d928: Status 404 returned error can't find the container with id d921989a3a41a378aa48d7201eb76f4fde012ee779f9ebf3a01249ff2846d928 Apr 16 13:59:36.491546 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:36.491507 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29d8d95e_1f57_49fb_9896_340b389f0eea.slice/crio-19bf8798a4930cc50bd8b99b13956edf6fe64df82e0827fe23c8fc5b6a09c4e6 WatchSource:0}: Error finding container 19bf8798a4930cc50bd8b99b13956edf6fe64df82e0827fe23c8fc5b6a09c4e6: Status 404 returned error can't find the container with id 19bf8798a4930cc50bd8b99b13956edf6fe64df82e0827fe23c8fc5b6a09c4e6 Apr 16 13:59:36.495287 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.495269 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vp4gn" Apr 16 13:59:36.499579 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.499563 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 13:59:36.502372 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:36.502350 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1edb190f_96e8_4548_8b55_97073b01a7ed.slice/crio-1171d8de0d8584c67401f8a313488a12ed395b5719df3fcde41c3800c103b56d WatchSource:0}: Error finding container 1171d8de0d8584c67401f8a313488a12ed395b5719df3fcde41c3800c103b56d: Status 404 returned error can't find the container with id 1171d8de0d8584c67401f8a313488a12ed395b5719df3fcde41c3800c103b56d Apr 16 13:59:36.504931 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.504913 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-kqv5v" Apr 16 13:59:36.507070 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:36.507048 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod346c3280_2b45_4be3_8629_46903ecfe4b8.slice/crio-c128711921fbc596435ebfd911c7f5d7b8ebc809022f6b03f5a75d2f305fe1d2 WatchSource:0}: Error finding container c128711921fbc596435ebfd911c7f5d7b8ebc809022f6b03f5a75d2f305fe1d2: Status 404 returned error can't find the container with id c128711921fbc596435ebfd911c7f5d7b8ebc809022f6b03f5a75d2f305fe1d2 Apr 16 13:59:36.511178 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.511147 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8" Apr 16 13:59:36.514780 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:36.514754 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod258d5bb3_0083_4fff_96dd_2c9e007b3c05.slice/crio-f95f767c6cb9968af72ce24210232cf9027456d9e1c7930770d9bff0d74c8ca7 WatchSource:0}: Error finding container f95f767c6cb9968af72ce24210232cf9027456d9e1c7930770d9bff0d74c8ca7: Status 404 returned error can't find the container with id f95f767c6cb9968af72ce24210232cf9027456d9e1c7930770d9bff0d74c8ca7 Apr 16 13:59:36.516137 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.516116 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-brkpj" Apr 16 13:59:36.517762 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:36.517741 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3d42db8b_b5be_43e6_bad1_6040da8c586f.slice/crio-da8c4ea7338aeacf5b9f52f26ee80391bcb8f55518c93c3a7e59222edb3f1f6f WatchSource:0}: Error finding container da8c4ea7338aeacf5b9f52f26ee80391bcb8f55518c93c3a7e59222edb3f1f6f: Status 404 returned error can't find the container with id da8c4ea7338aeacf5b9f52f26ee80391bcb8f55518c93c3a7e59222edb3f1f6f Apr 16 13:59:36.522303 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.522284 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-bg5bb" Apr 16 13:59:36.522869 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:36.522839 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4df2426e_59b0_4dd7_ac07_90d478ff86c2.slice/crio-1be75f1478e2f7798d0d7a1eaa0435b6783e68af67d5d41a8641bb60f4ade729 WatchSource:0}: Error finding container 1be75f1478e2f7798d0d7a1eaa0435b6783e68af67d5d41a8641bb60f4ade729: Status 404 returned error can't find the container with id 1be75f1478e2f7798d0d7a1eaa0435b6783e68af67d5d41a8641bb60f4ade729 Apr 16 13:59:36.527445 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.527430 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-rtqtn" Apr 16 13:59:36.530221 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:36.530184 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0acc482_de34_4188_ba44_20d609da46d0.slice/crio-58626207fe9dd0b2b76fa374cc5e24b9c26af379bd3e699a1b35fb8e4bc83f42 WatchSource:0}: Error finding container 58626207fe9dd0b2b76fa374cc5e24b9c26af379bd3e699a1b35fb8e4bc83f42: Status 404 returned error can't find the container with id 58626207fe9dd0b2b76fa374cc5e24b9c26af379bd3e699a1b35fb8e4bc83f42 Apr 16 13:59:36.535442 ip-10-0-133-133 kubenswrapper[2570]: W0416 13:59:36.535414 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5c96c36_5440_4e0b_b632_7b2269bd2b80.slice/crio-7c2e966461da1fca22115def03ff654a7ddf8af60884b34074eae9fc4899c913 WatchSource:0}: Error finding container 7c2e966461da1fca22115def03ff654a7ddf8af60884b34074eae9fc4899c913: Status 404 returned error can't find the container with id 7c2e966461da1fca22115def03ff654a7ddf8af60884b34074eae9fc4899c913 Apr 16 13:59:36.800183 
ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.800098 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs\") pod \"network-metrics-daemon-29pd4\" (UID: \"e6284f77-08e3-4846-904d-6a21f10707ae\") " pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 13:59:36.800327 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:36.800278 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:36.800387 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:36.800352 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs podName:e6284f77-08e3-4846-904d-6a21f10707ae nodeName:}" failed. No retries permitted until 2026-04-16 13:59:37.800332247 +0000 UTC m=+4.005975409 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs") pod "network-metrics-daemon-29pd4" (UID: "e6284f77-08e3-4846-904d-6a21f10707ae") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:36.941985 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:36.941952 2570 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:37.002720 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:37.002674 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n62vn\" (UniqueName: \"kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn\") pod \"network-check-target-88c5t\" (UID: \"5a916223-1676-42c3-a13e-815b7355eb26\") " pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 13:59:37.003053 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:37.002912 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:37.003053 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:37.002937 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:37.003053 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:37.002951 2570 projected.go:194] Error preparing data for projected volume kube-api-access-n62vn for pod openshift-network-diagnostics/network-check-target-88c5t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:37.003053 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:37.003013 2570 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn podName:5a916223-1676-42c3-a13e-815b7355eb26 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:38.002994213 +0000 UTC m=+4.208637369 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-n62vn" (UniqueName: "kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn") pod "network-check-target-88c5t" (UID: "5a916223-1676-42c3-a13e-815b7355eb26") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:37.267695 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:37.267590 2570 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:35 +0000 UTC" deadline="2028-01-19 13:28:56.045081706 +0000 UTC" Apr 16 13:59:37.267695 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:37.267642 2570 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15431h29m18.777444464s" Apr 16 13:59:37.325430 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:37.325380 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-p9j4h" event={"ID":"29d8d95e-1f57-49fb-9896-340b389f0eea","Type":"ContainerStarted","Data":"19bf8798a4930cc50bd8b99b13956edf6fe64df82e0827fe23c8fc5b6a09c4e6"} Apr 16 13:59:37.328991 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:37.328958 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wkbwm" event={"ID":"529a37ce-5549-43e3-bcab-ff0f9a6e46d6","Type":"ContainerStarted","Data":"d921989a3a41a378aa48d7201eb76f4fde012ee779f9ebf3a01249ff2846d928"} Apr 16 13:59:37.341861 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:37.341824 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-bg5bb" event={"ID":"d0acc482-de34-4188-ba44-20d609da46d0","Type":"ContainerStarted","Data":"58626207fe9dd0b2b76fa374cc5e24b9c26af379bd3e699a1b35fb8e4bc83f42"} Apr 16 13:59:37.348765 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:37.348726 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-rtqtn" event={"ID":"e5c96c36-5440-4e0b-b632-7b2269bd2b80","Type":"ContainerStarted","Data":"7c2e966461da1fca22115def03ff654a7ddf8af60884b34074eae9fc4899c913"} Apr 16 13:59:37.352646 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:37.352589 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-brkpj" event={"ID":"4df2426e-59b0-4dd7-ac07-90d478ff86c2","Type":"ContainerStarted","Data":"1be75f1478e2f7798d0d7a1eaa0435b6783e68af67d5d41a8641bb60f4ade729"} Apr 16 13:59:37.360459 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:37.360422 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8" event={"ID":"3d42db8b-b5be-43e6-bad1-6040da8c586f","Type":"ContainerStarted","Data":"da8c4ea7338aeacf5b9f52f26ee80391bcb8f55518c93c3a7e59222edb3f1f6f"} Apr 16 13:59:37.369405 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:37.369368 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kqv5v" event={"ID":"258d5bb3-0083-4fff-96dd-2c9e007b3c05","Type":"ContainerStarted","Data":"f95f767c6cb9968af72ce24210232cf9027456d9e1c7930770d9bff0d74c8ca7"} Apr 16 13:59:37.389990 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:37.389954 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" event={"ID":"346c3280-2b45-4be3-8629-46903ecfe4b8","Type":"ContainerStarted","Data":"c128711921fbc596435ebfd911c7f5d7b8ebc809022f6b03f5a75d2f305fe1d2"} Apr 16 13:59:37.402371 ip-10-0-133-133 kubenswrapper[2570]: I0416 
13:59:37.402311 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vp4gn" event={"ID":"1edb190f-96e8-4548-8b55-97073b01a7ed","Type":"ContainerStarted","Data":"1171d8de0d8584c67401f8a313488a12ed395b5719df3fcde41c3800c103b56d"} Apr 16 13:59:37.809874 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:37.809834 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs\") pod \"network-metrics-daemon-29pd4\" (UID: \"e6284f77-08e3-4846-904d-6a21f10707ae\") " pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 13:59:37.810074 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:37.810012 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:37.810135 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:37.810077 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs podName:e6284f77-08e3-4846-904d-6a21f10707ae nodeName:}" failed. No retries permitted until 2026-04-16 13:59:39.810058158 +0000 UTC m=+6.015701304 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs") pod "network-metrics-daemon-29pd4" (UID: "e6284f77-08e3-4846-904d-6a21f10707ae") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:38.011894 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:38.011856 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n62vn\" (UniqueName: \"kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn\") pod \"network-check-target-88c5t\" (UID: \"5a916223-1676-42c3-a13e-815b7355eb26\") " pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 13:59:38.012066 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:38.011996 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:38.012066 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:38.012011 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:38.012066 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:38.012021 2570 projected.go:194] Error preparing data for projected volume kube-api-access-n62vn for pod openshift-network-diagnostics/network-check-target-88c5t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:38.012232 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:38.012078 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn podName:5a916223-1676-42c3-a13e-815b7355eb26 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:40.01205988 +0000 UTC m=+6.217703025 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-n62vn" (UniqueName: "kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn") pod "network-check-target-88c5t" (UID: "5a916223-1676-42c3-a13e-815b7355eb26") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:38.305900 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:38.305824 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 13:59:38.306330 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:38.306053 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-29pd4" podUID="e6284f77-08e3-4846-904d-6a21f10707ae" Apr 16 13:59:38.306590 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:38.306567 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 13:59:38.306676 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:38.306661 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-88c5t" podUID="5a916223-1676-42c3-a13e-815b7355eb26" Apr 16 13:59:39.827177 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:39.827137 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs\") pod \"network-metrics-daemon-29pd4\" (UID: \"e6284f77-08e3-4846-904d-6a21f10707ae\") " pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 13:59:39.827702 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:39.827340 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:39.827702 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:39.827411 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs podName:e6284f77-08e3-4846-904d-6a21f10707ae nodeName:}" failed. No retries permitted until 2026-04-16 13:59:43.827389124 +0000 UTC m=+10.033032278 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs") pod "network-metrics-daemon-29pd4" (UID: "e6284f77-08e3-4846-904d-6a21f10707ae") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:40.028604 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:40.028558 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n62vn\" (UniqueName: \"kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn\") pod \"network-check-target-88c5t\" (UID: \"5a916223-1676-42c3-a13e-815b7355eb26\") " pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 13:59:40.028812 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:40.028795 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:40.028859 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:40.028820 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:40.028859 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:40.028833 2570 projected.go:194] Error preparing data for projected volume kube-api-access-n62vn for pod openshift-network-diagnostics/network-check-target-88c5t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:40.028920 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:40.028889 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn podName:5a916223-1676-42c3-a13e-815b7355eb26 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:44.028872427 +0000 UTC m=+10.234515571 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-n62vn" (UniqueName: "kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn") pod "network-check-target-88c5t" (UID: "5a916223-1676-42c3-a13e-815b7355eb26") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:40.304699 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:40.304666 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 13:59:40.304895 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:40.304670 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 13:59:40.304895 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:40.304817 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-29pd4" podUID="e6284f77-08e3-4846-904d-6a21f10707ae" Apr 16 13:59:40.304895 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:40.304883 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-88c5t" podUID="5a916223-1676-42c3-a13e-815b7355eb26" Apr 16 13:59:42.304463 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:42.304425 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 13:59:42.304932 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:42.304476 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 13:59:42.304932 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:42.304588 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-88c5t" podUID="5a916223-1676-42c3-a13e-815b7355eb26" Apr 16 13:59:42.304932 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:42.304831 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-29pd4" podUID="e6284f77-08e3-4846-904d-6a21f10707ae" Apr 16 13:59:43.869089 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:43.868483 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs\") pod \"network-metrics-daemon-29pd4\" (UID: \"e6284f77-08e3-4846-904d-6a21f10707ae\") " pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 13:59:43.869089 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:43.868665 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:43.869089 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:43.868731 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs podName:e6284f77-08e3-4846-904d-6a21f10707ae nodeName:}" failed. No retries permitted until 2026-04-16 13:59:51.868712048 +0000 UTC m=+18.074355192 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs") pod "network-metrics-daemon-29pd4" (UID: "e6284f77-08e3-4846-904d-6a21f10707ae") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:44.070275 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:44.070235 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n62vn\" (UniqueName: \"kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn\") pod \"network-check-target-88c5t\" (UID: \"5a916223-1676-42c3-a13e-815b7355eb26\") " pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 13:59:44.070468 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:44.070448 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:44.070580 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:44.070469 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:44.070580 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:44.070482 2570 projected.go:194] Error preparing data for projected volume kube-api-access-n62vn for pod openshift-network-diagnostics/network-check-target-88c5t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:44.070580 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:44.070555 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn podName:5a916223-1676-42c3-a13e-815b7355eb26 nodeName:}" failed. 
No retries permitted until 2026-04-16 13:59:52.070520725 +0000 UTC m=+18.276163881 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-n62vn" (UniqueName: "kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn") pod "network-check-target-88c5t" (UID: "5a916223-1676-42c3-a13e-815b7355eb26") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:44.305366 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:44.305322 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 13:59:44.306483 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:44.306101 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-29pd4" podUID="e6284f77-08e3-4846-904d-6a21f10707ae" Apr 16 13:59:44.306483 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:44.306205 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 13:59:44.306483 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:44.306295 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-88c5t" podUID="5a916223-1676-42c3-a13e-815b7355eb26" Apr 16 13:59:46.305281 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:46.305236 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 13:59:46.305711 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:46.305290 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 13:59:46.305711 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:46.305387 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-88c5t" podUID="5a916223-1676-42c3-a13e-815b7355eb26" Apr 16 13:59:46.305711 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:46.305491 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-29pd4" podUID="e6284f77-08e3-4846-904d-6a21f10707ae" Apr 16 13:59:48.304961 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:48.304918 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 13:59:48.304961 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:48.304950 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 13:59:48.305612 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:48.305067 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-29pd4" podUID="e6284f77-08e3-4846-904d-6a21f10707ae" Apr 16 13:59:48.305612 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:48.305163 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-88c5t" podUID="5a916223-1676-42c3-a13e-815b7355eb26" Apr 16 13:59:50.305623 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:50.305334 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 13:59:50.306089 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:50.305411 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 13:59:50.306089 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:50.305735 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-88c5t" podUID="5a916223-1676-42c3-a13e-815b7355eb26" Apr 16 13:59:50.306089 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:50.305823 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-29pd4" podUID="e6284f77-08e3-4846-904d-6a21f10707ae" Apr 16 13:59:51.928997 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:51.928952 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs\") pod \"network-metrics-daemon-29pd4\" (UID: \"e6284f77-08e3-4846-904d-6a21f10707ae\") " pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 13:59:51.929465 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:51.929116 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:51.929465 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:51.929196 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs podName:e6284f77-08e3-4846-904d-6a21f10707ae nodeName:}" failed. No retries permitted until 2026-04-16 14:00:07.929174131 +0000 UTC m=+34.134817273 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs") pod "network-metrics-daemon-29pd4" (UID: "e6284f77-08e3-4846-904d-6a21f10707ae") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:52.131489 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:52.131437 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n62vn\" (UniqueName: \"kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn\") pod \"network-check-target-88c5t\" (UID: \"5a916223-1676-42c3-a13e-815b7355eb26\") " pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 13:59:52.131701 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:52.131641 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:52.131701 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:52.131668 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:52.131701 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:52.131681 2570 projected.go:194] Error preparing data for projected volume kube-api-access-n62vn for pod openshift-network-diagnostics/network-check-target-88c5t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:52.131874 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:52.131756 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn podName:5a916223-1676-42c3-a13e-815b7355eb26 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:00:08.131735897 +0000 UTC m=+34.337379060 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-n62vn" (UniqueName: "kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn") pod "network-check-target-88c5t" (UID: "5a916223-1676-42c3-a13e-815b7355eb26") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:52.304653 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:52.304556 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 13:59:52.304813 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:52.304649 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 13:59:52.304813 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:52.304727 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-29pd4" podUID="e6284f77-08e3-4846-904d-6a21f10707ae" Apr 16 13:59:52.304813 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:52.304781 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-88c5t" podUID="5a916223-1676-42c3-a13e-815b7355eb26" Apr 16 13:59:54.305485 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:54.305432 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 13:59:54.305958 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:54.305602 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-29pd4" podUID="e6284f77-08e3-4846-904d-6a21f10707ae" Apr 16 13:59:54.305958 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:54.305656 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 13:59:54.305958 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:54.305793 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-88c5t" podUID="5a916223-1676-42c3-a13e-815b7355eb26" Apr 16 13:59:55.441300 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:55.437487 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-brkpj" event={"ID":"4df2426e-59b0-4dd7-ac07-90d478ff86c2","Type":"ContainerStarted","Data":"f89e4cd47bcd155cd88053daed9f3f9a88faafb5f0da84a87ce2585cd4d79ebc"} Apr 16 13:59:55.441300 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:55.440076 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" event={"ID":"346c3280-2b45-4be3-8629-46903ecfe4b8","Type":"ContainerStarted","Data":"c86e470e6c3c3cce2d7c6cc26245967fcc6a53b01089502a288a5eda4d653cb6"} Apr 16 13:59:55.444810 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:55.444241 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vp4gn" event={"ID":"1edb190f-96e8-4548-8b55-97073b01a7ed","Type":"ContainerStarted","Data":"a81bfed370758658a969e9ef6045644a8f6aa28bdbeb4126e73006858036af1d"} Apr 16 13:59:55.447253 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:55.447219 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-133.ec2.internal" event={"ID":"02f569f2777966beb695e394b803ecc2","Type":"ContainerStarted","Data":"7b33ef8a1294d1f06ae2b5b9e4b6542a2a7ec87960ce2ec2062d877c7237dd77"} Apr 16 13:59:55.456096 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:55.455544 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-brkpj" podStartSLOduration=2.721544726 podStartE2EDuration="21.455508072s" podCreationTimestamp="2026-04-16 13:59:34 +0000 UTC" firstStartedPulling="2026-04-16 13:59:36.525114905 +0000 UTC m=+2.730758047" lastFinishedPulling="2026-04-16 13:59:55.259078251 +0000 UTC m=+21.464721393" observedRunningTime="2026-04-16 
13:59:55.455400329 +0000 UTC m=+21.661043498" watchObservedRunningTime="2026-04-16 13:59:55.455508072 +0000 UTC m=+21.661151255" Apr 16 13:59:55.513916 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:55.513677 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-133.ec2.internal" podStartSLOduration=20.513659015 podStartE2EDuration="20.513659015s" podCreationTimestamp="2026-04-16 13:59:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:55.473362092 +0000 UTC m=+21.679005266" watchObservedRunningTime="2026-04-16 13:59:55.513659015 +0000 UTC m=+21.719302180" Apr 16 13:59:56.304445 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:56.304350 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 13:59:56.304692 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:56.304352 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 13:59:56.304692 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:56.304484 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-88c5t" podUID="5a916223-1676-42c3-a13e-815b7355eb26" Apr 16 13:59:56.304692 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:56.304587 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-29pd4" podUID="e6284f77-08e3-4846-904d-6a21f10707ae" Apr 16 13:59:56.450456 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:56.450422 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8" event={"ID":"3d42db8b-b5be-43e6-bad1-6040da8c586f","Type":"ContainerStarted","Data":"1d767ad6e20949bf31a01bd5460cfde0165c5281f0bcd1fdc00bbea6a1d82df9"} Apr 16 13:59:56.451812 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:56.451781 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-kqv5v" event={"ID":"258d5bb3-0083-4fff-96dd-2c9e007b3c05","Type":"ContainerStarted","Data":"1a0ef96ef48267ffadd61430efe26d057c9b1bed607bb38d23fef95b6c39c81d"} Apr 16 13:59:56.454497 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:56.454464 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" event={"ID":"346c3280-2b45-4be3-8629-46903ecfe4b8","Type":"ContainerStarted","Data":"d452f444b2ddb583ffeba36c30cf5e7889ae0d69548953142be164f774cf2a7a"} Apr 16 13:59:56.454497 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:56.454497 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" event={"ID":"346c3280-2b45-4be3-8629-46903ecfe4b8","Type":"ContainerStarted","Data":"72268ec3052d7371818aaedb640a7dccf156c20d86da9911558e98b07c9d4218"} Apr 16 13:59:56.454683 ip-10-0-133-133 
kubenswrapper[2570]: I0416 13:59:56.454506 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" event={"ID":"346c3280-2b45-4be3-8629-46903ecfe4b8","Type":"ContainerStarted","Data":"f525f83ebb67550426aabfc09e0e31b434ec8d88a34bb2ba083c477f33c8c117"} Apr 16 13:59:56.454683 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:56.454520 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" event={"ID":"346c3280-2b45-4be3-8629-46903ecfe4b8","Type":"ContainerStarted","Data":"ea559be409ecceaef1c3df4fbfd91c49d3760e5a1f4584ca40929566b2f8d51e"} Apr 16 13:59:56.454683 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:56.454545 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" event={"ID":"346c3280-2b45-4be3-8629-46903ecfe4b8","Type":"ContainerStarted","Data":"9f120d0b2bd05f39f5b1477029aba460ac016593b9bc7433d283560b7e35b596"} Apr 16 13:59:56.455895 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:56.455863 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-p9j4h" event={"ID":"29d8d95e-1f57-49fb-9896-340b389f0eea","Type":"ContainerStarted","Data":"045f9271a6b4220060bcce97af85f9d0c25737a0f82705db2aa0a538881aabf6"} Apr 16 13:59:56.459001 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:56.458976 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wkbwm" event={"ID":"529a37ce-5549-43e3-bcab-ff0f9a6e46d6","Type":"ContainerStarted","Data":"23be4dd813e3754f224b78a83ca7e847607e8c6917b19fa1f30872177dc7733e"} Apr 16 13:59:56.459853 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:56.459838 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-wkbwm" Apr 16 13:59:56.460456 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:56.460438 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" 
status="started" pod="kube-system/konnectivity-agent-wkbwm" Apr 16 13:59:56.460798 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:56.460779 2570 generic.go:358] "Generic (PLEG): container finished" podID="39d671c484f3397b8f32690faf7d7a58" containerID="ab883f1e90d9134f147ae8966dfb72cd691e592d113aedbeaad8aa358570c356" exitCode=0 Apr 16 13:59:56.460890 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:56.460869 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-133.ec2.internal" event={"ID":"39d671c484f3397b8f32690faf7d7a58","Type":"ContainerDied","Data":"ab883f1e90d9134f147ae8966dfb72cd691e592d113aedbeaad8aa358570c356"} Apr 16 13:59:56.462594 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:56.462559 2570 generic.go:358] "Generic (PLEG): container finished" podID="d0acc482-de34-4188-ba44-20d609da46d0" containerID="6bc045bf039307dd356405a91e457a0c75c846357e64a8272fd3d0403b0962ab" exitCode=0 Apr 16 13:59:56.462688 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:56.462657 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bg5bb" event={"ID":"d0acc482-de34-4188-ba44-20d609da46d0","Type":"ContainerDied","Data":"6bc045bf039307dd356405a91e457a0c75c846357e64a8272fd3d0403b0962ab"} Apr 16 13:59:56.485525 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:56.485478 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-kqv5v" podStartSLOduration=3.739718752 podStartE2EDuration="22.485463435s" podCreationTimestamp="2026-04-16 13:59:34 +0000 UTC" firstStartedPulling="2026-04-16 13:59:36.51614978 +0000 UTC m=+2.721792925" lastFinishedPulling="2026-04-16 13:59:55.261894452 +0000 UTC m=+21.467537608" observedRunningTime="2026-04-16 13:59:56.48460674 +0000 UTC m=+22.690249906" watchObservedRunningTime="2026-04-16 13:59:56.485463435 +0000 UTC m=+22.691106598" Apr 16 13:59:56.485675 ip-10-0-133-133 
kubenswrapper[2570]: I0416 13:59:56.485646 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vp4gn" podStartSLOduration=3.681091164 podStartE2EDuration="22.485641258s" podCreationTimestamp="2026-04-16 13:59:34 +0000 UTC" firstStartedPulling="2026-04-16 13:59:36.504837304 +0000 UTC m=+2.710480446" lastFinishedPulling="2026-04-16 13:59:55.309387384 +0000 UTC m=+21.515030540" observedRunningTime="2026-04-16 13:59:55.517725291 +0000 UTC m=+21.723368449" watchObservedRunningTime="2026-04-16 13:59:56.485641258 +0000 UTC m=+22.691284425" Apr 16 13:59:56.541704 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:56.541654 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-wkbwm" podStartSLOduration=3.7797409699999998 podStartE2EDuration="22.54163827s" podCreationTimestamp="2026-04-16 13:59:34 +0000 UTC" firstStartedPulling="2026-04-16 13:59:36.488317144 +0000 UTC m=+2.693960285" lastFinishedPulling="2026-04-16 13:59:55.250214423 +0000 UTC m=+21.455857585" observedRunningTime="2026-04-16 13:59:56.541008256 +0000 UTC m=+22.746651420" watchObservedRunningTime="2026-04-16 13:59:56.54163827 +0000 UTC m=+22.747281433" Apr 16 13:59:56.579241 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:56.579199 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-p9j4h" podStartSLOduration=3.822082406 podStartE2EDuration="22.579183734s" podCreationTimestamp="2026-04-16 13:59:34 +0000 UTC" firstStartedPulling="2026-04-16 13:59:36.493115283 +0000 UTC m=+2.698758425" lastFinishedPulling="2026-04-16 13:59:55.250216597 +0000 UTC m=+21.455859753" observedRunningTime="2026-04-16 13:59:56.578863341 +0000 UTC m=+22.784506506" watchObservedRunningTime="2026-04-16 13:59:56.579183734 +0000 UTC m=+22.784826897" Apr 16 13:59:57.466495 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:57.466441 2570 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="openshift-network-operator/iptables-alerter-rtqtn" event={"ID":"e5c96c36-5440-4e0b-b632-7b2269bd2b80","Type":"ContainerStarted","Data":"9b5148fb3399aa664523fe482fc4e97dcba6e5966816bea7fd18b1ce878d064c"} Apr 16 13:59:57.469320 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:57.469291 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-133.ec2.internal" event={"ID":"39d671c484f3397b8f32690faf7d7a58","Type":"ContainerStarted","Data":"8c4aded1e387e66e54e462599c9b9eb4b15beeed71cead458796858054feff0b"} Apr 16 13:59:57.470263 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:57.470236 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-wkbwm" Apr 16 13:59:57.470770 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:57.470748 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-wkbwm" Apr 16 13:59:57.484079 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:57.483595 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-rtqtn" podStartSLOduration=4.767310803 podStartE2EDuration="23.483580376s" podCreationTimestamp="2026-04-16 13:59:34 +0000 UTC" firstStartedPulling="2026-04-16 13:59:36.536797538 +0000 UTC m=+2.742440680" lastFinishedPulling="2026-04-16 13:59:55.253067112 +0000 UTC m=+21.458710253" observedRunningTime="2026-04-16 13:59:57.483243077 +0000 UTC m=+23.688886255" watchObservedRunningTime="2026-04-16 13:59:57.483580376 +0000 UTC m=+23.689223540" Apr 16 13:59:57.498932 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:57.498874 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-133.ec2.internal" podStartSLOduration=22.498858728 podStartE2EDuration="22.498858728s" podCreationTimestamp="2026-04-16 13:59:35 +0000 
UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:57.498772949 +0000 UTC m=+23.704416112" watchObservedRunningTime="2026-04-16 13:59:57.498858728 +0000 UTC m=+23.704501892" Apr 16 13:59:57.762703 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:57.762671 2570 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 13:59:58.260557 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:58.260417 2570 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T13:59:57.762692702Z","UUID":"4712fd5a-1f15-485e-a64e-768d00ceba9a","Handler":null,"Name":"","Endpoint":""} Apr 16 13:59:58.262404 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:58.262378 2570 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 13:59:58.262562 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:58.262427 2570 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 13:59:58.305011 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:58.304979 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 13:59:58.305201 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:58.305116 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-88c5t" podUID="5a916223-1676-42c3-a13e-815b7355eb26" Apr 16 13:59:58.305201 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:58.305183 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 13:59:58.305347 ip-10-0-133-133 kubenswrapper[2570]: E0416 13:59:58.305315 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-29pd4" podUID="e6284f77-08e3-4846-904d-6a21f10707ae" Apr 16 13:59:58.474345 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:58.474310 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" event={"ID":"346c3280-2b45-4be3-8629-46903ecfe4b8","Type":"ContainerStarted","Data":"175e1eeab39dfefb5ec8d2f8b110f373fd40130abc8fff3680675908f2c34b8c"} Apr 16 13:59:58.476164 ip-10-0-133-133 kubenswrapper[2570]: I0416 13:59:58.476130 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8" event={"ID":"3d42db8b-b5be-43e6-bad1-6040da8c586f","Type":"ContainerStarted","Data":"26b7a3cc3acbe870793a3326446b9c3ba77d7430e105e5f0049e55bafb3d39b0"} Apr 16 14:00:00.304960 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:00.304874 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 14:00:00.304960 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:00.304921 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 14:00:00.305414 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:00.305029 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-29pd4" podUID="e6284f77-08e3-4846-904d-6a21f10707ae" Apr 16 14:00:00.305414 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:00.305162 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-88c5t" podUID="5a916223-1676-42c3-a13e-815b7355eb26" Apr 16 14:00:01.483934 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:01.483726 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8" event={"ID":"3d42db8b-b5be-43e6-bad1-6040da8c586f","Type":"ContainerStarted","Data":"7cc32297a5c243bc7563e48c7fb0bea4b8356c746990fbe8d075b8bd8512dec9"} Apr 16 14:00:01.486934 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:01.486892 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" event={"ID":"346c3280-2b45-4be3-8629-46903ecfe4b8","Type":"ContainerStarted","Data":"013b2f9231514e4f57cfcf1f585822775ca2d52ce156e51f68ad369c9ef8fbd2"} Apr 16 14:00:01.487273 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:01.487251 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 14:00:01.487387 ip-10-0-133-133 
kubenswrapper[2570]: I0416 14:00:01.487283 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 14:00:01.487387 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:01.487295 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 14:00:01.488798 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:01.488773 2570 generic.go:358] "Generic (PLEG): container finished" podID="d0acc482-de34-4188-ba44-20d609da46d0" containerID="cedfd9f10b8f4347e39096f54ab9adee3a319b7ce220ec187e7b10231c2ed668" exitCode=0 Apr 16 14:00:01.488898 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:01.488811 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bg5bb" event={"ID":"d0acc482-de34-4188-ba44-20d609da46d0","Type":"ContainerDied","Data":"cedfd9f10b8f4347e39096f54ab9adee3a319b7ce220ec187e7b10231c2ed668"} Apr 16 14:00:01.502116 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:01.502054 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-mh9s8" podStartSLOduration=3.311871192 podStartE2EDuration="27.502035631s" podCreationTimestamp="2026-04-16 13:59:34 +0000 UTC" firstStartedPulling="2026-04-16 13:59:36.519819214 +0000 UTC m=+2.725462357" lastFinishedPulling="2026-04-16 14:00:00.709983653 +0000 UTC m=+26.915626796" observedRunningTime="2026-04-16 14:00:01.501706549 +0000 UTC m=+27.707349713" watchObservedRunningTime="2026-04-16 14:00:01.502035631 +0000 UTC m=+27.707678796" Apr 16 14:00:01.503517 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:01.503495 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 14:00:01.503628 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:01.503581 2570 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 14:00:01.553927 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:01.553870 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" podStartSLOduration=8.74675345 podStartE2EDuration="27.553854114s" podCreationTimestamp="2026-04-16 13:59:34 +0000 UTC" firstStartedPulling="2026-04-16 13:59:36.508934354 +0000 UTC m=+2.714577510" lastFinishedPulling="2026-04-16 13:59:55.31603502 +0000 UTC m=+21.521678174" observedRunningTime="2026-04-16 14:00:01.553657449 +0000 UTC m=+27.759300612" watchObservedRunningTime="2026-04-16 14:00:01.553854114 +0000 UTC m=+27.759497279" Apr 16 14:00:02.307612 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:02.307585 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 14:00:02.307824 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:02.307585 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 14:00:02.307824 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:02.307711 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-88c5t" podUID="5a916223-1676-42c3-a13e-815b7355eb26" Apr 16 14:00:02.307824 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:02.307768 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-29pd4" podUID="e6284f77-08e3-4846-904d-6a21f10707ae" Apr 16 14:00:02.734016 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:02.733736 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-29pd4"] Apr 16 14:00:02.734486 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:02.734149 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 14:00:02.734486 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:02.734268 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-29pd4" podUID="e6284f77-08e3-4846-904d-6a21f10707ae" Apr 16 14:00:02.736464 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:02.736428 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-88c5t"] Apr 16 14:00:02.736637 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:02.736610 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 14:00:02.736731 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:02.736710 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-88c5t" podUID="5a916223-1676-42c3-a13e-815b7355eb26" Apr 16 14:00:04.305719 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:04.305679 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 14:00:04.306273 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:04.305774 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-29pd4" podUID="e6284f77-08e3-4846-904d-6a21f10707ae" Apr 16 14:00:04.306273 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:04.305855 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 14:00:04.306273 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:04.305941 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-88c5t" podUID="5a916223-1676-42c3-a13e-815b7355eb26" Apr 16 14:00:04.497207 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:04.497170 2570 generic.go:358] "Generic (PLEG): container finished" podID="d0acc482-de34-4188-ba44-20d609da46d0" containerID="85c9bb8f7b852640d04200060cadba877e038d8e5e446b47a990fe99fd675a52" exitCode=0 Apr 16 14:00:04.497389 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:04.497225 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bg5bb" event={"ID":"d0acc482-de34-4188-ba44-20d609da46d0","Type":"ContainerDied","Data":"85c9bb8f7b852640d04200060cadba877e038d8e5e446b47a990fe99fd675a52"} Apr 16 14:00:06.304454 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:06.304355 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 14:00:06.304865 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:06.304355 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 14:00:06.304865 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:06.304464 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-88c5t" podUID="5a916223-1676-42c3-a13e-815b7355eb26" Apr 16 14:00:06.304865 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:06.304582 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-29pd4" podUID="e6284f77-08e3-4846-904d-6a21f10707ae" Apr 16 14:00:06.503243 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:06.503205 2570 generic.go:358] "Generic (PLEG): container finished" podID="d0acc482-de34-4188-ba44-20d609da46d0" containerID="1349724259cb15e746d0d535fb23ad130bc45492d0bf6012a47fbe28d66e9c95" exitCode=0 Apr 16 14:00:06.503243 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:06.503244 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bg5bb" event={"ID":"d0acc482-de34-4188-ba44-20d609da46d0","Type":"ContainerDied","Data":"1349724259cb15e746d0d535fb23ad130bc45492d0bf6012a47fbe28d66e9c95"} Apr 16 14:00:07.940931 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:07.940702 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs\") pod \"network-metrics-daemon-29pd4\" (UID: \"e6284f77-08e3-4846-904d-6a21f10707ae\") " pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 14:00:07.941341 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:07.940861 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:00:07.941341 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:07.941054 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs podName:e6284f77-08e3-4846-904d-6a21f10707ae nodeName:}" failed. No retries permitted until 2026-04-16 14:00:39.941032976 +0000 UTC m=+66.146676119 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs") pod "network-metrics-daemon-29pd4" (UID: "e6284f77-08e3-4846-904d-6a21f10707ae") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 14:00:08.113950 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.113859 2570 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-133.ec2.internal" event="NodeReady" Apr 16 14:00:08.114106 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.114029 2570 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 14:00:08.143286 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.143253 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n62vn\" (UniqueName: \"kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn\") pod \"network-check-target-88c5t\" (UID: \"5a916223-1676-42c3-a13e-815b7355eb26\") " pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 14:00:08.143454 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:08.143426 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 14:00:08.143454 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:08.143445 2570 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 14:00:08.143582 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:08.143458 2570 projected.go:194] Error preparing data for projected volume kube-api-access-n62vn for pod openshift-network-diagnostics/network-check-target-88c5t: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered] Apr 16 14:00:08.143582 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:08.143524 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn podName:5a916223-1676-42c3-a13e-815b7355eb26 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:40.143506098 +0000 UTC m=+66.349149239 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-n62vn" (UniqueName: "kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn") pod "network-check-target-88c5t" (UID: "5a916223-1676-42c3-a13e-815b7355eb26") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 14:00:08.169781 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.169739 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-smqtz"] Apr 16 14:00:08.173672 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.173638 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-wc4m2"] Apr 16 14:00:08.173838 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.173818 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-smqtz" Apr 16 14:00:08.176234 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.175991 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wjgsd\"" Apr 16 14:00:08.176234 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.175992 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 14:00:08.176448 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.176331 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 14:00:08.177572 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.177526 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wc4m2" Apr 16 14:00:08.180324 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.180286 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 14:00:08.180442 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.180377 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 14:00:08.180754 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.180731 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 14:00:08.180842 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.180763 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pt57w\"" Apr 16 14:00:08.182124 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.182103 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wc4m2"] Apr 16 14:00:08.185703 ip-10-0-133-133 kubenswrapper[2570]: 
I0416 14:00:08.185681 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-smqtz"] Apr 16 14:00:08.243652 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.243611 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgmh9\" (UniqueName: \"kubernetes.io/projected/2d037ded-fc00-41e0-b31f-c9fb98bdc629-kube-api-access-dgmh9\") pod \"ingress-canary-wc4m2\" (UID: \"2d037ded-fc00-41e0-b31f-c9fb98bdc629\") " pod="openshift-ingress-canary/ingress-canary-wc4m2" Apr 16 14:00:08.243855 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.243664 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls\") pod \"dns-default-smqtz\" (UID: \"6bb67241-6874-4040-a810-80b829751cf9\") " pod="openshift-dns/dns-default-smqtz" Apr 16 14:00:08.243855 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.243734 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6bb67241-6874-4040-a810-80b829751cf9-tmp-dir\") pod \"dns-default-smqtz\" (UID: \"6bb67241-6874-4040-a810-80b829751cf9\") " pod="openshift-dns/dns-default-smqtz" Apr 16 14:00:08.243855 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.243759 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6bb67241-6874-4040-a810-80b829751cf9-config-volume\") pod \"dns-default-smqtz\" (UID: \"6bb67241-6874-4040-a810-80b829751cf9\") " pod="openshift-dns/dns-default-smqtz" Apr 16 14:00:08.243855 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.243821 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcqgt\" (UniqueName: 
\"kubernetes.io/projected/6bb67241-6874-4040-a810-80b829751cf9-kube-api-access-gcqgt\") pod \"dns-default-smqtz\" (UID: \"6bb67241-6874-4040-a810-80b829751cf9\") " pod="openshift-dns/dns-default-smqtz" Apr 16 14:00:08.244033 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.243856 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert\") pod \"ingress-canary-wc4m2\" (UID: \"2d037ded-fc00-41e0-b31f-c9fb98bdc629\") " pod="openshift-ingress-canary/ingress-canary-wc4m2" Apr 16 14:00:08.304483 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.304448 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-29pd4" Apr 16 14:00:08.304664 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.304448 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-88c5t" Apr 16 14:00:08.307169 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.307143 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 14:00:08.307517 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.307498 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 14:00:08.307647 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.307560 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 14:00:08.307647 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.307569 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-p5bbb\"" Apr 16 14:00:08.307647 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.307571 2570 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fjvc5\"" Apr 16 14:00:08.345127 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.345094 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gcqgt\" (UniqueName: \"kubernetes.io/projected/6bb67241-6874-4040-a810-80b829751cf9-kube-api-access-gcqgt\") pod \"dns-default-smqtz\" (UID: \"6bb67241-6874-4040-a810-80b829751cf9\") " pod="openshift-dns/dns-default-smqtz" Apr 16 14:00:08.345127 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.345129 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert\") pod \"ingress-canary-wc4m2\" (UID: \"2d037ded-fc00-41e0-b31f-c9fb98bdc629\") " pod="openshift-ingress-canary/ingress-canary-wc4m2" Apr 16 14:00:08.345345 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.345154 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dgmh9\" (UniqueName: \"kubernetes.io/projected/2d037ded-fc00-41e0-b31f-c9fb98bdc629-kube-api-access-dgmh9\") pod \"ingress-canary-wc4m2\" (UID: \"2d037ded-fc00-41e0-b31f-c9fb98bdc629\") " pod="openshift-ingress-canary/ingress-canary-wc4m2" Apr 16 14:00:08.345345 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.345178 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls\") pod \"dns-default-smqtz\" (UID: \"6bb67241-6874-4040-a810-80b829751cf9\") " pod="openshift-dns/dns-default-smqtz" Apr 16 14:00:08.345345 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.345232 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6bb67241-6874-4040-a810-80b829751cf9-tmp-dir\") pod 
\"dns-default-smqtz\" (UID: \"6bb67241-6874-4040-a810-80b829751cf9\") " pod="openshift-dns/dns-default-smqtz" Apr 16 14:00:08.345345 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.345255 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6bb67241-6874-4040-a810-80b829751cf9-config-volume\") pod \"dns-default-smqtz\" (UID: \"6bb67241-6874-4040-a810-80b829751cf9\") " pod="openshift-dns/dns-default-smqtz" Apr 16 14:00:08.345345 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:08.345265 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:08.345345 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:08.345325 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert podName:2d037ded-fc00-41e0-b31f-c9fb98bdc629 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:08.845306954 +0000 UTC m=+35.050950098 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert") pod "ingress-canary-wc4m2" (UID: "2d037ded-fc00-41e0-b31f-c9fb98bdc629") : secret "canary-serving-cert" not found Apr 16 14:00:08.345568 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:08.345484 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:08.345568 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:08.345523 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls podName:6bb67241-6874-4040-a810-80b829751cf9 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:08.845510172 +0000 UTC m=+35.051153317 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls") pod "dns-default-smqtz" (UID: "6bb67241-6874-4040-a810-80b829751cf9") : secret "dns-default-metrics-tls" not found Apr 16 14:00:08.345967 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.345942 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/6bb67241-6874-4040-a810-80b829751cf9-tmp-dir\") pod \"dns-default-smqtz\" (UID: \"6bb67241-6874-4040-a810-80b829751cf9\") " pod="openshift-dns/dns-default-smqtz" Apr 16 14:00:08.346149 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.346133 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6bb67241-6874-4040-a810-80b829751cf9-config-volume\") pod \"dns-default-smqtz\" (UID: \"6bb67241-6874-4040-a810-80b829751cf9\") " pod="openshift-dns/dns-default-smqtz" Apr 16 14:00:08.359270 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.359239 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgmh9\" (UniqueName: \"kubernetes.io/projected/2d037ded-fc00-41e0-b31f-c9fb98bdc629-kube-api-access-dgmh9\") pod \"ingress-canary-wc4m2\" (UID: \"2d037ded-fc00-41e0-b31f-c9fb98bdc629\") " pod="openshift-ingress-canary/ingress-canary-wc4m2" Apr 16 14:00:08.359433 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.359293 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcqgt\" (UniqueName: \"kubernetes.io/projected/6bb67241-6874-4040-a810-80b829751cf9-kube-api-access-gcqgt\") pod \"dns-default-smqtz\" (UID: \"6bb67241-6874-4040-a810-80b829751cf9\") " pod="openshift-dns/dns-default-smqtz" Apr 16 14:00:08.849584 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.849522 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert\") pod \"ingress-canary-wc4m2\" (UID: \"2d037ded-fc00-41e0-b31f-c9fb98bdc629\") " pod="openshift-ingress-canary/ingress-canary-wc4m2" Apr 16 14:00:08.849790 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:08.849603 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls\") pod \"dns-default-smqtz\" (UID: \"6bb67241-6874-4040-a810-80b829751cf9\") " pod="openshift-dns/dns-default-smqtz" Apr 16 14:00:08.849790 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:08.849707 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:08.849790 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:08.849727 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:08.849790 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:08.849786 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert podName:2d037ded-fc00-41e0-b31f-c9fb98bdc629 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:09.849765695 +0000 UTC m=+36.055408838 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert") pod "ingress-canary-wc4m2" (UID: "2d037ded-fc00-41e0-b31f-c9fb98bdc629") : secret "canary-serving-cert" not found Apr 16 14:00:08.850012 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:08.849803 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls podName:6bb67241-6874-4040-a810-80b829751cf9 nodeName:}" failed. 
No retries permitted until 2026-04-16 14:00:09.849796545 +0000 UTC m=+36.055439686 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls") pod "dns-default-smqtz" (UID: "6bb67241-6874-4040-a810-80b829751cf9") : secret "dns-default-metrics-tls" not found Apr 16 14:00:09.857520 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:09.857477 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert\") pod \"ingress-canary-wc4m2\" (UID: \"2d037ded-fc00-41e0-b31f-c9fb98bdc629\") " pod="openshift-ingress-canary/ingress-canary-wc4m2" Apr 16 14:00:09.858012 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:09.857557 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls\") pod \"dns-default-smqtz\" (UID: \"6bb67241-6874-4040-a810-80b829751cf9\") " pod="openshift-dns/dns-default-smqtz" Apr 16 14:00:09.858012 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:09.857659 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:09.858012 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:09.857742 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert podName:2d037ded-fc00-41e0-b31f-c9fb98bdc629 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:11.857719592 +0000 UTC m=+38.063362809 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert") pod "ingress-canary-wc4m2" (UID: "2d037ded-fc00-41e0-b31f-c9fb98bdc629") : secret "canary-serving-cert" not found Apr 16 14:00:09.858012 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:09.857659 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:09.858012 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:09.857809 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls podName:6bb67241-6874-4040-a810-80b829751cf9 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:11.857792355 +0000 UTC m=+38.063435496 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls") pod "dns-default-smqtz" (UID: "6bb67241-6874-4040-a810-80b829751cf9") : secret "dns-default-metrics-tls" not found Apr 16 14:00:11.873694 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:11.873646 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert\") pod \"ingress-canary-wc4m2\" (UID: \"2d037ded-fc00-41e0-b31f-c9fb98bdc629\") " pod="openshift-ingress-canary/ingress-canary-wc4m2" Apr 16 14:00:11.874418 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:11.873720 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls\") pod \"dns-default-smqtz\" (UID: \"6bb67241-6874-4040-a810-80b829751cf9\") " pod="openshift-dns/dns-default-smqtz" Apr 16 14:00:11.874418 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:11.873822 2570 secret.go:189] Couldn't get secret 
openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:11.874418 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:11.873861 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:11.874418 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:11.873907 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert podName:2d037ded-fc00-41e0-b31f-c9fb98bdc629 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:15.8738855 +0000 UTC m=+42.079528645 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert") pod "ingress-canary-wc4m2" (UID: "2d037ded-fc00-41e0-b31f-c9fb98bdc629") : secret "canary-serving-cert" not found Apr 16 14:00:11.874418 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:11.873931 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls podName:6bb67241-6874-4040-a810-80b829751cf9 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:15.873921294 +0000 UTC m=+42.079564436 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls") pod "dns-default-smqtz" (UID: "6bb67241-6874-4040-a810-80b829751cf9") : secret "dns-default-metrics-tls" not found Apr 16 14:00:14.519884 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:14.519850 2570 generic.go:358] "Generic (PLEG): container finished" podID="d0acc482-de34-4188-ba44-20d609da46d0" containerID="2c3add18ecbfbc13b8351d5864c3f50cbe4527aeab0f4e1f4dc26741fcfdae90" exitCode=0 Apr 16 14:00:14.520498 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:14.519912 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bg5bb" event={"ID":"d0acc482-de34-4188-ba44-20d609da46d0","Type":"ContainerDied","Data":"2c3add18ecbfbc13b8351d5864c3f50cbe4527aeab0f4e1f4dc26741fcfdae90"} Apr 16 14:00:15.524608 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:15.524572 2570 generic.go:358] "Generic (PLEG): container finished" podID="d0acc482-de34-4188-ba44-20d609da46d0" containerID="88c53d96844a220b1aa2bde4dd1e48bdb1d5526db82f58775c93ea7638aec7af" exitCode=0 Apr 16 14:00:15.525041 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:15.524638 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bg5bb" event={"ID":"d0acc482-de34-4188-ba44-20d609da46d0","Type":"ContainerDied","Data":"88c53d96844a220b1aa2bde4dd1e48bdb1d5526db82f58775c93ea7638aec7af"} Apr 16 14:00:15.905615 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:15.905567 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert\") pod \"ingress-canary-wc4m2\" (UID: \"2d037ded-fc00-41e0-b31f-c9fb98bdc629\") " pod="openshift-ingress-canary/ingress-canary-wc4m2" Apr 16 14:00:15.905781 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:15.905633 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls\") pod \"dns-default-smqtz\" (UID: \"6bb67241-6874-4040-a810-80b829751cf9\") " pod="openshift-dns/dns-default-smqtz" Apr 16 14:00:15.905781 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:15.905724 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:15.905781 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:15.905738 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:15.905876 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:15.905794 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls podName:6bb67241-6874-4040-a810-80b829751cf9 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:23.905775989 +0000 UTC m=+50.111419130 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls") pod "dns-default-smqtz" (UID: "6bb67241-6874-4040-a810-80b829751cf9") : secret "dns-default-metrics-tls" not found Apr 16 14:00:15.905876 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:15.905811 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert podName:2d037ded-fc00-41e0-b31f-c9fb98bdc629 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:23.905803412 +0000 UTC m=+50.111446554 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert") pod "ingress-canary-wc4m2" (UID: "2d037ded-fc00-41e0-b31f-c9fb98bdc629") : secret "canary-serving-cert" not found Apr 16 14:00:16.529626 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:16.529590 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-bg5bb" event={"ID":"d0acc482-de34-4188-ba44-20d609da46d0","Type":"ContainerStarted","Data":"f41cfe56cfca253df712b4b0cb7aca18c48143efcf610fcf2ce9bf3526e50aba"} Apr 16 14:00:16.551307 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:16.551247 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-bg5bb" podStartSLOduration=5.500300114 podStartE2EDuration="42.551230127s" podCreationTimestamp="2026-04-16 13:59:34 +0000 UTC" firstStartedPulling="2026-04-16 13:59:36.53196535 +0000 UTC m=+2.737608508" lastFinishedPulling="2026-04-16 14:00:13.582895376 +0000 UTC m=+39.788538521" observedRunningTime="2026-04-16 14:00:16.550077261 +0000 UTC m=+42.755720425" watchObservedRunningTime="2026-04-16 14:00:16.551230127 +0000 UTC m=+42.756873292" Apr 16 14:00:23.967489 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:23.967445 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls\") pod \"dns-default-smqtz\" (UID: \"6bb67241-6874-4040-a810-80b829751cf9\") " pod="openshift-dns/dns-default-smqtz" Apr 16 14:00:23.968048 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:23.967564 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert\") pod \"ingress-canary-wc4m2\" (UID: \"2d037ded-fc00-41e0-b31f-c9fb98bdc629\") " 
pod="openshift-ingress-canary/ingress-canary-wc4m2" Apr 16 14:00:23.968048 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:23.967624 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:23.968048 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:23.967650 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:23.968048 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:23.967688 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls podName:6bb67241-6874-4040-a810-80b829751cf9 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:39.967672099 +0000 UTC m=+66.173315241 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls") pod "dns-default-smqtz" (UID: "6bb67241-6874-4040-a810-80b829751cf9") : secret "dns-default-metrics-tls" not found Apr 16 14:00:23.968048 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:23.967702 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert podName:2d037ded-fc00-41e0-b31f-c9fb98bdc629 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:39.967696319 +0000 UTC m=+66.173339460 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert") pod "ingress-canary-wc4m2" (UID: "2d037ded-fc00-41e0-b31f-c9fb98bdc629") : secret "canary-serving-cert" not found Apr 16 14:00:30.771126 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.771089 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wskzx"] Apr 16 14:00:30.785938 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.785911 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wskzx" Apr 16 14:00:30.788282 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.788254 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\"" Apr 16 14:00:30.788401 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.788266 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\"" Apr 16 14:00:30.788401 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.788271 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:00:30.788813 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.788800 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\"" Apr 16 14:00:30.788889 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.788807 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-xxjb9\"" Apr 16 14:00:30.798658 
ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.798636 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wskzx"] Apr 16 14:00:30.815674 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.815640 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wskzx\" (UID: \"234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wskzx" Apr 16 14:00:30.815674 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.815677 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6vxt\" (UniqueName: \"kubernetes.io/projected/234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb-kube-api-access-f6vxt\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wskzx\" (UID: \"234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wskzx" Apr 16 14:00:30.815903 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.815722 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wskzx\" (UID: \"234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wskzx" Apr 16 14:00:30.871901 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.871863 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-zm999"] 
Apr 16 14:00:30.875104 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.875079 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-zm999" Apr 16 14:00:30.877342 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.877315 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 16 14:00:30.877342 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.877316 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:00:30.877582 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.877404 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-gsqrs\"" Apr 16 14:00:30.879739 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.879713 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-qgcsx"] Apr 16 14:00:30.882798 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.882780 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56"] Apr 16 14:00:30.882977 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.882959 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" Apr 16 14:00:30.886082 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.885755 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56" Apr 16 14:00:30.886082 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.885862 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 16 14:00:30.886366 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.886343 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 14:00:30.886590 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.886525 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 14:00:30.887018 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.886996 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-zm999"] Apr 16 14:00:30.887893 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.887875 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 16 14:00:30.887987 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.887878 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-whf45\"" Apr 16 14:00:30.890661 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.890642 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 16 14:00:30.890903 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.890885 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 14:00:30.892169 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.892147 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-xslx9\"" Apr 16 14:00:30.892688 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.892663 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 16 14:00:30.893248 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.893228 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 14:00:30.897246 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.897220 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 16 14:00:30.910026 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.909996 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56"] Apr 16 14:00:30.910937 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.910913 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-qgcsx"] Apr 16 14:00:30.916145 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.916108 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fcd418a5-48d2-4f13-a35b-28504fb6ca61-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-7dn56\" (UID: \"fcd418a5-48d2-4f13-a35b-28504fb6ca61\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56" Apr 16 14:00:30.916248 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.916154 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e39233e7-6f83-4e72-8e15-0f19ce865b49-snapshots\") pod \"insights-operator-5785d4fcdd-qgcsx\" (UID: 
\"e39233e7-6f83-4e72-8e15-0f19ce865b49\") " pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" Apr 16 14:00:30.916248 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.916196 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wskzx\" (UID: \"234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wskzx" Apr 16 14:00:30.916341 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.916243 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lq22k\" (UniqueName: \"kubernetes.io/projected/fcd418a5-48d2-4f13-a35b-28504fb6ca61-kube-api-access-lq22k\") pod \"cluster-monitoring-operator-6667474d89-7dn56\" (UID: \"fcd418a5-48d2-4f13-a35b-28504fb6ca61\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56" Apr 16 14:00:30.916341 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.916277 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e39233e7-6f83-4e72-8e15-0f19ce865b49-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-qgcsx\" (UID: \"e39233e7-6f83-4e72-8e15-0f19ce865b49\") " pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" Apr 16 14:00:30.916341 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.916326 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/fcd418a5-48d2-4f13-a35b-28504fb6ca61-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-7dn56\" (UID: \"fcd418a5-48d2-4f13-a35b-28504fb6ca61\") " 
pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56" Apr 16 14:00:30.916456 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.916410 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e39233e7-6f83-4e72-8e15-0f19ce865b49-tmp\") pod \"insights-operator-5785d4fcdd-qgcsx\" (UID: \"e39233e7-6f83-4e72-8e15-0f19ce865b49\") " pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" Apr 16 14:00:30.916456 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.916436 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e39233e7-6f83-4e72-8e15-0f19ce865b49-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-qgcsx\" (UID: \"e39233e7-6f83-4e72-8e15-0f19ce865b49\") " pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" Apr 16 14:00:30.916572 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.916499 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wskzx\" (UID: \"234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wskzx" Apr 16 14:00:30.916572 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.916525 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e39233e7-6f83-4e72-8e15-0f19ce865b49-serving-cert\") pod \"insights-operator-5785d4fcdd-qgcsx\" (UID: \"e39233e7-6f83-4e72-8e15-0f19ce865b49\") " pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" Apr 16 14:00:30.916684 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.916571 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-f6vxt\" (UniqueName: \"kubernetes.io/projected/234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb-kube-api-access-f6vxt\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wskzx\" (UID: \"234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wskzx" Apr 16 14:00:30.916826 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.916799 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65glx\" (UniqueName: \"kubernetes.io/projected/2bc0eeb3-5222-4373-ab41-9da4b7efca15-kube-api-access-65glx\") pod \"volume-data-source-validator-7d955d5dd4-zm999\" (UID: \"2bc0eeb3-5222-4373-ab41-9da4b7efca15\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-zm999" Apr 16 14:00:30.916945 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.916827 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb-config\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wskzx\" (UID: \"234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wskzx" Apr 16 14:00:30.916945 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.916851 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8crwx\" (UniqueName: \"kubernetes.io/projected/e39233e7-6f83-4e72-8e15-0f19ce865b49-kube-api-access-8crwx\") pod \"insights-operator-5785d4fcdd-qgcsx\" (UID: \"e39233e7-6f83-4e72-8e15-0f19ce865b49\") " pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" Apr 16 14:00:30.919591 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.919570 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb-serving-cert\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wskzx\" (UID: \"234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wskzx" Apr 16 14:00:30.936856 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:30.936816 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6vxt\" (UniqueName: \"kubernetes.io/projected/234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb-kube-api-access-f6vxt\") pod \"kube-storage-version-migrator-operator-756bb7d76f-wskzx\" (UID: \"234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wskzx" Apr 16 14:00:31.017417 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.017379 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fcd418a5-48d2-4f13-a35b-28504fb6ca61-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-7dn56\" (UID: \"fcd418a5-48d2-4f13-a35b-28504fb6ca61\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56" Apr 16 14:00:31.017417 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.017420 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e39233e7-6f83-4e72-8e15-0f19ce865b49-snapshots\") pod \"insights-operator-5785d4fcdd-qgcsx\" (UID: \"e39233e7-6f83-4e72-8e15-0f19ce865b49\") " pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" Apr 16 14:00:31.017709 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.017446 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lq22k\" (UniqueName: 
\"kubernetes.io/projected/fcd418a5-48d2-4f13-a35b-28504fb6ca61-kube-api-access-lq22k\") pod \"cluster-monitoring-operator-6667474d89-7dn56\" (UID: \"fcd418a5-48d2-4f13-a35b-28504fb6ca61\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56" Apr 16 14:00:31.017709 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.017471 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e39233e7-6f83-4e72-8e15-0f19ce865b49-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-qgcsx\" (UID: \"e39233e7-6f83-4e72-8e15-0f19ce865b49\") " pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" Apr 16 14:00:31.017709 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.017508 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/fcd418a5-48d2-4f13-a35b-28504fb6ca61-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-7dn56\" (UID: \"fcd418a5-48d2-4f13-a35b-28504fb6ca61\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56" Apr 16 14:00:31.017709 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:31.017554 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:00:31.017709 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.017583 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e39233e7-6f83-4e72-8e15-0f19ce865b49-tmp\") pod \"insights-operator-5785d4fcdd-qgcsx\" (UID: \"e39233e7-6f83-4e72-8e15-0f19ce865b49\") " pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" Apr 16 14:00:31.017709 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.017607 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/e39233e7-6f83-4e72-8e15-0f19ce865b49-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-qgcsx\" (UID: \"e39233e7-6f83-4e72-8e15-0f19ce865b49\") " pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" Apr 16 14:00:31.017709 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:31.017645 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcd418a5-48d2-4f13-a35b-28504fb6ca61-cluster-monitoring-operator-tls podName:fcd418a5-48d2-4f13-a35b-28504fb6ca61 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:31.517623174 +0000 UTC m=+57.723266335 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fcd418a5-48d2-4f13-a35b-28504fb6ca61-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-7dn56" (UID: "fcd418a5-48d2-4f13-a35b-28504fb6ca61") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:00:31.018042 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.017722 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e39233e7-6f83-4e72-8e15-0f19ce865b49-serving-cert\") pod \"insights-operator-5785d4fcdd-qgcsx\" (UID: \"e39233e7-6f83-4e72-8e15-0f19ce865b49\") " pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" Apr 16 14:00:31.018042 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.017765 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-65glx\" (UniqueName: \"kubernetes.io/projected/2bc0eeb3-5222-4373-ab41-9da4b7efca15-kube-api-access-65glx\") pod \"volume-data-source-validator-7d955d5dd4-zm999\" (UID: \"2bc0eeb3-5222-4373-ab41-9da4b7efca15\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-zm999" Apr 16 14:00:31.018042 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.017813 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8crwx\" (UniqueName: \"kubernetes.io/projected/e39233e7-6f83-4e72-8e15-0f19ce865b49-kube-api-access-8crwx\") pod \"insights-operator-5785d4fcdd-qgcsx\" (UID: \"e39233e7-6f83-4e72-8e15-0f19ce865b49\") " pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" Apr 16 14:00:31.018263 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.018236 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e39233e7-6f83-4e72-8e15-0f19ce865b49-tmp\") pod \"insights-operator-5785d4fcdd-qgcsx\" (UID: \"e39233e7-6f83-4e72-8e15-0f19ce865b49\") " pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" Apr 16 14:00:31.018434 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.018412 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/e39233e7-6f83-4e72-8e15-0f19ce865b49-snapshots\") pod \"insights-operator-5785d4fcdd-qgcsx\" (UID: \"e39233e7-6f83-4e72-8e15-0f19ce865b49\") " pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" Apr 16 14:00:31.018736 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.018710 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e39233e7-6f83-4e72-8e15-0f19ce865b49-service-ca-bundle\") pod \"insights-operator-5785d4fcdd-qgcsx\" (UID: \"e39233e7-6f83-4e72-8e15-0f19ce865b49\") " pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" Apr 16 14:00:31.018979 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.018955 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/fcd418a5-48d2-4f13-a35b-28504fb6ca61-telemetry-config\") pod \"cluster-monitoring-operator-6667474d89-7dn56\" (UID: \"fcd418a5-48d2-4f13-a35b-28504fb6ca61\") " 
pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56" Apr 16 14:00:31.019051 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.019033 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e39233e7-6f83-4e72-8e15-0f19ce865b49-trusted-ca-bundle\") pod \"insights-operator-5785d4fcdd-qgcsx\" (UID: \"e39233e7-6f83-4e72-8e15-0f19ce865b49\") " pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" Apr 16 14:00:31.020737 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.020718 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e39233e7-6f83-4e72-8e15-0f19ce865b49-serving-cert\") pod \"insights-operator-5785d4fcdd-qgcsx\" (UID: \"e39233e7-6f83-4e72-8e15-0f19ce865b49\") " pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" Apr 16 14:00:31.026194 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.026125 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8crwx\" (UniqueName: \"kubernetes.io/projected/e39233e7-6f83-4e72-8e15-0f19ce865b49-kube-api-access-8crwx\") pod \"insights-operator-5785d4fcdd-qgcsx\" (UID: \"e39233e7-6f83-4e72-8e15-0f19ce865b49\") " pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" Apr 16 14:00:31.026366 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.026342 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-65glx\" (UniqueName: \"kubernetes.io/projected/2bc0eeb3-5222-4373-ab41-9da4b7efca15-kube-api-access-65glx\") pod \"volume-data-source-validator-7d955d5dd4-zm999\" (UID: \"2bc0eeb3-5222-4373-ab41-9da4b7efca15\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-zm999" Apr 16 14:00:31.026861 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.026834 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-lq22k\" (UniqueName: \"kubernetes.io/projected/fcd418a5-48d2-4f13-a35b-28504fb6ca61-kube-api-access-lq22k\") pod \"cluster-monitoring-operator-6667474d89-7dn56\" (UID: \"fcd418a5-48d2-4f13-a35b-28504fb6ca61\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56" Apr 16 14:00:31.095078 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.095039 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wskzx" Apr 16 14:00:31.184869 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.184833 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-zm999" Apr 16 14:00:31.198761 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.198728 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" Apr 16 14:00:31.237828 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.237773 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wskzx"] Apr 16 14:00:31.242343 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:00:31.242275 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod234da4eb_18fa_4c53_ab1a_fbbe46b9e3cb.slice/crio-cac97470a7dc9df6b8a7219b5a24e9c291c147752475a2f3db32bc1b57f5f2c6 WatchSource:0}: Error finding container cac97470a7dc9df6b8a7219b5a24e9c291c147752475a2f3db32bc1b57f5f2c6: Status 404 returned error can't find the container with id cac97470a7dc9df6b8a7219b5a24e9c291c147752475a2f3db32bc1b57f5f2c6 Apr 16 14:00:31.339914 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.339850 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-zm999"] Apr 16 14:00:31.344229 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:00:31.344196 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2bc0eeb3_5222_4373_ab41_9da4b7efca15.slice/crio-1fba14001a110f49986b3ec89138da0952a0dea0f0077edf1145ece29880f667 WatchSource:0}: Error finding container 1fba14001a110f49986b3ec89138da0952a0dea0f0077edf1145ece29880f667: Status 404 returned error can't find the container with id 1fba14001a110f49986b3ec89138da0952a0dea0f0077edf1145ece29880f667 Apr 16 14:00:31.354584 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.354551 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-5785d4fcdd-qgcsx"] Apr 16 14:00:31.358272 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:00:31.358240 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode39233e7_6f83_4e72_8e15_0f19ce865b49.slice/crio-2dee4ad41da971a478e8c2e326b3f2b0420db57bf1714e437272a9e6a4bb010d WatchSource:0}: Error finding container 2dee4ad41da971a478e8c2e326b3f2b0420db57bf1714e437272a9e6a4bb010d: Status 404 returned error can't find the container with id 2dee4ad41da971a478e8c2e326b3f2b0420db57bf1714e437272a9e6a4bb010d Apr 16 14:00:31.522152 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.522114 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fcd418a5-48d2-4f13-a35b-28504fb6ca61-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-7dn56\" (UID: \"fcd418a5-48d2-4f13-a35b-28504fb6ca61\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56" Apr 16 14:00:31.522331 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:31.522259 2570 secret.go:189] Couldn't 
get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:00:31.522331 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:31.522328 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcd418a5-48d2-4f13-a35b-28504fb6ca61-cluster-monitoring-operator-tls podName:fcd418a5-48d2-4f13-a35b-28504fb6ca61 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:32.522312712 +0000 UTC m=+58.727955855 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fcd418a5-48d2-4f13-a35b-28504fb6ca61-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-7dn56" (UID: "fcd418a5-48d2-4f13-a35b-28504fb6ca61") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:00:31.559312 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.559221 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" event={"ID":"e39233e7-6f83-4e72-8e15-0f19ce865b49","Type":"ContainerStarted","Data":"2dee4ad41da971a478e8c2e326b3f2b0420db57bf1714e437272a9e6a4bb010d"} Apr 16 14:00:31.560241 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.560219 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-zm999" event={"ID":"2bc0eeb3-5222-4373-ab41-9da4b7efca15","Type":"ContainerStarted","Data":"1fba14001a110f49986b3ec89138da0952a0dea0f0077edf1145ece29880f667"} Apr 16 14:00:31.561212 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:31.561194 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wskzx" event={"ID":"234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb","Type":"ContainerStarted","Data":"cac97470a7dc9df6b8a7219b5a24e9c291c147752475a2f3db32bc1b57f5f2c6"} Apr 16 
14:00:32.530095 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:32.530047 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fcd418a5-48d2-4f13-a35b-28504fb6ca61-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-7dn56\" (UID: \"fcd418a5-48d2-4f13-a35b-28504fb6ca61\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56" Apr 16 14:00:32.530492 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:32.530163 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:00:32.530492 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:32.530227 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcd418a5-48d2-4f13-a35b-28504fb6ca61-cluster-monitoring-operator-tls podName:fcd418a5-48d2-4f13-a35b-28504fb6ca61 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:34.530213147 +0000 UTC m=+60.735856290 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fcd418a5-48d2-4f13-a35b-28504fb6ca61-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-7dn56" (UID: "fcd418a5-48d2-4f13-a35b-28504fb6ca61") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:00:33.506954 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:33.506925 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-p2d74" Apr 16 14:00:34.547675 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:34.547621 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fcd418a5-48d2-4f13-a35b-28504fb6ca61-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-7dn56\" (UID: \"fcd418a5-48d2-4f13-a35b-28504fb6ca61\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56" Apr 16 14:00:34.548154 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:34.547796 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 16 14:00:34.548154 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:34.547871 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcd418a5-48d2-4f13-a35b-28504fb6ca61-cluster-monitoring-operator-tls podName:fcd418a5-48d2-4f13-a35b-28504fb6ca61 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:38.5478547 +0000 UTC m=+64.753497841 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fcd418a5-48d2-4f13-a35b-28504fb6ca61-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-7dn56" (UID: "fcd418a5-48d2-4f13-a35b-28504fb6ca61") : secret "cluster-monitoring-operator-tls" not found Apr 16 14:00:34.568267 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:34.568227 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-zm999" event={"ID":"2bc0eeb3-5222-4373-ab41-9da4b7efca15","Type":"ContainerStarted","Data":"2f557bedbead6b21ecefd2d315971d20191d283b92d0ec2bdb5156215f2b26a8"} Apr 16 14:00:34.583469 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:34.583410 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7d955d5dd4-zm999" podStartSLOduration=2.06558455 podStartE2EDuration="4.583394169s" podCreationTimestamp="2026-04-16 14:00:30 +0000 UTC" firstStartedPulling="2026-04-16 14:00:31.346111212 +0000 UTC m=+57.551754357" lastFinishedPulling="2026-04-16 14:00:33.863920826 +0000 UTC m=+60.069563976" observedRunningTime="2026-04-16 14:00:34.583073239 +0000 UTC m=+60.788716415" watchObservedRunningTime="2026-04-16 14:00:34.583394169 +0000 UTC m=+60.789037334" Apr 16 14:00:35.572108 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:35.572067 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" event={"ID":"e39233e7-6f83-4e72-8e15-0f19ce865b49","Type":"ContainerStarted","Data":"400eb4e3feb3576ee03ad19d5ebd0349027e29d8157a7c2eb4fdb9229cc4f5af"} Apr 16 14:00:35.573498 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:35.573469 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wskzx" 
event={"ID":"234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb","Type":"ContainerStarted","Data":"e2116bfba95c00ac457b921051d604ff04c693de0750873d827bb1dc4890ac40"} Apr 16 14:00:35.588841 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:35.588792 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" podStartSLOduration=1.846311729 podStartE2EDuration="5.588778212s" podCreationTimestamp="2026-04-16 14:00:30 +0000 UTC" firstStartedPulling="2026-04-16 14:00:31.360210618 +0000 UTC m=+57.565853760" lastFinishedPulling="2026-04-16 14:00:35.1026771 +0000 UTC m=+61.308320243" observedRunningTime="2026-04-16 14:00:35.588445123 +0000 UTC m=+61.794088298" watchObservedRunningTime="2026-04-16 14:00:35.588778212 +0000 UTC m=+61.794421376" Apr 16 14:00:35.605638 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:35.605583 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wskzx" podStartSLOduration=1.745688522 podStartE2EDuration="5.605563094s" podCreationTimestamp="2026-04-16 14:00:30 +0000 UTC" firstStartedPulling="2026-04-16 14:00:31.244866925 +0000 UTC m=+57.450510083" lastFinishedPulling="2026-04-16 14:00:35.104741514 +0000 UTC m=+61.310384655" observedRunningTime="2026-04-16 14:00:35.604675714 +0000 UTC m=+61.810318891" watchObservedRunningTime="2026-04-16 14:00:35.605563094 +0000 UTC m=+61.811206257" Apr 16 14:00:36.418158 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:36.418116 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-4kqf9"] Apr 16 14:00:36.420116 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:36.420093 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-4kqf9" Apr 16 14:00:36.422145 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:36.422124 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-ktnbr\"" Apr 16 14:00:36.422145 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:36.422143 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 16 14:00:36.422319 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:36.422127 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 16 14:00:36.428177 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:36.428154 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-4kqf9"] Apr 16 14:00:36.467665 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:36.467634 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2c62\" (UniqueName: \"kubernetes.io/projected/e4c1dc9f-ffdc-4615-820f-9560dd37ae0b-kube-api-access-j2c62\") pod \"migrator-64d4d94569-4kqf9\" (UID: \"e4c1dc9f-ffdc-4615-820f-9560dd37ae0b\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-4kqf9" Apr 16 14:00:36.568263 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:36.568228 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2c62\" (UniqueName: \"kubernetes.io/projected/e4c1dc9f-ffdc-4615-820f-9560dd37ae0b-kube-api-access-j2c62\") pod \"migrator-64d4d94569-4kqf9\" (UID: \"e4c1dc9f-ffdc-4615-820f-9560dd37ae0b\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-4kqf9" Apr 16 14:00:36.584450 ip-10-0-133-133 
kubenswrapper[2570]: I0416 14:00:36.584418 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2c62\" (UniqueName: \"kubernetes.io/projected/e4c1dc9f-ffdc-4615-820f-9560dd37ae0b-kube-api-access-j2c62\") pod \"migrator-64d4d94569-4kqf9\" (UID: \"e4c1dc9f-ffdc-4615-820f-9560dd37ae0b\") " pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-4kqf9" Apr 16 14:00:36.729366 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:36.729280 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-4kqf9" Apr 16 14:00:36.848203 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:36.848174 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-64d4d94569-4kqf9"] Apr 16 14:00:36.851636 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:00:36.851609 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4c1dc9f_ffdc_4615_820f_9560dd37ae0b.slice/crio-9c57a9f49701fc5733065d146d802c093b54786b935a8a1a04d74bf33665e008 WatchSource:0}: Error finding container 9c57a9f49701fc5733065d146d802c093b54786b935a8a1a04d74bf33665e008: Status 404 returned error can't find the container with id 9c57a9f49701fc5733065d146d802c093b54786b935a8a1a04d74bf33665e008 Apr 16 14:00:37.578498 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:37.578455 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-4kqf9" event={"ID":"e4c1dc9f-ffdc-4615-820f-9560dd37ae0b","Type":"ContainerStarted","Data":"9c57a9f49701fc5733065d146d802c093b54786b935a8a1a04d74bf33665e008"} Apr 16 14:00:38.088228 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:38.088196 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kqv5v_258d5bb3-0083-4fff-96dd-2c9e007b3c05/dns-node-resolver/0.log" 
Apr 16 14:00:38.581733 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:38.581696 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-4kqf9" event={"ID":"e4c1dc9f-ffdc-4615-820f-9560dd37ae0b","Type":"ContainerStarted","Data":"69563067301dac964d5a6432f11dec2323e280e304f4e6d339a08302cefc9095"}
Apr 16 14:00:38.581916 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:38.581740 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-4kqf9" event={"ID":"e4c1dc9f-ffdc-4615-820f-9560dd37ae0b","Type":"ContainerStarted","Data":"ca8ba93a195fe521fa3a6c308a0e06437915f617fba1716a8c0bd97d8dd5f49c"}
Apr 16 14:00:38.586024 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:38.586000 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fcd418a5-48d2-4f13-a35b-28504fb6ca61-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-7dn56\" (UID: \"fcd418a5-48d2-4f13-a35b-28504fb6ca61\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56"
Apr 16 14:00:38.586162 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:38.586146 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 14:00:38.586219 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:38.586210 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcd418a5-48d2-4f13-a35b-28504fb6ca61-cluster-monitoring-operator-tls podName:fcd418a5-48d2-4f13-a35b-28504fb6ca61 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:46.586189791 +0000 UTC m=+72.791832937 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fcd418a5-48d2-4f13-a35b-28504fb6ca61-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-7dn56" (UID: "fcd418a5-48d2-4f13-a35b-28504fb6ca61") : secret "cluster-monitoring-operator-tls" not found
Apr 16 14:00:38.599493 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:38.599443 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-64d4d94569-4kqf9" podStartSLOduration=1.094343364 podStartE2EDuration="2.599429161s" podCreationTimestamp="2026-04-16 14:00:36 +0000 UTC" firstStartedPulling="2026-04-16 14:00:36.853517663 +0000 UTC m=+63.059160809" lastFinishedPulling="2026-04-16 14:00:38.35860346 +0000 UTC m=+64.564246606" observedRunningTime="2026-04-16 14:00:38.598962052 +0000 UTC m=+64.804605215" watchObservedRunningTime="2026-04-16 14:00:38.599429161 +0000 UTC m=+64.805072324"
Apr 16 14:00:38.887930 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:38.887888 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-p9j4h_29d8d95e-1f57-49fb-9896-340b389f0eea/node-ca/0.log"
Apr 16 14:00:39.475988 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.475953 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-pn8kd"]
Apr 16 14:00:39.477774 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.477758 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-pn8kd"
Apr 16 14:00:39.480018 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.479997 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 16 14:00:39.480171 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.480070 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 16 14:00:39.480646 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.480618 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 16 14:00:39.480646 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.480631 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-fx6zv\""
Apr 16 14:00:39.480822 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.480678 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 16 14:00:39.491762 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.491726 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-pn8kd"]
Apr 16 14:00:39.593506 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.593470 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/79bd47ca-25a0-47cc-ae9b-4dd962dcb2f7-signing-cabundle\") pod \"service-ca-bfc587fb7-pn8kd\" (UID: \"79bd47ca-25a0-47cc-ae9b-4dd962dcb2f7\") " pod="openshift-service-ca/service-ca-bfc587fb7-pn8kd"
Apr 16 14:00:39.593723 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.593570 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName:
\"kubernetes.io/secret/79bd47ca-25a0-47cc-ae9b-4dd962dcb2f7-signing-key\") pod \"service-ca-bfc587fb7-pn8kd\" (UID: \"79bd47ca-25a0-47cc-ae9b-4dd962dcb2f7\") " pod="openshift-service-ca/service-ca-bfc587fb7-pn8kd"
Apr 16 14:00:39.593723 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.593619 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lqldz\" (UniqueName: \"kubernetes.io/projected/79bd47ca-25a0-47cc-ae9b-4dd962dcb2f7-kube-api-access-lqldz\") pod \"service-ca-bfc587fb7-pn8kd\" (UID: \"79bd47ca-25a0-47cc-ae9b-4dd962dcb2f7\") " pod="openshift-service-ca/service-ca-bfc587fb7-pn8kd"
Apr 16 14:00:39.694517 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.694480 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/79bd47ca-25a0-47cc-ae9b-4dd962dcb2f7-signing-cabundle\") pod \"service-ca-bfc587fb7-pn8kd\" (UID: \"79bd47ca-25a0-47cc-ae9b-4dd962dcb2f7\") " pod="openshift-service-ca/service-ca-bfc587fb7-pn8kd"
Apr 16 14:00:39.694735 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.694571 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/79bd47ca-25a0-47cc-ae9b-4dd962dcb2f7-signing-key\") pod \"service-ca-bfc587fb7-pn8kd\" (UID: \"79bd47ca-25a0-47cc-ae9b-4dd962dcb2f7\") " pod="openshift-service-ca/service-ca-bfc587fb7-pn8kd"
Apr 16 14:00:39.694735 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.694609 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lqldz\" (UniqueName: \"kubernetes.io/projected/79bd47ca-25a0-47cc-ae9b-4dd962dcb2f7-kube-api-access-lqldz\") pod \"service-ca-bfc587fb7-pn8kd\" (UID: \"79bd47ca-25a0-47cc-ae9b-4dd962dcb2f7\") " pod="openshift-service-ca/service-ca-bfc587fb7-pn8kd"
Apr 16 14:00:39.695218 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.695192
2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/79bd47ca-25a0-47cc-ae9b-4dd962dcb2f7-signing-cabundle\") pod \"service-ca-bfc587fb7-pn8kd\" (UID: \"79bd47ca-25a0-47cc-ae9b-4dd962dcb2f7\") " pod="openshift-service-ca/service-ca-bfc587fb7-pn8kd"
Apr 16 14:00:39.697084 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.697066 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/79bd47ca-25a0-47cc-ae9b-4dd962dcb2f7-signing-key\") pod \"service-ca-bfc587fb7-pn8kd\" (UID: \"79bd47ca-25a0-47cc-ae9b-4dd962dcb2f7\") " pod="openshift-service-ca/service-ca-bfc587fb7-pn8kd"
Apr 16 14:00:39.705876 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.705851 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lqldz\" (UniqueName: \"kubernetes.io/projected/79bd47ca-25a0-47cc-ae9b-4dd962dcb2f7-kube-api-access-lqldz\") pod \"service-ca-bfc587fb7-pn8kd\" (UID: \"79bd47ca-25a0-47cc-ae9b-4dd962dcb2f7\") " pod="openshift-service-ca/service-ca-bfc587fb7-pn8kd"
Apr 16 14:00:39.786959 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.786864 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-service-ca/service-ca-bfc587fb7-pn8kd"
Apr 16 14:00:39.908524 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.908490 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-bfc587fb7-pn8kd"]
Apr 16 14:00:39.912837 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:00:39.912807 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod79bd47ca_25a0_47cc_ae9b_4dd962dcb2f7.slice/crio-9f5b4b00e41759521f7237c36fbf9dc539d7f733207e707cafb28d352db6de70 WatchSource:0}: Error finding container 9f5b4b00e41759521f7237c36fbf9dc539d7f733207e707cafb28d352db6de70: Status 404 returned error can't find the container with id 9f5b4b00e41759521f7237c36fbf9dc539d7f733207e707cafb28d352db6de70
Apr 16 14:00:39.996937 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.996890 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert\") pod \"ingress-canary-wc4m2\" (UID: \"2d037ded-fc00-41e0-b31f-c9fb98bdc629\") " pod="openshift-ingress-canary/ingress-canary-wc4m2"
Apr 16 14:00:39.997140 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.996952 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls\") pod \"dns-default-smqtz\" (UID: \"6bb67241-6874-4040-a810-80b829751cf9\") " pod="openshift-dns/dns-default-smqtz"
Apr 16 14:00:39.997140 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.996985 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs\") pod \"network-metrics-daemon-29pd4\" (UID: \"e6284f77-08e3-4846-904d-6a21f10707ae\") "
pod="openshift-multus/network-metrics-daemon-29pd4"
Apr 16 14:00:39.997140 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:39.997049 2570 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 16 14:00:39.997140 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:39.997101 2570 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 16 14:00:39.997140 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:39.997121 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert podName:2d037ded-fc00-41e0-b31f-c9fb98bdc629 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:11.997099713 +0000 UTC m=+98.202742858 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert") pod "ingress-canary-wc4m2" (UID: "2d037ded-fc00-41e0-b31f-c9fb98bdc629") : secret "canary-serving-cert" not found
Apr 16 14:00:39.997334 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:39.997157 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls podName:6bb67241-6874-4040-a810-80b829751cf9 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:11.997139526 +0000 UTC m=+98.202782670 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls") pod "dns-default-smqtz" (UID: "6bb67241-6874-4040-a810-80b829751cf9") : secret "dns-default-metrics-tls" not found
Apr 16 14:00:39.999478 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:39.999461 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 16 14:00:40.008109 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:40.008083 2570 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 16 14:00:40.008227 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:40.008140 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs podName:e6284f77-08e3-4846-904d-6a21f10707ae nodeName:}" failed. No retries permitted until 2026-04-16 14:01:44.008124854 +0000 UTC m=+130.213767995 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs") pod "network-metrics-daemon-29pd4" (UID: "e6284f77-08e3-4846-904d-6a21f10707ae") : secret "metrics-daemon-secret" not found
Apr 16 14:00:40.198952 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:40.198919 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n62vn\" (UniqueName: \"kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn\") pod \"network-check-target-88c5t\" (UID: \"5a916223-1676-42c3-a13e-815b7355eb26\") " pod="openshift-network-diagnostics/network-check-target-88c5t"
Apr 16 14:00:40.201409 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:40.201388 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 16 14:00:40.212508 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:40.212487 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 16 14:00:40.223087 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:40.223059 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n62vn\" (UniqueName: \"kubernetes.io/projected/5a916223-1676-42c3-a13e-815b7355eb26-kube-api-access-n62vn\") pod \"network-check-target-88c5t\" (UID: \"5a916223-1676-42c3-a13e-815b7355eb26\") " pod="openshift-network-diagnostics/network-check-target-88c5t"
Apr 16 14:00:40.426487 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:40.426457 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fjvc5\""
Apr 16 14:00:40.434352 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:40.434328 2570 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-88c5t"
Apr 16 14:00:40.555667 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:40.555633 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-88c5t"]
Apr 16 14:00:40.558999 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:00:40.558973 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a916223_1676_42c3_a13e_815b7355eb26.slice/crio-305efe7306b744f36b6b4bd3e6776eee0aa70f9d71df04691474115532589a5b WatchSource:0}: Error finding container 305efe7306b744f36b6b4bd3e6776eee0aa70f9d71df04691474115532589a5b: Status 404 returned error can't find the container with id 305efe7306b744f36b6b4bd3e6776eee0aa70f9d71df04691474115532589a5b
Apr 16 14:00:40.586474 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:40.586433 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-pn8kd" event={"ID":"79bd47ca-25a0-47cc-ae9b-4dd962dcb2f7","Type":"ContainerStarted","Data":"9f5b4b00e41759521f7237c36fbf9dc539d7f733207e707cafb28d352db6de70"}
Apr 16 14:00:40.587367 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:40.587347 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-88c5t" event={"ID":"5a916223-1676-42c3-a13e-815b7355eb26","Type":"ContainerStarted","Data":"305efe7306b744f36b6b4bd3e6776eee0aa70f9d71df04691474115532589a5b"}
Apr 16 14:00:42.595471 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:42.595431 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-bfc587fb7-pn8kd" event={"ID":"79bd47ca-25a0-47cc-ae9b-4dd962dcb2f7","Type":"ContainerStarted","Data":"a34a599147b764486a5d5ae985a44c12efcd0293312754645a84a4fe2e0aab90"}
Apr 16 14:00:42.613763 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:42.613714 2570
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-bfc587fb7-pn8kd" podStartSLOduration=1.575606445 podStartE2EDuration="3.613698669s" podCreationTimestamp="2026-04-16 14:00:39 +0000 UTC" firstStartedPulling="2026-04-16 14:00:39.91487184 +0000 UTC m=+66.120514995" lastFinishedPulling="2026-04-16 14:00:41.952964078 +0000 UTC m=+68.158607219" observedRunningTime="2026-04-16 14:00:42.612869165 +0000 UTC m=+68.818512360" watchObservedRunningTime="2026-04-16 14:00:42.613698669 +0000 UTC m=+68.819341859"
Apr 16 14:00:46.652689 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:46.652258 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fcd418a5-48d2-4f13-a35b-28504fb6ca61-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-7dn56\" (UID: \"fcd418a5-48d2-4f13-a35b-28504fb6ca61\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56"
Apr 16 14:00:46.652689 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:46.652428 2570 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 16 14:00:46.652689 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:46.652512 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fcd418a5-48d2-4f13-a35b-28504fb6ca61-cluster-monitoring-operator-tls podName:fcd418a5-48d2-4f13-a35b-28504fb6ca61 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:02.652491652 +0000 UTC m=+88.858134800 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/fcd418a5-48d2-4f13-a35b-28504fb6ca61-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-6667474d89-7dn56" (UID: "fcd418a5-48d2-4f13-a35b-28504fb6ca61") : secret "cluster-monitoring-operator-tls" not found
Apr 16 14:00:47.612739 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:47.612651 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-88c5t" event={"ID":"5a916223-1676-42c3-a13e-815b7355eb26","Type":"ContainerStarted","Data":"5ea816646884fbd0a6ca9546255816a68fbced425d248add58f222608ba7a5a2"}
Apr 16 14:00:47.612876 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:47.612769 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-88c5t"
Apr 16 14:00:47.627756 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:47.627694 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-88c5t" podStartSLOduration=66.946406001 podStartE2EDuration="1m13.627681639s" podCreationTimestamp="2026-04-16 13:59:34 +0000 UTC" firstStartedPulling="2026-04-16 14:00:40.561027055 +0000 UTC m=+66.766670196" lastFinishedPulling="2026-04-16 14:00:47.242302677 +0000 UTC m=+73.447945834" observedRunningTime="2026-04-16 14:00:47.62745663 +0000 UTC m=+73.833099796" watchObservedRunningTime="2026-04-16 14:00:47.627681639 +0000 UTC m=+73.833324802"
Apr 16 14:00:59.627320 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.627276 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-57878bf6fb-j55r4"]
Apr 16 14:00:59.631998 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.631974 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-474qr"]
Apr 16 14:00:59.632153
ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.632134 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57878bf6fb-j55r4"
Apr 16 14:00:59.634548 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.634504 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 16 14:00:59.634548 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.634505 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 16 14:00:59.634873 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.634854 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 16 14:00:59.635005 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.634988 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-474qr"
Apr 16 14:00:59.635384 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.635366 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-khk5g\""
Apr 16 14:00:59.637757 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.637737 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\""
Apr 16 14:00:59.637951 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.637929 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-gffh6\""
Apr 16 14:00:59.638054 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.637930 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\""
Apr 16
14:00:59.643680 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.643661 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 16 14:00:59.647376 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.647353 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-57878bf6fb-j55r4"]
Apr 16 14:00:59.650429 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.650369 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-474qr"]
Apr 16 14:00:59.651216 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.651198 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6fksf"]
Apr 16 14:00:59.654346 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.654326 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6fksf"
Apr 16 14:00:59.656560 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.656515 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 16 14:00:59.656655 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.656575 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-sdpvr\""
Apr 16 14:00:59.656655 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.656577 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 16 14:00:59.664576 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.664523 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6fksf"]
Apr 16 14:00:59.689229 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.689193 2570 kubelet.go:2553] "SyncLoop
DELETE" source="api" pods=["openshift-image-registry/image-registry-57878bf6fb-j55r4"]
Apr 16 14:00:59.689399 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:00:59.689374 2570 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[bound-sa-token ca-trust-extracted image-registry-private-configuration installation-pull-secrets kube-api-access-fslq4 registry-certificates registry-tls trusted-ca], unattached volumes=[], failed to process volumes=[bound-sa-token ca-trust-extracted image-registry-private-configuration installation-pull-secrets kube-api-access-fslq4 registry-certificates registry-tls trusted-ca]: context canceled" pod="openshift-image-registry/image-registry-57878bf6fb-j55r4" podUID="cf3196a9-e94b-48db-be40-84858772dac6"
Apr 16 14:00:59.753042 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.753007 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cf3196a9-e94b-48db-be40-84858772dac6-image-registry-private-configuration\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4"
Apr 16 14:00:59.753042 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.753049 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cf3196a9-e94b-48db-be40-84858772dac6-registry-certificates\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4"
Apr 16 14:00:59.753290 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.753069 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName:
\"kubernetes.io/secret/cf3196a9-e94b-48db-be40-84858772dac6-installation-pull-secrets\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4"
Apr 16 14:00:59.753290 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.753122 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/75776d76-58e8-483c-ae65-df6ee9cfa222-crio-socket\") pod \"insights-runtime-extractor-6fksf\" (UID: \"75776d76-58e8-483c-ae65-df6ee9cfa222\") " pod="openshift-insights/insights-runtime-extractor-6fksf"
Apr 16 14:00:59.753290 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.753168 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/64d4de55-cfd8-4588-af6f-f5d27ec26b16-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-474qr\" (UID: \"64d4de55-cfd8-4588-af6f-f5d27ec26b16\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-474qr"
Apr 16 14:00:59.753290 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.753188 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fslq4\" (UniqueName: \"kubernetes.io/projected/cf3196a9-e94b-48db-be40-84858772dac6-kube-api-access-fslq4\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4"
Apr 16 14:00:59.753290 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.753261 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/64d4de55-cfd8-4588-af6f-f5d27ec26b16-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-474qr\" (UID:
\"64d4de55-cfd8-4588-af6f-f5d27ec26b16\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-474qr"
Apr 16 14:00:59.753516 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.753299 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sgqq\" (UniqueName: \"kubernetes.io/projected/75776d76-58e8-483c-ae65-df6ee9cfa222-kube-api-access-6sgqq\") pod \"insights-runtime-extractor-6fksf\" (UID: \"75776d76-58e8-483c-ae65-df6ee9cfa222\") " pod="openshift-insights/insights-runtime-extractor-6fksf"
Apr 16 14:00:59.753516 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.753350 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/75776d76-58e8-483c-ae65-df6ee9cfa222-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6fksf\" (UID: \"75776d76-58e8-483c-ae65-df6ee9cfa222\") " pod="openshift-insights/insights-runtime-extractor-6fksf"
Apr 16 14:00:59.753516 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.753379 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/75776d76-58e8-483c-ae65-df6ee9cfa222-data-volume\") pod \"insights-runtime-extractor-6fksf\" (UID: \"75776d76-58e8-483c-ae65-df6ee9cfa222\") " pod="openshift-insights/insights-runtime-extractor-6fksf"
Apr 16 14:00:59.753516 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.753400 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf3196a9-e94b-48db-be40-84858772dac6-bound-sa-token\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4"
Apr 16 14:00:59.753516 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.753447
2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cf3196a9-e94b-48db-be40-84858772dac6-ca-trust-extracted\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4"
Apr 16 14:00:59.753516 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.753478 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf3196a9-e94b-48db-be40-84858772dac6-trusted-ca\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4"
Apr 16 14:00:59.753516 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.753507 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf3196a9-e94b-48db-be40-84858772dac6-registry-tls\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4"
Apr 16 14:00:59.753782 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.753559 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/75776d76-58e8-483c-ae65-df6ee9cfa222-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6fksf\" (UID: \"75776d76-58e8-483c-ae65-df6ee9cfa222\") " pod="openshift-insights/insights-runtime-extractor-6fksf"
Apr 16 14:00:59.854419 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.854380 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName:
\"kubernetes.io/secret/64d4de55-cfd8-4588-af6f-f5d27ec26b16-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-474qr\" (UID: \"64d4de55-cfd8-4588-af6f-f5d27ec26b16\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-474qr" Apr 16 14:00:59.854419 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.854422 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fslq4\" (UniqueName: \"kubernetes.io/projected/cf3196a9-e94b-48db-be40-84858772dac6-kube-api-access-fslq4\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4" Apr 16 14:00:59.854661 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.854604 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/64d4de55-cfd8-4588-af6f-f5d27ec26b16-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-474qr\" (UID: \"64d4de55-cfd8-4588-af6f-f5d27ec26b16\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-474qr" Apr 16 14:00:59.854661 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.854645 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6sgqq\" (UniqueName: \"kubernetes.io/projected/75776d76-58e8-483c-ae65-df6ee9cfa222-kube-api-access-6sgqq\") pod \"insights-runtime-extractor-6fksf\" (UID: \"75776d76-58e8-483c-ae65-df6ee9cfa222\") " pod="openshift-insights/insights-runtime-extractor-6fksf" Apr 16 14:00:59.854737 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.854679 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/75776d76-58e8-483c-ae65-df6ee9cfa222-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6fksf\" (UID: \"75776d76-58e8-483c-ae65-df6ee9cfa222\") " 
pod="openshift-insights/insights-runtime-extractor-6fksf" Apr 16 14:00:59.854737 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.854722 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/75776d76-58e8-483c-ae65-df6ee9cfa222-data-volume\") pod \"insights-runtime-extractor-6fksf\" (UID: \"75776d76-58e8-483c-ae65-df6ee9cfa222\") " pod="openshift-insights/insights-runtime-extractor-6fksf" Apr 16 14:00:59.854828 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.854757 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf3196a9-e94b-48db-be40-84858772dac6-bound-sa-token\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4" Apr 16 14:00:59.854881 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.854864 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cf3196a9-e94b-48db-be40-84858772dac6-ca-trust-extracted\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4" Apr 16 14:00:59.854934 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.854916 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf3196a9-e94b-48db-be40-84858772dac6-trusted-ca\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4" Apr 16 14:00:59.854985 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.854953 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/cf3196a9-e94b-48db-be40-84858772dac6-registry-tls\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4" Apr 16 14:00:59.855038 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.854982 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/75776d76-58e8-483c-ae65-df6ee9cfa222-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6fksf\" (UID: \"75776d76-58e8-483c-ae65-df6ee9cfa222\") " pod="openshift-insights/insights-runtime-extractor-6fksf" Apr 16 14:00:59.855095 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.855045 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cf3196a9-e94b-48db-be40-84858772dac6-image-registry-private-configuration\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4" Apr 16 14:00:59.855146 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.855101 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cf3196a9-e94b-48db-be40-84858772dac6-registry-certificates\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4" Apr 16 14:00:59.855146 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.855132 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cf3196a9-e94b-48db-be40-84858772dac6-installation-pull-secrets\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " 
pod="openshift-image-registry/image-registry-57878bf6fb-j55r4" Apr 16 14:00:59.855243 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.855150 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/75776d76-58e8-483c-ae65-df6ee9cfa222-data-volume\") pod \"insights-runtime-extractor-6fksf\" (UID: \"75776d76-58e8-483c-ae65-df6ee9cfa222\") " pod="openshift-insights/insights-runtime-extractor-6fksf" Apr 16 14:00:59.855243 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.855161 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/75776d76-58e8-483c-ae65-df6ee9cfa222-crio-socket\") pod \"insights-runtime-extractor-6fksf\" (UID: \"75776d76-58e8-483c-ae65-df6ee9cfa222\") " pod="openshift-insights/insights-runtime-extractor-6fksf" Apr 16 14:00:59.855326 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.855265 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cf3196a9-e94b-48db-be40-84858772dac6-ca-trust-extracted\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4" Apr 16 14:00:59.855372 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.855356 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/64d4de55-cfd8-4588-af6f-f5d27ec26b16-nginx-conf\") pod \"networking-console-plugin-5cb6cf4cb4-474qr\" (UID: \"64d4de55-cfd8-4588-af6f-f5d27ec26b16\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-474qr" Apr 16 14:00:59.855456 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.855429 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/75776d76-58e8-483c-ae65-df6ee9cfa222-crio-socket\") pod \"insights-runtime-extractor-6fksf\" (UID: \"75776d76-58e8-483c-ae65-df6ee9cfa222\") " pod="openshift-insights/insights-runtime-extractor-6fksf" Apr 16 14:00:59.855843 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.855822 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/75776d76-58e8-483c-ae65-df6ee9cfa222-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6fksf\" (UID: \"75776d76-58e8-483c-ae65-df6ee9cfa222\") " pod="openshift-insights/insights-runtime-extractor-6fksf" Apr 16 14:00:59.856692 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.856668 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cf3196a9-e94b-48db-be40-84858772dac6-registry-certificates\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4" Apr 16 14:00:59.856791 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.856730 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf3196a9-e94b-48db-be40-84858772dac6-trusted-ca\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4" Apr 16 14:00:59.857693 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.857669 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/64d4de55-cfd8-4588-af6f-f5d27ec26b16-networking-console-plugin-cert\") pod \"networking-console-plugin-5cb6cf4cb4-474qr\" (UID: \"64d4de55-cfd8-4588-af6f-f5d27ec26b16\") " pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-474qr" Apr 16 14:00:59.857784 
ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.857740 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/75776d76-58e8-483c-ae65-df6ee9cfa222-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6fksf\" (UID: \"75776d76-58e8-483c-ae65-df6ee9cfa222\") " pod="openshift-insights/insights-runtime-extractor-6fksf" Apr 16 14:00:59.858354 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.858318 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cf3196a9-e94b-48db-be40-84858772dac6-installation-pull-secrets\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4" Apr 16 14:00:59.858658 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.858640 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cf3196a9-e94b-48db-be40-84858772dac6-image-registry-private-configuration\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4" Apr 16 14:00:59.858745 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.858728 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf3196a9-e94b-48db-be40-84858772dac6-registry-tls\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4" Apr 16 14:00:59.863017 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.862991 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sgqq\" (UniqueName: 
\"kubernetes.io/projected/75776d76-58e8-483c-ae65-df6ee9cfa222-kube-api-access-6sgqq\") pod \"insights-runtime-extractor-6fksf\" (UID: \"75776d76-58e8-483c-ae65-df6ee9cfa222\") " pod="openshift-insights/insights-runtime-extractor-6fksf" Apr 16 14:00:59.863106 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.863043 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fslq4\" (UniqueName: \"kubernetes.io/projected/cf3196a9-e94b-48db-be40-84858772dac6-kube-api-access-fslq4\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4" Apr 16 14:00:59.863243 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.863227 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf3196a9-e94b-48db-be40-84858772dac6-bound-sa-token\") pod \"image-registry-57878bf6fb-j55r4\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " pod="openshift-image-registry/image-registry-57878bf6fb-j55r4" Apr 16 14:00:59.950246 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.950205 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-474qr" Apr 16 14:00:59.963841 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:00:59.963816 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6fksf" Apr 16 14:01:00.087093 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.087057 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cb6cf4cb4-474qr"] Apr 16 14:01:00.090146 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:01:00.090119 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64d4de55_cfd8_4588_af6f_f5d27ec26b16.slice/crio-de319812561dfd8e2235dbb20a541b07cae7a63cd68f3fca9c36737d5881d87a WatchSource:0}: Error finding container de319812561dfd8e2235dbb20a541b07cae7a63cd68f3fca9c36737d5881d87a: Status 404 returned error can't find the container with id de319812561dfd8e2235dbb20a541b07cae7a63cd68f3fca9c36737d5881d87a Apr 16 14:01:00.105697 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.105666 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6fksf"] Apr 16 14:01:00.109027 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:01:00.108997 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75776d76_58e8_483c_ae65_df6ee9cfa222.slice/crio-11506a59ab8e9184fed3b1222a9356dbdbcfde34b995663d519aeac348bfa969 WatchSource:0}: Error finding container 11506a59ab8e9184fed3b1222a9356dbdbcfde34b995663d519aeac348bfa969: Status 404 returned error can't find the container with id 11506a59ab8e9184fed3b1222a9356dbdbcfde34b995663d519aeac348bfa969 Apr 16 14:01:00.646512 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.646468 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-474qr" event={"ID":"64d4de55-cfd8-4588-af6f-f5d27ec26b16","Type":"ContainerStarted","Data":"de319812561dfd8e2235dbb20a541b07cae7a63cd68f3fca9c36737d5881d87a"} Apr 16 14:01:00.647805 
ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.647776 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57878bf6fb-j55r4" Apr 16 14:01:00.647805 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.647780 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6fksf" event={"ID":"75776d76-58e8-483c-ae65-df6ee9cfa222","Type":"ContainerStarted","Data":"5a888cee4f967845efb4bbd2227551440ab1db2dee1ffd378b253779ad23d8c2"} Apr 16 14:01:00.647938 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.647814 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6fksf" event={"ID":"75776d76-58e8-483c-ae65-df6ee9cfa222","Type":"ContainerStarted","Data":"11506a59ab8e9184fed3b1222a9356dbdbcfde34b995663d519aeac348bfa969"} Apr 16 14:01:00.651673 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.651656 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-57878bf6fb-j55r4" Apr 16 14:01:00.763189 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.763145 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf3196a9-e94b-48db-be40-84858772dac6-bound-sa-token\") pod \"cf3196a9-e94b-48db-be40-84858772dac6\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " Apr 16 14:01:00.763189 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.763193 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fslq4\" (UniqueName: \"kubernetes.io/projected/cf3196a9-e94b-48db-be40-84858772dac6-kube-api-access-fslq4\") pod \"cf3196a9-e94b-48db-be40-84858772dac6\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " Apr 16 14:01:00.763392 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.763217 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cf3196a9-e94b-48db-be40-84858772dac6-registry-certificates\") pod \"cf3196a9-e94b-48db-be40-84858772dac6\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " Apr 16 14:01:00.763392 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.763244 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cf3196a9-e94b-48db-be40-84858772dac6-image-registry-private-configuration\") pod \"cf3196a9-e94b-48db-be40-84858772dac6\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " Apr 16 14:01:00.763392 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.763291 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cf3196a9-e94b-48db-be40-84858772dac6-ca-trust-extracted\") pod \"cf3196a9-e94b-48db-be40-84858772dac6\" (UID: 
\"cf3196a9-e94b-48db-be40-84858772dac6\") " Apr 16 14:01:00.763392 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.763322 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf3196a9-e94b-48db-be40-84858772dac6-registry-tls\") pod \"cf3196a9-e94b-48db-be40-84858772dac6\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " Apr 16 14:01:00.763392 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.763359 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cf3196a9-e94b-48db-be40-84858772dac6-installation-pull-secrets\") pod \"cf3196a9-e94b-48db-be40-84858772dac6\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " Apr 16 14:01:00.763392 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.763374 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf3196a9-e94b-48db-be40-84858772dac6-trusted-ca\") pod \"cf3196a9-e94b-48db-be40-84858772dac6\" (UID: \"cf3196a9-e94b-48db-be40-84858772dac6\") " Apr 16 14:01:00.768882 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.768834 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cf3196a9-e94b-48db-be40-84858772dac6-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "cf3196a9-e94b-48db-be40-84858772dac6" (UID: "cf3196a9-e94b-48db-be40-84858772dac6"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:01:00.768996 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.768933 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf3196a9-e94b-48db-be40-84858772dac6-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "cf3196a9-e94b-48db-be40-84858772dac6" (UID: "cf3196a9-e94b-48db-be40-84858772dac6"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:01:00.769056 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.768986 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf3196a9-e94b-48db-be40-84858772dac6-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "cf3196a9-e94b-48db-be40-84858772dac6" (UID: "cf3196a9-e94b-48db-be40-84858772dac6"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:01:00.770884 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.770854 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf3196a9-e94b-48db-be40-84858772dac6-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "cf3196a9-e94b-48db-be40-84858772dac6" (UID: "cf3196a9-e94b-48db-be40-84858772dac6"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:01:00.771180 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.771152 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf3196a9-e94b-48db-be40-84858772dac6-kube-api-access-fslq4" (OuterVolumeSpecName: "kube-api-access-fslq4") pod "cf3196a9-e94b-48db-be40-84858772dac6" (UID: "cf3196a9-e94b-48db-be40-84858772dac6"). InnerVolumeSpecName "kube-api-access-fslq4". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:01:00.771180 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.771165 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf3196a9-e94b-48db-be40-84858772dac6-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "cf3196a9-e94b-48db-be40-84858772dac6" (UID: "cf3196a9-e94b-48db-be40-84858772dac6"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:01:00.771311 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.771235 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf3196a9-e94b-48db-be40-84858772dac6-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "cf3196a9-e94b-48db-be40-84858772dac6" (UID: "cf3196a9-e94b-48db-be40-84858772dac6"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:01:00.771311 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.771233 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cf3196a9-e94b-48db-be40-84858772dac6-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "cf3196a9-e94b-48db-be40-84858772dac6" (UID: "cf3196a9-e94b-48db-be40-84858772dac6"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:01:00.865104 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.865074 2570 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/cf3196a9-e94b-48db-be40-84858772dac6-installation-pull-secrets\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:01:00.865230 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.865110 2570 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf3196a9-e94b-48db-be40-84858772dac6-trusted-ca\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:01:00.865230 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.865127 2570 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf3196a9-e94b-48db-be40-84858772dac6-bound-sa-token\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:01:00.865230 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.865142 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fslq4\" (UniqueName: \"kubernetes.io/projected/cf3196a9-e94b-48db-be40-84858772dac6-kube-api-access-fslq4\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:01:00.865230 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.865157 2570 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/cf3196a9-e94b-48db-be40-84858772dac6-registry-certificates\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:01:00.865230 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.865172 2570 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/cf3196a9-e94b-48db-be40-84858772dac6-image-registry-private-configuration\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath 
\"\"" Apr 16 14:01:00.865230 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.865187 2570 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/cf3196a9-e94b-48db-be40-84858772dac6-ca-trust-extracted\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:01:00.865230 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:00.865202 2570 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/cf3196a9-e94b-48db-be40-84858772dac6-registry-tls\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:01:01.651302 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:01.651276 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-57878bf6fb-j55r4" Apr 16 14:01:01.651719 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:01.651276 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6fksf" event={"ID":"75776d76-58e8-483c-ae65-df6ee9cfa222","Type":"ContainerStarted","Data":"83751ef56537d77e3d56b61d2c8dc484cfb192fe94152355337cf2e78d68f018"} Apr 16 14:01:01.689016 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:01.688987 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-57878bf6fb-j55r4"] Apr 16 14:01:01.693245 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:01.693216 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-57878bf6fb-j55r4"] Apr 16 14:01:02.309022 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:02.308990 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf3196a9-e94b-48db-be40-84858772dac6" path="/var/lib/kubelet/pods/cf3196a9-e94b-48db-be40-84858772dac6/volumes" Apr 16 14:01:02.655205 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:02.655164 2570 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-474qr" event={"ID":"64d4de55-cfd8-4588-af6f-f5d27ec26b16","Type":"ContainerStarted","Data":"1814b30e1617dbe884057f73abbc6079ab94a2151144ad0af1378b220a9f6af7"} Apr 16 14:01:02.671261 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:02.671208 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cb6cf4cb4-474qr" podStartSLOduration=2.148635623 podStartE2EDuration="3.671190118s" podCreationTimestamp="2026-04-16 14:00:59 +0000 UTC" firstStartedPulling="2026-04-16 14:01:00.092059712 +0000 UTC m=+86.297702859" lastFinishedPulling="2026-04-16 14:01:01.614614207 +0000 UTC m=+87.820257354" observedRunningTime="2026-04-16 14:01:02.670378712 +0000 UTC m=+88.876021893" watchObservedRunningTime="2026-04-16 14:01:02.671190118 +0000 UTC m=+88.876833283" Apr 16 14:01:02.681808 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:02.681750 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fcd418a5-48d2-4f13-a35b-28504fb6ca61-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-7dn56\" (UID: \"fcd418a5-48d2-4f13-a35b-28504fb6ca61\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56" Apr 16 14:01:02.684609 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:02.684585 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/fcd418a5-48d2-4f13-a35b-28504fb6ca61-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-6667474d89-7dn56\" (UID: \"fcd418a5-48d2-4f13-a35b-28504fb6ca61\") " pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56" Apr 16 14:01:02.705204 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:02.705176 2570 reflector.go:430] "Caches populated" 
type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-xslx9\""
Apr 16 14:01:02.713875 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:02.713848 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56"
Apr 16 14:01:02.875025 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:02.874994 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56"]
Apr 16 14:01:02.878153 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:01:02.878124 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcd418a5_48d2_4f13_a35b_28504fb6ca61.slice/crio-69ae13907033cbb7119dccd2ddd26af8f5c4b564c701f48990202443f95d1150 WatchSource:0}: Error finding container 69ae13907033cbb7119dccd2ddd26af8f5c4b564c701f48990202443f95d1150: Status 404 returned error can't find the container with id 69ae13907033cbb7119dccd2ddd26af8f5c4b564c701f48990202443f95d1150
Apr 16 14:01:03.662391 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:03.662350 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6fksf" event={"ID":"75776d76-58e8-483c-ae65-df6ee9cfa222","Type":"ContainerStarted","Data":"4a8463e28affdcc359a5ccb3c1f0df0060b50be437e2600c0ebef263d54a3de2"}
Apr 16 14:01:03.663467 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:03.663444 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56" event={"ID":"fcd418a5-48d2-4f13-a35b-28504fb6ca61","Type":"ContainerStarted","Data":"69ae13907033cbb7119dccd2ddd26af8f5c4b564c701f48990202443f95d1150"}
Apr 16 14:01:03.679489 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:03.679308 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6fksf" podStartSLOduration=2.106300224 podStartE2EDuration="4.679294616s" podCreationTimestamp="2026-04-16 14:00:59 +0000 UTC" firstStartedPulling="2026-04-16 14:01:00.170024081 +0000 UTC m=+86.375667235" lastFinishedPulling="2026-04-16 14:01:02.743018479 +0000 UTC m=+88.948661627" observedRunningTime="2026-04-16 14:01:03.679124516 +0000 UTC m=+89.884767680" watchObservedRunningTime="2026-04-16 14:01:03.679294616 +0000 UTC m=+89.884937799"
Apr 16 14:01:05.670702 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:05.670661 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56" event={"ID":"fcd418a5-48d2-4f13-a35b-28504fb6ca61","Type":"ContainerStarted","Data":"a8ed07421f1e7200ed27cd06ca8d5355c9bfb43c8822333fd62f1b36e058c488"}
Apr 16 14:01:08.634467 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:08.634411 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-6667474d89-7dn56" podStartSLOduration=36.625226407 podStartE2EDuration="38.634397071s" podCreationTimestamp="2026-04-16 14:00:30 +0000 UTC" firstStartedPulling="2026-04-16 14:01:02.880127114 +0000 UTC m=+89.085770256" lastFinishedPulling="2026-04-16 14:01:04.889297778 +0000 UTC m=+91.094940920" observedRunningTime="2026-04-16 14:01:05.695732378 +0000 UTC m=+91.901375541" watchObservedRunningTime="2026-04-16 14:01:08.634397071 +0000 UTC m=+94.840040235"
Apr 16 14:01:08.635454 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:08.635428 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-fhx25"]
Apr 16 14:01:08.663928 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:08.663877 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-fhx25"]
Apr 16 14:01:08.664103 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:08.664033 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-fhx25"
Apr 16 14:01:08.669898 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:08.669852 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 16 14:01:08.669898 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:08.669861 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 16 14:01:08.670224 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:08.669929 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-b4rjn\""
Apr 16 14:01:08.670224 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:08.669996 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 16 14:01:08.734663 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:08.734611 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/df061447-182f-4f24-a0d0-95178339d48f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-fhx25\" (UID: \"df061447-182f-4f24-a0d0-95178339d48f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-fhx25"
Apr 16 14:01:08.734859 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:08.734701 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/df061447-182f-4f24-a0d0-95178339d48f-metrics-client-ca\") pod \"prometheus-operator-78f957474d-fhx25\" (UID: \"df061447-182f-4f24-a0d0-95178339d48f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-fhx25"
Apr 16 14:01:08.734859 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:08.734732 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlt4l\" (UniqueName: \"kubernetes.io/projected/df061447-182f-4f24-a0d0-95178339d48f-kube-api-access-vlt4l\") pod \"prometheus-operator-78f957474d-fhx25\" (UID: \"df061447-182f-4f24-a0d0-95178339d48f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-fhx25"
Apr 16 14:01:08.734859 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:08.734842 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/df061447-182f-4f24-a0d0-95178339d48f-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-fhx25\" (UID: \"df061447-182f-4f24-a0d0-95178339d48f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-fhx25"
Apr 16 14:01:08.835635 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:08.835578 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/df061447-182f-4f24-a0d0-95178339d48f-metrics-client-ca\") pod \"prometheus-operator-78f957474d-fhx25\" (UID: \"df061447-182f-4f24-a0d0-95178339d48f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-fhx25"
Apr 16 14:01:08.835842 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:08.835650 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlt4l\" (UniqueName: \"kubernetes.io/projected/df061447-182f-4f24-a0d0-95178339d48f-kube-api-access-vlt4l\") pod \"prometheus-operator-78f957474d-fhx25\" (UID: \"df061447-182f-4f24-a0d0-95178339d48f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-fhx25"
Apr 16 14:01:08.835842 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:08.835706 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/df061447-182f-4f24-a0d0-95178339d48f-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-fhx25\" (UID: \"df061447-182f-4f24-a0d0-95178339d48f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-fhx25"
Apr 16 14:01:08.835842 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:01:08.835833 2570 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 16 14:01:08.835978 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:01:08.835895 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/df061447-182f-4f24-a0d0-95178339d48f-prometheus-operator-tls podName:df061447-182f-4f24-a0d0-95178339d48f nodeName:}" failed. No retries permitted until 2026-04-16 14:01:09.335875006 +0000 UTC m=+95.541518170 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/df061447-182f-4f24-a0d0-95178339d48f-prometheus-operator-tls") pod "prometheus-operator-78f957474d-fhx25" (UID: "df061447-182f-4f24-a0d0-95178339d48f") : secret "prometheus-operator-tls" not found
Apr 16 14:01:08.835978 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:08.835914 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/df061447-182f-4f24-a0d0-95178339d48f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-fhx25\" (UID: \"df061447-182f-4f24-a0d0-95178339d48f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-fhx25"
Apr 16 14:01:08.836271 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:08.836247 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/df061447-182f-4f24-a0d0-95178339d48f-metrics-client-ca\") pod \"prometheus-operator-78f957474d-fhx25\" (UID: \"df061447-182f-4f24-a0d0-95178339d48f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-fhx25"
Apr 16 14:01:08.838446 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:08.838413 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/df061447-182f-4f24-a0d0-95178339d48f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-78f957474d-fhx25\" (UID: \"df061447-182f-4f24-a0d0-95178339d48f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-fhx25"
Apr 16 14:01:08.845106 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:08.845077 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlt4l\" (UniqueName: \"kubernetes.io/projected/df061447-182f-4f24-a0d0-95178339d48f-kube-api-access-vlt4l\") pod \"prometheus-operator-78f957474d-fhx25\" (UID: \"df061447-182f-4f24-a0d0-95178339d48f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-fhx25"
Apr 16 14:01:09.340845 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:09.340801 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/df061447-182f-4f24-a0d0-95178339d48f-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-fhx25\" (UID: \"df061447-182f-4f24-a0d0-95178339d48f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-fhx25"
Apr 16 14:01:09.343352 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:09.343332 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/df061447-182f-4f24-a0d0-95178339d48f-prometheus-operator-tls\") pod \"prometheus-operator-78f957474d-fhx25\" (UID: \"df061447-182f-4f24-a0d0-95178339d48f\") " pod="openshift-monitoring/prometheus-operator-78f957474d-fhx25"
Apr 16 14:01:09.579157 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:09.579113 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-78f957474d-fhx25"
Apr 16 14:01:09.705235 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:09.705203 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-78f957474d-fhx25"]
Apr 16 14:01:09.708679 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:01:09.708653 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf061447_182f_4f24_a0d0_95178339d48f.slice/crio-b39d1e97521a506fbc1b3a04f6c38e63c47c738619580d3802036bda4b5d8e0a WatchSource:0}: Error finding container b39d1e97521a506fbc1b3a04f6c38e63c47c738619580d3802036bda4b5d8e0a: Status 404 returned error can't find the container with id b39d1e97521a506fbc1b3a04f6c38e63c47c738619580d3802036bda4b5d8e0a
Apr 16 14:01:10.684606 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:10.684572 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-fhx25" event={"ID":"df061447-182f-4f24-a0d0-95178339d48f","Type":"ContainerStarted","Data":"b39d1e97521a506fbc1b3a04f6c38e63c47c738619580d3802036bda4b5d8e0a"}
Apr 16 14:01:11.688522 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:11.688484 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-fhx25" event={"ID":"df061447-182f-4f24-a0d0-95178339d48f","Type":"ContainerStarted","Data":"6e9cb316f3f245d10825311aa6f302e2c4c3626bf2f6eeabd042b742703eb758"}
Apr 16 14:01:11.688522 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:11.688526 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-78f957474d-fhx25" event={"ID":"df061447-182f-4f24-a0d0-95178339d48f","Type":"ContainerStarted","Data":"0e088f1feee753a0eb93ed85dc40246853d807f723411ae33dbc4a7467c737ac"}
Apr 16 14:01:11.704574 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:11.704490 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-78f957474d-fhx25" podStartSLOduration=2.37587255 podStartE2EDuration="3.704471894s" podCreationTimestamp="2026-04-16 14:01:08 +0000 UTC" firstStartedPulling="2026-04-16 14:01:09.71089469 +0000 UTC m=+95.916537835" lastFinishedPulling="2026-04-16 14:01:11.039494037 +0000 UTC m=+97.245137179" observedRunningTime="2026-04-16 14:01:11.704022366 +0000 UTC m=+97.909665532" watchObservedRunningTime="2026-04-16 14:01:11.704471894 +0000 UTC m=+97.910115059"
Apr 16 14:01:12.063102 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:12.063004 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls\") pod \"dns-default-smqtz\" (UID: \"6bb67241-6874-4040-a810-80b829751cf9\") " pod="openshift-dns/dns-default-smqtz"
Apr 16 14:01:12.063102 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:12.063065 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert\") pod \"ingress-canary-wc4m2\" (UID: \"2d037ded-fc00-41e0-b31f-c9fb98bdc629\") " pod="openshift-ingress-canary/ingress-canary-wc4m2"
Apr 16 14:01:12.065597 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:12.065571 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/6bb67241-6874-4040-a810-80b829751cf9-metrics-tls\") pod \"dns-default-smqtz\" (UID: \"6bb67241-6874-4040-a810-80b829751cf9\") " pod="openshift-dns/dns-default-smqtz"
Apr 16 14:01:12.065733 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:12.065612 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/2d037ded-fc00-41e0-b31f-c9fb98bdc629-cert\") pod \"ingress-canary-wc4m2\" (UID: \"2d037ded-fc00-41e0-b31f-c9fb98bdc629\") " pod="openshift-ingress-canary/ingress-canary-wc4m2"
Apr 16 14:01:12.089683 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:12.089650 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wjgsd\""
Apr 16 14:01:12.095649 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:12.095626 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pt57w\""
Apr 16 14:01:12.098404 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:12.098385 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-smqtz"
Apr 16 14:01:12.104187 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:12.104162 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-wc4m2"
Apr 16 14:01:12.253405 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:12.253373 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-smqtz"]
Apr 16 14:01:12.256943 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:01:12.256915 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bb67241_6874_4040_a810_80b829751cf9.slice/crio-0133833e27daf89f64f07147aa3324ed9324da32a4d53f537a96f6c3dcc6d6fd WatchSource:0}: Error finding container 0133833e27daf89f64f07147aa3324ed9324da32a4d53f537a96f6c3dcc6d6fd: Status 404 returned error can't find the container with id 0133833e27daf89f64f07147aa3324ed9324da32a4d53f537a96f6c3dcc6d6fd
Apr 16 14:01:12.273807 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:12.273768 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-wc4m2"]
Apr 16 14:01:12.279720 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:01:12.279692 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d037ded_fc00_41e0_b31f_c9fb98bdc629.slice/crio-26f084080a9cb02d26928d8798e7573941be58d4deff89b41fd9fef8615e906a WatchSource:0}: Error finding container 26f084080a9cb02d26928d8798e7573941be58d4deff89b41fd9fef8615e906a: Status 404 returned error can't find the container with id 26f084080a9cb02d26928d8798e7573941be58d4deff89b41fd9fef8615e906a
Apr 16 14:01:12.693026 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:12.692968 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-smqtz" event={"ID":"6bb67241-6874-4040-a810-80b829751cf9","Type":"ContainerStarted","Data":"0133833e27daf89f64f07147aa3324ed9324da32a4d53f537a96f6c3dcc6d6fd"}
Apr 16 14:01:12.694660 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:12.693995 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wc4m2" event={"ID":"2d037ded-fc00-41e0-b31f-c9fb98bdc629","Type":"ContainerStarted","Data":"26f084080a9cb02d26928d8798e7573941be58d4deff89b41fd9fef8615e906a"}
Apr 16 14:01:14.006565 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.005688 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-g4rrg"]
Apr 16 14:01:14.009818 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.009786 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-g4rrg"
Apr 16 14:01:14.013796 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.013676 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-lwn4w\""
Apr 16 14:01:14.014213 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.014140 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\""
Apr 16 14:01:14.014213 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.014140 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\""
Apr 16 14:01:14.021673 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.021627 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-g4rrg"]
Apr 16 14:01:14.040765 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.040733 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-v86qq"]
Apr 16 14:01:14.044264 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.044244 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq"
Apr 16 14:01:14.054487 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.054442 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\""
Apr 16 14:01:14.054805 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.054498 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\""
Apr 16 14:01:14.054805 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.054688 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-hj8mv\""
Apr 16 14:01:14.055081 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.055059 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\""
Apr 16 14:01:14.061867 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.061839 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-v86qq"]
Apr 16 14:01:14.083039 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.083006 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-hmhjw"]
Apr 16 14:01:14.086131 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.086109 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-hmhjw"
Apr 16 14:01:14.088295 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.088274 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 16 14:01:14.088743 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.088723 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 16 14:01:14.088963 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.088937 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 16 14:01:14.089115 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.089099 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-w66d6\""
Apr 16 14:01:14.185969 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.185933 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3ac9f0e6-4ee7-4de1-85f7-67127085b819-root\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw"
Apr 16 14:01:14.186130 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.185984 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/244a82a3-51aa-45f2-a8fe-823723a2410e-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-v86qq\" (UID: \"244a82a3-51aa-45f2-a8fe-823723a2410e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq"
Apr 16 14:01:14.186130 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.186034 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/244a82a3-51aa-45f2-a8fe-823723a2410e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-v86qq\" (UID: \"244a82a3-51aa-45f2-a8fe-823723a2410e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq"
Apr 16 14:01:14.186130 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.186061 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3ac9f0e6-4ee7-4de1-85f7-67127085b819-sys\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw"
Apr 16 14:01:14.186130 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.186090 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3ac9f0e6-4ee7-4de1-85f7-67127085b819-node-exporter-accelerators-collector-config\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw"
Apr 16 14:01:14.186130 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.186116 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/244a82a3-51aa-45f2-a8fe-823723a2410e-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-v86qq\" (UID: \"244a82a3-51aa-45f2-a8fe-823723a2410e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq"
Apr 16 14:01:14.186345 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.186145 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46fa769b-4f39-41d5-8ac2-232f23bbcbdb-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-g4rrg\" (UID: \"46fa769b-4f39-41d5-8ac2-232f23bbcbdb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-g4rrg"
Apr 16 14:01:14.186345 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.186172 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqg9v\" (UniqueName: \"kubernetes.io/projected/46fa769b-4f39-41d5-8ac2-232f23bbcbdb-kube-api-access-bqg9v\") pod \"openshift-state-metrics-5669946b84-g4rrg\" (UID: \"46fa769b-4f39-41d5-8ac2-232f23bbcbdb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-g4rrg"
Apr 16 14:01:14.186345 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.186200 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3ac9f0e6-4ee7-4de1-85f7-67127085b819-metrics-client-ca\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw"
Apr 16 14:01:14.186345 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.186232 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46fa769b-4f39-41d5-8ac2-232f23bbcbdb-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-g4rrg\" (UID: \"46fa769b-4f39-41d5-8ac2-232f23bbcbdb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-g4rrg"
Apr 16 14:01:14.186345 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.186270 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3ac9f0e6-4ee7-4de1-85f7-67127085b819-node-exporter-tls\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw"
Apr 16 14:01:14.186345 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.186295 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbjz8\" (UniqueName: \"kubernetes.io/projected/3ac9f0e6-4ee7-4de1-85f7-67127085b819-kube-api-access-mbjz8\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw"
Apr 16 14:01:14.186345 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.186330 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/46fa769b-4f39-41d5-8ac2-232f23bbcbdb-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-g4rrg\" (UID: \"46fa769b-4f39-41d5-8ac2-232f23bbcbdb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-g4rrg"
Apr 16 14:01:14.186672 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.186369 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3ac9f0e6-4ee7-4de1-85f7-67127085b819-node-exporter-textfile\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw"
Apr 16 14:01:14.186672 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.186418 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3ac9f0e6-4ee7-4de1-85f7-67127085b819-node-exporter-wtmp\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw"
Apr 16 14:01:14.186672 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.186453 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/244a82a3-51aa-45f2-a8fe-823723a2410e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-v86qq\" (UID: \"244a82a3-51aa-45f2-a8fe-823723a2410e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq"
Apr 16 14:01:14.186672 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.186504 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3ac9f0e6-4ee7-4de1-85f7-67127085b819-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw"
Apr 16 14:01:14.186672 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.186544 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/244a82a3-51aa-45f2-a8fe-823723a2410e-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-v86qq\" (UID: \"244a82a3-51aa-45f2-a8fe-823723a2410e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq"
Apr 16 14:01:14.186672 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.186573 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk9zs\" (UniqueName: \"kubernetes.io/projected/244a82a3-51aa-45f2-a8fe-823723a2410e-kube-api-access-gk9zs\") pod \"kube-state-metrics-7479c89684-v86qq\" (UID: \"244a82a3-51aa-45f2-a8fe-823723a2410e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq"
Apr 16 14:01:14.289410 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.287601 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46fa769b-4f39-41d5-8ac2-232f23bbcbdb-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-g4rrg\" (UID: \"46fa769b-4f39-41d5-8ac2-232f23bbcbdb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-g4rrg"
Apr 16 14:01:14.289410 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.287650 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3ac9f0e6-4ee7-4de1-85f7-67127085b819-node-exporter-tls\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw"
Apr 16 14:01:14.289410 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.287670 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mbjz8\" (UniqueName: \"kubernetes.io/projected/3ac9f0e6-4ee7-4de1-85f7-67127085b819-kube-api-access-mbjz8\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw"
Apr 16 14:01:14.289410 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.287689 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/46fa769b-4f39-41d5-8ac2-232f23bbcbdb-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-g4rrg\" (UID: \"46fa769b-4f39-41d5-8ac2-232f23bbcbdb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-g4rrg"
Apr 16 14:01:14.289410 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.287719 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3ac9f0e6-4ee7-4de1-85f7-67127085b819-node-exporter-textfile\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw"
Apr 16 14:01:14.289410 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.287743 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3ac9f0e6-4ee7-4de1-85f7-67127085b819-node-exporter-wtmp\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw"
Apr 16 14:01:14.289410 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.287763 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/244a82a3-51aa-45f2-a8fe-823723a2410e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-v86qq\" (UID: \"244a82a3-51aa-45f2-a8fe-823723a2410e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq"
Apr 16 14:01:14.289410 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.287793 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3ac9f0e6-4ee7-4de1-85f7-67127085b819-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw"
Apr 16 14:01:14.289410 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.287809 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/244a82a3-51aa-45f2-a8fe-823723a2410e-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-v86qq\" (UID: \"244a82a3-51aa-45f2-a8fe-823723a2410e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq"
Apr 16 14:01:14.289410 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.287825 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gk9zs\" (UniqueName: \"kubernetes.io/projected/244a82a3-51aa-45f2-a8fe-823723a2410e-kube-api-access-gk9zs\") pod \"kube-state-metrics-7479c89684-v86qq\" (UID: \"244a82a3-51aa-45f2-a8fe-823723a2410e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq"
Apr 16 14:01:14.289410 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.287854 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3ac9f0e6-4ee7-4de1-85f7-67127085b819-root\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw"
Apr 16 14:01:14.289410 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.287880 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/244a82a3-51aa-45f2-a8fe-823723a2410e-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-v86qq\" (UID: \"244a82a3-51aa-45f2-a8fe-823723a2410e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq"
Apr 16 14:01:14.289410 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.287913 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/244a82a3-51aa-45f2-a8fe-823723a2410e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-v86qq\" (UID: \"244a82a3-51aa-45f2-a8fe-823723a2410e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq"
Apr 16 14:01:14.289410 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.287935 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3ac9f0e6-4ee7-4de1-85f7-67127085b819-sys\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw"
Apr 16 14:01:14.289410 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.287954 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3ac9f0e6-4ee7-4de1-85f7-67127085b819-node-exporter-accelerators-collector-config\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw"
Apr 16 14:01:14.289410 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.287973 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/244a82a3-51aa-45f2-a8fe-823723a2410e-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-v86qq\" (UID: \"244a82a3-51aa-45f2-a8fe-823723a2410e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq"
Apr 16 14:01:14.290412 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.287994 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46fa769b-4f39-41d5-8ac2-232f23bbcbdb-metrics-client-ca\") pod \"openshift-state-metrics-5669946b84-g4rrg\" (UID: \"46fa769b-4f39-41d5-8ac2-232f23bbcbdb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-g4rrg"
Apr 16 14:01:14.290412 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.288018 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bqg9v\" (UniqueName: \"kubernetes.io/projected/46fa769b-4f39-41d5-8ac2-232f23bbcbdb-kube-api-access-bqg9v\") pod \"openshift-state-metrics-5669946b84-g4rrg\" (UID: \"46fa769b-4f39-41d5-8ac2-232f23bbcbdb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-g4rrg"
Apr 16 14:01:14.290412 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.288037 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName:
\"kubernetes.io/configmap/3ac9f0e6-4ee7-4de1-85f7-67127085b819-metrics-client-ca\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw" Apr 16 14:01:14.290412 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.288585 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/244a82a3-51aa-45f2-a8fe-823723a2410e-volume-directive-shadow\") pod \"kube-state-metrics-7479c89684-v86qq\" (UID: \"244a82a3-51aa-45f2-a8fe-823723a2410e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq" Apr 16 14:01:14.290412 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.289215 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/3ac9f0e6-4ee7-4de1-85f7-67127085b819-node-exporter-textfile\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw" Apr 16 14:01:14.290412 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:01:14.289442 2570 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 14:01:14.290412 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:01:14.289516 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ac9f0e6-4ee7-4de1-85f7-67127085b819-node-exporter-tls podName:3ac9f0e6-4ee7-4de1-85f7-67127085b819 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:14.789485743 +0000 UTC m=+100.995128891 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/3ac9f0e6-4ee7-4de1-85f7-67127085b819-node-exporter-tls") pod "node-exporter-hmhjw" (UID: "3ac9f0e6-4ee7-4de1-85f7-67127085b819") : secret "node-exporter-tls" not found Apr 16 14:01:14.290412 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:01:14.289547 2570 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 16 14:01:14.290412 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:01:14.289584 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/244a82a3-51aa-45f2-a8fe-823723a2410e-kube-state-metrics-tls podName:244a82a3-51aa-45f2-a8fe-823723a2410e nodeName:}" failed. No retries permitted until 2026-04-16 14:01:14.789572385 +0000 UTC m=+100.995215531 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/244a82a3-51aa-45f2-a8fe-823723a2410e-kube-state-metrics-tls") pod "kube-state-metrics-7479c89684-v86qq" (UID: "244a82a3-51aa-45f2-a8fe-823723a2410e") : secret "kube-state-metrics-tls" not found Apr 16 14:01:14.290917 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.290513 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/3ac9f0e6-4ee7-4de1-85f7-67127085b819-metrics-client-ca\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw" Apr 16 14:01:14.293432 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.291006 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/244a82a3-51aa-45f2-a8fe-823723a2410e-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-7479c89684-v86qq\" (UID: 
\"244a82a3-51aa-45f2-a8fe-823723a2410e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq" Apr 16 14:01:14.293432 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.291121 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/3ac9f0e6-4ee7-4de1-85f7-67127085b819-node-exporter-wtmp\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw" Apr 16 14:01:14.293432 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.289444 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/3ac9f0e6-4ee7-4de1-85f7-67127085b819-root\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw" Apr 16 14:01:14.293432 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.291591 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3ac9f0e6-4ee7-4de1-85f7-67127085b819-sys\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw" Apr 16 14:01:14.293432 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.292520 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/3ac9f0e6-4ee7-4de1-85f7-67127085b819-node-exporter-accelerators-collector-config\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw" Apr 16 14:01:14.293432 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.292972 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/46fa769b-4f39-41d5-8ac2-232f23bbcbdb-metrics-client-ca\") pod 
\"openshift-state-metrics-5669946b84-g4rrg\" (UID: \"46fa769b-4f39-41d5-8ac2-232f23bbcbdb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-g4rrg" Apr 16 14:01:14.293432 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.293359 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/244a82a3-51aa-45f2-a8fe-823723a2410e-metrics-client-ca\") pod \"kube-state-metrics-7479c89684-v86qq\" (UID: \"244a82a3-51aa-45f2-a8fe-823723a2410e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq" Apr 16 14:01:14.294730 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.294666 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/244a82a3-51aa-45f2-a8fe-823723a2410e-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-7479c89684-v86qq\" (UID: \"244a82a3-51aa-45f2-a8fe-823723a2410e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq" Apr 16 14:01:14.305570 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.297856 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/3ac9f0e6-4ee7-4de1-85f7-67127085b819-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw" Apr 16 14:01:14.305570 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.302214 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqg9v\" (UniqueName: \"kubernetes.io/projected/46fa769b-4f39-41d5-8ac2-232f23bbcbdb-kube-api-access-bqg9v\") pod \"openshift-state-metrics-5669946b84-g4rrg\" (UID: \"46fa769b-4f39-41d5-8ac2-232f23bbcbdb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-g4rrg" Apr 16 14:01:14.305570 ip-10-0-133-133 
kubenswrapper[2570]: I0416 14:01:14.304524 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk9zs\" (UniqueName: \"kubernetes.io/projected/244a82a3-51aa-45f2-a8fe-823723a2410e-kube-api-access-gk9zs\") pod \"kube-state-metrics-7479c89684-v86qq\" (UID: \"244a82a3-51aa-45f2-a8fe-823723a2410e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq" Apr 16 14:01:14.320243 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.320160 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/46fa769b-4f39-41d5-8ac2-232f23bbcbdb-openshift-state-metrics-tls\") pod \"openshift-state-metrics-5669946b84-g4rrg\" (UID: \"46fa769b-4f39-41d5-8ac2-232f23bbcbdb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-g4rrg" Apr 16 14:01:14.332490 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.332455 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbjz8\" (UniqueName: \"kubernetes.io/projected/3ac9f0e6-4ee7-4de1-85f7-67127085b819-kube-api-access-mbjz8\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw" Apr 16 14:01:14.338553 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.335580 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/46fa769b-4f39-41d5-8ac2-232f23bbcbdb-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-5669946b84-g4rrg\" (UID: \"46fa769b-4f39-41d5-8ac2-232f23bbcbdb\") " pod="openshift-monitoring/openshift-state-metrics-5669946b84-g4rrg" Apr 16 14:01:14.625160 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.625039 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-5669946b84-g4rrg" Apr 16 14:01:14.702582 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.702001 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-smqtz" event={"ID":"6bb67241-6874-4040-a810-80b829751cf9","Type":"ContainerStarted","Data":"3009b30d8eaef58cdf952cb837c59703a553044e5044e52a1c308bc54dbee2b0"} Apr 16 14:01:14.702582 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.702054 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-smqtz" event={"ID":"6bb67241-6874-4040-a810-80b829751cf9","Type":"ContainerStarted","Data":"e872bd5d7d73d91629e8b744015d3e187eca58436079952086c67b93655c5f56"} Apr 16 14:01:14.702582 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.702141 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-smqtz" Apr 16 14:01:14.703509 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.703480 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-wc4m2" event={"ID":"2d037ded-fc00-41e0-b31f-c9fb98bdc629","Type":"ContainerStarted","Data":"c487e2c87678f048bd49bbea47b50d96cf9aedfd5c3f586b63e53411b50d012d"} Apr 16 14:01:14.719567 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.719471 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-smqtz" podStartSLOduration=64.880979627 podStartE2EDuration="1m6.719453636s" podCreationTimestamp="2026-04-16 14:00:08 +0000 UTC" firstStartedPulling="2026-04-16 14:01:12.259222055 +0000 UTC m=+98.464865197" lastFinishedPulling="2026-04-16 14:01:14.097696049 +0000 UTC m=+100.303339206" observedRunningTime="2026-04-16 14:01:14.719043497 +0000 UTC m=+100.924686662" watchObservedRunningTime="2026-04-16 14:01:14.719453636 +0000 UTC m=+100.925096812" Apr 16 14:01:14.736468 ip-10-0-133-133 kubenswrapper[2570]: I0416 
14:01:14.736201 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-wc4m2" podStartSLOduration=64.917711921 podStartE2EDuration="1m6.7361778s" podCreationTimestamp="2026-04-16 14:00:08 +0000 UTC" firstStartedPulling="2026-04-16 14:01:12.281690113 +0000 UTC m=+98.487333264" lastFinishedPulling="2026-04-16 14:01:14.100155994 +0000 UTC m=+100.305799143" observedRunningTime="2026-04-16 14:01:14.735223863 +0000 UTC m=+100.940867027" watchObservedRunningTime="2026-04-16 14:01:14.7361778 +0000 UTC m=+100.941820966" Apr 16 14:01:14.771806 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.771769 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-5669946b84-g4rrg"] Apr 16 14:01:14.775735 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:01:14.775699 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46fa769b_4f39_41d5_8ac2_232f23bbcbdb.slice/crio-1f905e69a3f4e7af46422718c90290b31a69331f64d73a10e8c4b8fed09bec9d WatchSource:0}: Error finding container 1f905e69a3f4e7af46422718c90290b31a69331f64d73a10e8c4b8fed09bec9d: Status 404 returned error can't find the container with id 1f905e69a3f4e7af46422718c90290b31a69331f64d73a10e8c4b8fed09bec9d Apr 16 14:01:14.792525 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.792490 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/244a82a3-51aa-45f2-a8fe-823723a2410e-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-v86qq\" (UID: \"244a82a3-51aa-45f2-a8fe-823723a2410e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq" Apr 16 14:01:14.792767 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.792621 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/3ac9f0e6-4ee7-4de1-85f7-67127085b819-node-exporter-tls\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw" Apr 16 14:01:14.792767 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:01:14.792745 2570 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 14:01:14.792871 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:01:14.792810 2570 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3ac9f0e6-4ee7-4de1-85f7-67127085b819-node-exporter-tls podName:3ac9f0e6-4ee7-4de1-85f7-67127085b819 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:15.792788852 +0000 UTC m=+101.998431993 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/3ac9f0e6-4ee7-4de1-85f7-67127085b819-node-exporter-tls") pod "node-exporter-hmhjw" (UID: "3ac9f0e6-4ee7-4de1-85f7-67127085b819") : secret "node-exporter-tls" not found Apr 16 14:01:14.795771 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.795747 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/244a82a3-51aa-45f2-a8fe-823723a2410e-kube-state-metrics-tls\") pod \"kube-state-metrics-7479c89684-v86qq\" (UID: \"244a82a3-51aa-45f2-a8fe-823723a2410e\") " pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq" Apr 16 14:01:14.955257 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:14.955214 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq" Apr 16 14:01:15.101813 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:15.101781 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-7479c89684-v86qq"] Apr 16 14:01:15.105065 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:01:15.105037 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod244a82a3_51aa_45f2_a8fe_823723a2410e.slice/crio-677edc8925fc6f9feebd082e163fd3bcbfebeb6a2fc9e6817c56606f886913f9 WatchSource:0}: Error finding container 677edc8925fc6f9feebd082e163fd3bcbfebeb6a2fc9e6817c56606f886913f9: Status 404 returned error can't find the container with id 677edc8925fc6f9feebd082e163fd3bcbfebeb6a2fc9e6817c56606f886913f9 Apr 16 14:01:15.708748 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:15.708632 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq" event={"ID":"244a82a3-51aa-45f2-a8fe-823723a2410e","Type":"ContainerStarted","Data":"677edc8925fc6f9feebd082e163fd3bcbfebeb6a2fc9e6817c56606f886913f9"} Apr 16 14:01:15.711337 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:15.711305 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-g4rrg" event={"ID":"46fa769b-4f39-41d5-8ac2-232f23bbcbdb","Type":"ContainerStarted","Data":"d1ca9ef6d5360a6039f4ecefb3db9a46e401fa6ce58b3e0d883e93a8913b11ea"} Apr 16 14:01:15.711511 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:15.711344 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-g4rrg" event={"ID":"46fa769b-4f39-41d5-8ac2-232f23bbcbdb","Type":"ContainerStarted","Data":"93aa1414d38d71d5cb308dea3aa9eb636d4d342f1cdf300b066bf9f49ddb7134"} Apr 16 14:01:15.711511 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:15.711360 2570 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-g4rrg" event={"ID":"46fa769b-4f39-41d5-8ac2-232f23bbcbdb","Type":"ContainerStarted","Data":"1f905e69a3f4e7af46422718c90290b31a69331f64d73a10e8c4b8fed09bec9d"} Apr 16 14:01:15.804231 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:15.804177 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3ac9f0e6-4ee7-4de1-85f7-67127085b819-node-exporter-tls\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw" Apr 16 14:01:15.809741 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:15.809680 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/3ac9f0e6-4ee7-4de1-85f7-67127085b819-node-exporter-tls\") pod \"node-exporter-hmhjw\" (UID: \"3ac9f0e6-4ee7-4de1-85f7-67127085b819\") " pod="openshift-monitoring/node-exporter-hmhjw" Apr 16 14:01:15.810210 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:15.810184 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-586b57c7b4-szfcb"] Apr 16 14:01:15.817907 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:15.817881 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-szfcb" Apr 16 14:01:15.824074 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:15.824042 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-szfcb"] Apr 16 14:01:15.824248 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:15.824217 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 16 14:01:15.824484 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:15.824465 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-q2d9z\"" Apr 16 14:01:15.824720 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:15.824702 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 16 14:01:15.898397 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:15.898355 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-hmhjw" Apr 16 14:01:15.905661 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:15.905553 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nnnx\" (UniqueName: \"kubernetes.io/projected/135c9f9a-4369-45dc-8f77-8d080f471674-kube-api-access-9nnnx\") pod \"downloads-586b57c7b4-szfcb\" (UID: \"135c9f9a-4369-45dc-8f77-8d080f471674\") " pod="openshift-console/downloads-586b57c7b4-szfcb" Apr 16 14:01:16.006155 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:16.006063 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9nnnx\" (UniqueName: \"kubernetes.io/projected/135c9f9a-4369-45dc-8f77-8d080f471674-kube-api-access-9nnnx\") pod \"downloads-586b57c7b4-szfcb\" (UID: \"135c9f9a-4369-45dc-8f77-8d080f471674\") " pod="openshift-console/downloads-586b57c7b4-szfcb" Apr 16 14:01:16.015612 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:16.015577 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nnnx\" (UniqueName: \"kubernetes.io/projected/135c9f9a-4369-45dc-8f77-8d080f471674-kube-api-access-9nnnx\") pod \"downloads-586b57c7b4-szfcb\" (UID: \"135c9f9a-4369-45dc-8f77-8d080f471674\") " pod="openshift-console/downloads-586b57c7b4-szfcb" Apr 16 14:01:16.135142 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:16.134958 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-586b57c7b4-szfcb" Apr 16 14:01:16.673115 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:01:16.672951 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ac9f0e6_4ee7_4de1_85f7_67127085b819.slice/crio-f1e862a8052f791253b3bb88b284f83c7b6babf770a7f5f2db32262ffcffa507 WatchSource:0}: Error finding container f1e862a8052f791253b3bb88b284f83c7b6babf770a7f5f2db32262ffcffa507: Status 404 returned error can't find the container with id f1e862a8052f791253b3bb88b284f83c7b6babf770a7f5f2db32262ffcffa507 Apr 16 14:01:16.715033 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:16.714991 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hmhjw" event={"ID":"3ac9f0e6-4ee7-4de1-85f7-67127085b819","Type":"ContainerStarted","Data":"f1e862a8052f791253b3bb88b284f83c7b6babf770a7f5f2db32262ffcffa507"} Apr 16 14:01:16.805120 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:16.805063 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-586b57c7b4-szfcb"] Apr 16 14:01:16.810585 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:01:16.810552 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod135c9f9a_4369_45dc_8f77_8d080f471674.slice/crio-7619705a79ce9e91652389e69b8ed39bbaf987b9c6e58501b4e959fd5f47dbad WatchSource:0}: Error finding container 7619705a79ce9e91652389e69b8ed39bbaf987b9c6e58501b4e959fd5f47dbad: Status 404 returned error can't find the container with id 7619705a79ce9e91652389e69b8ed39bbaf987b9c6e58501b4e959fd5f47dbad Apr 16 14:01:17.720915 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:17.720370 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-5669946b84-g4rrg" 
event={"ID":"46fa769b-4f39-41d5-8ac2-232f23bbcbdb","Type":"ContainerStarted","Data":"90508ceaabbd2cc918cc7aa8cdc8356048c2296826229aadb239f0f74350c8ee"} Apr 16 14:01:17.723223 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:17.723193 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hmhjw" event={"ID":"3ac9f0e6-4ee7-4de1-85f7-67127085b819","Type":"ContainerStarted","Data":"d0aa5e2ecd7b89b61de9ac4ff94f8709bbac0d6fa3d01e150ca1d75187e88fe4"} Apr 16 14:01:17.725551 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:17.725498 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq" event={"ID":"244a82a3-51aa-45f2-a8fe-823723a2410e","Type":"ContainerStarted","Data":"713730c7e0903ddd8ae78c4c9eb392a9d2156d3500550a02ec5cf725cc28216f"} Apr 16 14:01:17.725675 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:17.725559 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq" event={"ID":"244a82a3-51aa-45f2-a8fe-823723a2410e","Type":"ContainerStarted","Data":"90e3a1b08eddc87ea878efc80de4f1739f66e31469fe6750026e897e24f4ad20"} Apr 16 14:01:17.725675 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:17.725576 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq" event={"ID":"244a82a3-51aa-45f2-a8fe-823723a2410e","Type":"ContainerStarted","Data":"563df1be96e29b90ffa900575aada0d3188c7c87efe80d957c1cea37d38f07be"} Apr 16 14:01:17.726754 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:17.726728 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-szfcb" event={"ID":"135c9f9a-4369-45dc-8f77-8d080f471674","Type":"ContainerStarted","Data":"7619705a79ce9e91652389e69b8ed39bbaf987b9c6e58501b4e959fd5f47dbad"} Apr 16 14:01:17.781083 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:17.781028 2570 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-5669946b84-g4rrg" podStartSLOduration=3.036356035 podStartE2EDuration="4.781009846s" podCreationTimestamp="2026-04-16 14:01:13 +0000 UTC" firstStartedPulling="2026-04-16 14:01:14.927363964 +0000 UTC m=+101.133007113" lastFinishedPulling="2026-04-16 14:01:16.672017762 +0000 UTC m=+102.877660924" observedRunningTime="2026-04-16 14:01:17.750944663 +0000 UTC m=+103.956587827" watchObservedRunningTime="2026-04-16 14:01:17.781009846 +0000 UTC m=+103.986653011"
Apr 16 14:01:17.819646 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:17.819592 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-7479c89684-v86qq" podStartSLOduration=2.253170258 podStartE2EDuration="3.819577104s" podCreationTimestamp="2026-04-16 14:01:14 +0000 UTC" firstStartedPulling="2026-04-16 14:01:15.107064993 +0000 UTC m=+101.312708135" lastFinishedPulling="2026-04-16 14:01:16.673471824 +0000 UTC m=+102.879114981" observedRunningTime="2026-04-16 14:01:17.818465356 +0000 UTC m=+104.024108520" watchObservedRunningTime="2026-04-16 14:01:17.819577104 +0000 UTC m=+104.025220269"
Apr 16 14:01:18.467319 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.467278 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7695b9cb7-4mmkk"]
Apr 16 14:01:18.470262 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.470237 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.472555 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.472503 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\""
Apr 16 14:01:18.472692 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.472557 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-1o9cgu4dd3smp\""
Apr 16 14:01:18.472692 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.472575 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\""
Apr 16 14:01:18.473031 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.473008 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 16 14:01:18.473556 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.473275 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-sjt69\""
Apr 16 14:01:18.473556 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.473289 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\""
Apr 16 14:01:18.482870 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.482838 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7695b9cb7-4mmkk"]
Apr 16 14:01:18.618850 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.618821 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-88c5t"
Apr 16 14:01:18.628115 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.628076 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7695b9cb7-4mmkk\" (UID: \"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd\") " pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.628293 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.628129 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd-metrics-server-audit-profiles\") pod \"metrics-server-7695b9cb7-4mmkk\" (UID: \"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd\") " pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.628293 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.628194 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd-client-ca-bundle\") pod \"metrics-server-7695b9cb7-4mmkk\" (UID: \"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd\") " pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.628293 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.628270 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6wqz\" (UniqueName: \"kubernetes.io/projected/e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd-kube-api-access-z6wqz\") pod \"metrics-server-7695b9cb7-4mmkk\" (UID: \"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd\") " pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.628458 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.628314 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd-audit-log\") pod \"metrics-server-7695b9cb7-4mmkk\" (UID: \"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd\") " pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.628458 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.628391 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd-secret-metrics-server-tls\") pod \"metrics-server-7695b9cb7-4mmkk\" (UID: \"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd\") " pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.628592 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.628506 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd-secret-metrics-server-client-certs\") pod \"metrics-server-7695b9cb7-4mmkk\" (UID: \"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd\") " pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.730799 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.729906 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7695b9cb7-4mmkk\" (UID: \"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd\") " pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.730799 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.729971 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd-metrics-server-audit-profiles\") pod \"metrics-server-7695b9cb7-4mmkk\" (UID: \"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd\") " pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.730799 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.730032 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd-client-ca-bundle\") pod \"metrics-server-7695b9cb7-4mmkk\" (UID: \"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd\") " pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.730799 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.730075 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6wqz\" (UniqueName: \"kubernetes.io/projected/e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd-kube-api-access-z6wqz\") pod \"metrics-server-7695b9cb7-4mmkk\" (UID: \"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd\") " pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.730799 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.730118 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd-audit-log\") pod \"metrics-server-7695b9cb7-4mmkk\" (UID: \"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd\") " pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.730799 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.730170 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd-secret-metrics-server-tls\") pod \"metrics-server-7695b9cb7-4mmkk\" (UID: \"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd\") " pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.730799 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.730218 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd-secret-metrics-server-client-certs\") pod \"metrics-server-7695b9cb7-4mmkk\" (UID: \"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd\") " pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.731701 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.731667 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7695b9cb7-4mmkk\" (UID: \"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd\") " pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.733038 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.733010 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd-audit-log\") pod \"metrics-server-7695b9cb7-4mmkk\" (UID: \"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd\") " pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.736609 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.736567 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd-metrics-server-audit-profiles\") pod \"metrics-server-7695b9cb7-4mmkk\" (UID: \"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd\") " pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.737075 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.737029 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd-secret-metrics-server-tls\") pod \"metrics-server-7695b9cb7-4mmkk\" (UID: \"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd\") " pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.739374 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.739345 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd-secret-metrics-server-client-certs\") pod \"metrics-server-7695b9cb7-4mmkk\" (UID: \"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd\") " pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.739864 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.739832 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd-client-ca-bundle\") pod \"metrics-server-7695b9cb7-4mmkk\" (UID: \"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd\") " pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.741736 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.740856 2570 generic.go:358] "Generic (PLEG): container finished" podID="3ac9f0e6-4ee7-4de1-85f7-67127085b819" containerID="d0aa5e2ecd7b89b61de9ac4ff94f8709bbac0d6fa3d01e150ca1d75187e88fe4" exitCode=0
Apr 16 14:01:18.741736 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.741685 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hmhjw" event={"ID":"3ac9f0e6-4ee7-4de1-85f7-67127085b819","Type":"ContainerDied","Data":"d0aa5e2ecd7b89b61de9ac4ff94f8709bbac0d6fa3d01e150ca1d75187e88fe4"}
Apr 16 14:01:18.745121 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.744142 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6wqz\" (UniqueName: \"kubernetes.io/projected/e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd-kube-api-access-z6wqz\") pod \"metrics-server-7695b9cb7-4mmkk\" (UID: \"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd\") " pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.782649 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.782615 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:18.948446 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:18.948400 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7695b9cb7-4mmkk"]
Apr 16 14:01:18.955291 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:01:18.955252 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5d1ca4c_458d_4c29_81dd_f62f42d1d4dd.slice/crio-ca3650236aca1676fc4654cc3fa352425cd12043faf469c869fe9dacfa01f781 WatchSource:0}: Error finding container ca3650236aca1676fc4654cc3fa352425cd12043faf469c869fe9dacfa01f781: Status 404 returned error can't find the container with id ca3650236aca1676fc4654cc3fa352425cd12043faf469c869fe9dacfa01f781
Apr 16 14:01:19.361461 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.361420 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-75fc48886f-8mhqt"]
Apr 16 14:01:19.364122 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.364095 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.366239 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.366203 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-45qbv\""
Apr 16 14:01:19.370603 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.366748 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\""
Apr 16 14:01:19.370603 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.366861 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\""
Apr 16 14:01:19.370603 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.367155 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\""
Apr 16 14:01:19.370603 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.368516 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\""
Apr 16 14:01:19.370603 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.368988 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\""
Apr 16 14:01:19.374200 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.374175 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\""
Apr 16 14:01:19.382256 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.382180 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-75fc48886f-8mhqt"]
Apr 16 14:01:19.538107 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.538065 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42fb5c45-515a-4ea9-a937-7418314ae5f2-serving-certs-ca-bundle\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.538305 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.538126 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/42fb5c45-515a-4ea9-a937-7418314ae5f2-federate-client-tls\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.538305 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.538155 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/42fb5c45-515a-4ea9-a937-7418314ae5f2-telemeter-client-tls\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.538305 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.538193 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/42fb5c45-515a-4ea9-a937-7418314ae5f2-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.538305 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.538256 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42fb5c45-515a-4ea9-a937-7418314ae5f2-telemeter-trusted-ca-bundle\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.538565 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.538401 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/42fb5c45-515a-4ea9-a937-7418314ae5f2-metrics-client-ca\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.538565 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.538434 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hcmwh\" (UniqueName: \"kubernetes.io/projected/42fb5c45-515a-4ea9-a937-7418314ae5f2-kube-api-access-hcmwh\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.538565 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.538469 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/42fb5c45-515a-4ea9-a937-7418314ae5f2-secret-telemeter-client\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.639783 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.639663 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/42fb5c45-515a-4ea9-a937-7418314ae5f2-telemeter-client-tls\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.639783 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.639735 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/42fb5c45-515a-4ea9-a937-7418314ae5f2-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.639783 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.639757 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42fb5c45-515a-4ea9-a937-7418314ae5f2-telemeter-trusted-ca-bundle\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.639783 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.639781 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/42fb5c45-515a-4ea9-a937-7418314ae5f2-metrics-client-ca\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.640240 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.639797 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hcmwh\" (UniqueName: \"kubernetes.io/projected/42fb5c45-515a-4ea9-a937-7418314ae5f2-kube-api-access-hcmwh\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.640240 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.639820 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/42fb5c45-515a-4ea9-a937-7418314ae5f2-secret-telemeter-client\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.640240 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.639868 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42fb5c45-515a-4ea9-a937-7418314ae5f2-serving-certs-ca-bundle\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.640240 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.639923 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/42fb5c45-515a-4ea9-a937-7418314ae5f2-federate-client-tls\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.640860 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.640789 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42fb5c45-515a-4ea9-a937-7418314ae5f2-telemeter-trusted-ca-bundle\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.641306 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.641252 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/42fb5c45-515a-4ea9-a937-7418314ae5f2-serving-certs-ca-bundle\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.641306 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.641292 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/42fb5c45-515a-4ea9-a937-7418314ae5f2-metrics-client-ca\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.642875 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.642833 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/42fb5c45-515a-4ea9-a937-7418314ae5f2-telemeter-client-tls\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.642959 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.642925 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/42fb5c45-515a-4ea9-a937-7418314ae5f2-federate-client-tls\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.643149 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.643116 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/42fb5c45-515a-4ea9-a937-7418314ae5f2-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.643251 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.643238 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/42fb5c45-515a-4ea9-a937-7418314ae5f2-secret-telemeter-client\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.649053 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.648976 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hcmwh\" (UniqueName: \"kubernetes.io/projected/42fb5c45-515a-4ea9-a937-7418314ae5f2-kube-api-access-hcmwh\") pod \"telemeter-client-75fc48886f-8mhqt\" (UID: \"42fb5c45-515a-4ea9-a937-7418314ae5f2\") " pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.679711 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.679674 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt"
Apr 16 14:01:19.748609 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.748554 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk" event={"ID":"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd","Type":"ContainerStarted","Data":"ca3650236aca1676fc4654cc3fa352425cd12043faf469c869fe9dacfa01f781"}
Apr 16 14:01:19.752366 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.752323 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hmhjw" event={"ID":"3ac9f0e6-4ee7-4de1-85f7-67127085b819","Type":"ContainerStarted","Data":"9b21ff943d11c759e9ad3f3acb6503be118b79a3d851e0a4694edeef2f2de942"}
Apr 16 14:01:19.753102 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.752375 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-hmhjw" event={"ID":"3ac9f0e6-4ee7-4de1-85f7-67127085b819","Type":"ContainerStarted","Data":"40bf9f90ed723c5d76fa19755ff5ed5ba6aacebc96a3810397b4f601ed010a37"}
Apr 16 14:01:19.839624 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.839550 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-hmhjw" podStartSLOduration=4.9359193470000005 podStartE2EDuration="5.839511007s" podCreationTimestamp="2026-04-16 14:01:14 +0000 UTC" firstStartedPulling="2026-04-16 14:01:16.67482819 +0000 UTC m=+102.880471332" lastFinishedPulling="2026-04-16 14:01:17.578419842 +0000 UTC m=+103.784062992" observedRunningTime="2026-04-16 14:01:19.772929956 +0000 UTC m=+105.978573120" watchObservedRunningTime="2026-04-16 14:01:19.839511007 +0000 UTC m=+106.045154228"
Apr 16 14:01:19.839888 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:19.839866 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-75fc48886f-8mhqt"]
Apr 16 14:01:19.844731 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:01:19.844698 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42fb5c45_515a_4ea9_a937_7418314ae5f2.slice/crio-04d380662f5d5f750dd20e7c6b06546ec73fc7be0cf9bd03be476ab049e56fd4 WatchSource:0}: Error finding container 04d380662f5d5f750dd20e7c6b06546ec73fc7be0cf9bd03be476ab049e56fd4: Status 404 returned error can't find the container with id 04d380662f5d5f750dd20e7c6b06546ec73fc7be0cf9bd03be476ab049e56fd4
Apr 16 14:01:20.415048 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.415000 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:01:20.421234 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.421204 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:20.425352 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.425325 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 16 14:01:20.426079 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.426050 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 16 14:01:20.426286 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.426270 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 16 14:01:20.427348 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.427325 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 16 14:01:20.427660 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.427628 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8l3jl34optn5s\""
Apr 16 14:01:20.428596 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.427927 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 16 14:01:20.428596 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.428075 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 16 14:01:20.428596 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.428147 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-m66wh\""
Apr 16 14:01:20.428596 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.428208 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 16 14:01:20.428596 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.428332 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 16 14:01:20.428596 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.428445 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 16 14:01:20.428596 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.428493 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 16 14:01:20.429108 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.429086 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 16 14:01:20.429185 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.429168 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 16 14:01:20.439503 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.439443 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:01:20.551070 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.551029 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:20.551253 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.551089 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:20.551253 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.551134 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:20.551253 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.551200 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:20.558200 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.551355 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-web-config\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:20.558200 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.551400 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:20.558200 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.551427 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/28203d08-f398-4104-9ecd-65a25e29ac81-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:20.558200 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.551483 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:20.558200 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.551553 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/28203d08-f398-4104-9ecd-65a25e29ac81-config-out\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:20.558200 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.551582 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:20.558200 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.551618 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/28203d08-f398-4104-9ecd-65a25e29ac81-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:20.558200 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.551669 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:20.558200 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.551737 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:20.558200 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.551783 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:20.558200 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.551817 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-config\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:20.558200 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.551842 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume
started for volume \"kube-api-access-rpqw4\" (UniqueName: \"kubernetes.io/projected/28203d08-f398-4104-9ecd-65a25e29ac81-kube-api-access-rpqw4\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.558200 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.551903 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.558200 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.551933 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.653565 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.652843 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.653565 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.652906 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.653565 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.652937 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.653565 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.652963 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-web-config\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.653565 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.653035 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.653565 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.653061 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/28203d08-f398-4104-9ecd-65a25e29ac81-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.653565 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.653088 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: 
\"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.653565 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.653138 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/28203d08-f398-4104-9ecd-65a25e29ac81-config-out\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.653565 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.653166 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.653565 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.653201 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/28203d08-f398-4104-9ecd-65a25e29ac81-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.653565 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.653230 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.653565 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.653261 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: 
\"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.653565 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.653303 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.653565 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.653335 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-config\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.653565 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.653358 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rpqw4\" (UniqueName: \"kubernetes.io/projected/28203d08-f398-4104-9ecd-65a25e29ac81-kube-api-access-rpqw4\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.653565 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.653407 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.653565 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.653437 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.654562 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.653490 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.654562 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.654397 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.654562 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.654511 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.655348 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.655232 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.655910 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.655524 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/28203d08-f398-4104-9ecd-65a25e29ac81-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.660086 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.660045 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.664335 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.663875 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.665598 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.664701 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-web-config\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.665598 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.665072 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.667749 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.666284 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-config\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.667749 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.667292 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.667749 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.667593 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpqw4\" (UniqueName: \"kubernetes.io/projected/28203d08-f398-4104-9ecd-65a25e29ac81-kube-api-access-rpqw4\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.669420 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.668718 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.669420 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.669032 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.669420 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.669036 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/28203d08-f398-4104-9ecd-65a25e29ac81-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.669420 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.669354 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.669682 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.669521 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.671947 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.671923 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/28203d08-f398-4104-9ecd-65a25e29ac81-config-out\") pod \"prometheus-k8s-0\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.672054 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.672000 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: 
\"28203d08-f398-4104-9ecd-65a25e29ac81\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.736396 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.736085 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:01:20.761894 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.761842 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt" event={"ID":"42fb5c45-515a-4ea9-a937-7418314ae5f2","Type":"ContainerStarted","Data":"04d380662f5d5f750dd20e7c6b06546ec73fc7be0cf9bd03be476ab049e56fd4"} Apr 16 14:01:20.935185 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:20.935131 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:01:20.937199 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:01:20.937162 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod28203d08_f398_4104_9ecd_65a25e29ac81.slice/crio-e5cbe573737c1b0b21db08b05c79b75710e1b2f7d05fe934701042483c590623 WatchSource:0}: Error finding container e5cbe573737c1b0b21db08b05c79b75710e1b2f7d05fe934701042483c590623: Status 404 returned error can't find the container with id e5cbe573737c1b0b21db08b05c79b75710e1b2f7d05fe934701042483c590623 Apr 16 14:01:21.767292 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:21.767255 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28203d08-f398-4104-9ecd-65a25e29ac81","Type":"ContainerStarted","Data":"e5cbe573737c1b0b21db08b05c79b75710e1b2f7d05fe934701042483c590623"} Apr 16 14:01:22.772946 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:22.772905 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt" 
event={"ID":"42fb5c45-515a-4ea9-a937-7418314ae5f2","Type":"ContainerStarted","Data":"d1462d52b8e373f24db9b9102adaec2f9a5f4ecf04df0256f03e4081261831f4"} Apr 16 14:01:22.774872 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:22.774824 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk" event={"ID":"e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd","Type":"ContainerStarted","Data":"a2d6b5cd196896f3c9d8c8530f65cffd12bfa7c4085f82cc852b133562453c6f"} Apr 16 14:01:22.796373 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:22.795717 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk" podStartSLOduration=1.583285387 podStartE2EDuration="4.795692815s" podCreationTimestamp="2026-04-16 14:01:18 +0000 UTC" firstStartedPulling="2026-04-16 14:01:18.958845375 +0000 UTC m=+105.164488523" lastFinishedPulling="2026-04-16 14:01:22.171252795 +0000 UTC m=+108.376895951" observedRunningTime="2026-04-16 14:01:22.793835798 +0000 UTC m=+108.999478966" watchObservedRunningTime="2026-04-16 14:01:22.795692815 +0000 UTC m=+109.001335980" Apr 16 14:01:23.779738 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:23.779697 2570 generic.go:358] "Generic (PLEG): container finished" podID="28203d08-f398-4104-9ecd-65a25e29ac81" containerID="853af0c652b79622d8a25fef7fb178b7ad12263faf26cb058a1bb0ee3a460edc" exitCode=0 Apr 16 14:01:23.779738 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:23.779779 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28203d08-f398-4104-9ecd-65a25e29ac81","Type":"ContainerDied","Data":"853af0c652b79622d8a25fef7fb178b7ad12263faf26cb058a1bb0ee3a460edc"} Apr 16 14:01:23.782648 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:23.782108 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt" 
event={"ID":"42fb5c45-515a-4ea9-a937-7418314ae5f2","Type":"ContainerStarted","Data":"fda612886eb65fb29b431b4cb6e5dd2ca1a5c422b869f7fc63d69d4347bf358c"} Apr 16 14:01:23.782648 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:23.782157 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt" event={"ID":"42fb5c45-515a-4ea9-a937-7418314ae5f2","Type":"ContainerStarted","Data":"0eafc936edfe5cef3e686d4ebcdbcd18a60576ea02223e8aaa10131ff5855e38"} Apr 16 14:01:23.843813 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:23.843747 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-75fc48886f-8mhqt" podStartSLOduration=1.632592525 podStartE2EDuration="4.843732395s" podCreationTimestamp="2026-04-16 14:01:19 +0000 UTC" firstStartedPulling="2026-04-16 14:01:19.84645066 +0000 UTC m=+106.052093804" lastFinishedPulling="2026-04-16 14:01:23.057590533 +0000 UTC m=+109.263233674" observedRunningTime="2026-04-16 14:01:23.842418683 +0000 UTC m=+110.048061848" watchObservedRunningTime="2026-04-16 14:01:23.843732395 +0000 UTC m=+110.049375556" Apr 16 14:01:24.715115 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:24.715069 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-smqtz" Apr 16 14:01:27.805020 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:27.804977 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28203d08-f398-4104-9ecd-65a25e29ac81","Type":"ContainerStarted","Data":"7e30259b98ab981df47addeaa615b21559934fa92bbcb02d75685e69d67ce328"} Apr 16 14:01:27.805424 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:27.805029 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"28203d08-f398-4104-9ecd-65a25e29ac81","Type":"ContainerStarted","Data":"830c4237528c3826bdf9cbf97f505aca987881738952c583a09481700dfd5913"} Apr 16 14:01:34.834284 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:34.834188 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-586b57c7b4-szfcb" event={"ID":"135c9f9a-4369-45dc-8f77-8d080f471674","Type":"ContainerStarted","Data":"03edfb3eecf8d651b18076bc3886a498cb36be5b43d2fd1c59afc58d1c7ebe83"} Apr 16 14:01:34.834768 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:34.834518 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-586b57c7b4-szfcb" Apr 16 14:01:34.853336 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:34.853096 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-586b57c7b4-szfcb" Apr 16 14:01:34.854732 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:34.854608 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-586b57c7b4-szfcb" podStartSLOduration=2.226802008 podStartE2EDuration="19.854589809s" podCreationTimestamp="2026-04-16 14:01:15 +0000 UTC" firstStartedPulling="2026-04-16 14:01:16.812406076 +0000 UTC m=+103.018049218" lastFinishedPulling="2026-04-16 14:01:34.440193872 +0000 UTC m=+120.645837019" observedRunningTime="2026-04-16 14:01:34.852367005 +0000 UTC m=+121.058010169" watchObservedRunningTime="2026-04-16 14:01:34.854589809 +0000 UTC m=+121.060232975" Apr 16 14:01:36.844324 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:36.844288 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28203d08-f398-4104-9ecd-65a25e29ac81","Type":"ContainerStarted","Data":"d90d8565d8c55f2582db28471b7968df76e69504655ee2d1f51c40442becfe05"} Apr 16 14:01:36.844701 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:36.844336 2570 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28203d08-f398-4104-9ecd-65a25e29ac81","Type":"ContainerStarted","Data":"d88fb0ee2c5350fa7bbe6746dd437e41f99f8dd1df09a05bec662df8bd4ae28b"} Apr 16 14:01:37.855462 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:37.855425 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28203d08-f398-4104-9ecd-65a25e29ac81","Type":"ContainerStarted","Data":"e922ff92096ab61d87e3cb6d75507d466c644d576457bffc1c5389f95248cacc"} Apr 16 14:01:37.855462 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:37.855463 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28203d08-f398-4104-9ecd-65a25e29ac81","Type":"ContainerStarted","Data":"094e37cbbb4518179c349b658c82c821f1c95580d26356e6da4e78665dd540ce"} Apr 16 14:01:37.890790 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:37.890739 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.37124239 podStartE2EDuration="17.890723055s" podCreationTimestamp="2026-04-16 14:01:20 +0000 UTC" firstStartedPulling="2026-04-16 14:01:20.941016619 +0000 UTC m=+107.146659775" lastFinishedPulling="2026-04-16 14:01:36.460497294 +0000 UTC m=+122.666140440" observedRunningTime="2026-04-16 14:01:37.889854196 +0000 UTC m=+124.095497366" watchObservedRunningTime="2026-04-16 14:01:37.890723055 +0000 UTC m=+124.096366218" Apr 16 14:01:38.782967 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:38.782928 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk" Apr 16 14:01:38.783253 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:38.783012 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk" Apr 16 14:01:40.736707 
ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:40.736666 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:01:44.107818 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:44.107753 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs\") pod \"network-metrics-daemon-29pd4\" (UID: \"e6284f77-08e3-4846-904d-6a21f10707ae\") " pod="openshift-multus/network-metrics-daemon-29pd4"
Apr 16 14:01:44.111460 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:44.111422 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/e6284f77-08e3-4846-904d-6a21f10707ae-metrics-certs\") pod \"network-metrics-daemon-29pd4\" (UID: \"e6284f77-08e3-4846-904d-6a21f10707ae\") " pod="openshift-multus/network-metrics-daemon-29pd4"
Apr 16 14:01:44.319624 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:44.319592 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-p5bbb\""
Apr 16 14:01:44.327790 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:44.327754 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-29pd4"
Apr 16 14:01:44.482682 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:44.482628 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-29pd4"]
Apr 16 14:01:44.878389 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:44.878353 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-29pd4" event={"ID":"e6284f77-08e3-4846-904d-6a21f10707ae","Type":"ContainerStarted","Data":"02d228167943c9e7160919f8b379a4e3f123ca3a02fe4283a16e62730b62ab00"}
Apr 16 14:01:45.882357 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:45.882271 2570 generic.go:358] "Generic (PLEG): container finished" podID="e39233e7-6f83-4e72-8e15-0f19ce865b49" containerID="400eb4e3feb3576ee03ad19d5ebd0349027e29d8157a7c2eb4fdb9229cc4f5af" exitCode=0
Apr 16 14:01:45.882357 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:45.882346 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" event={"ID":"e39233e7-6f83-4e72-8e15-0f19ce865b49","Type":"ContainerDied","Data":"400eb4e3feb3576ee03ad19d5ebd0349027e29d8157a7c2eb4fdb9229cc4f5af"}
Apr 16 14:01:45.882760 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:45.882711 2570 scope.go:117] "RemoveContainer" containerID="400eb4e3feb3576ee03ad19d5ebd0349027e29d8157a7c2eb4fdb9229cc4f5af"
Apr 16 14:01:46.889221 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:46.889122 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-29pd4" event={"ID":"e6284f77-08e3-4846-904d-6a21f10707ae","Type":"ContainerStarted","Data":"321d9f23b7f145fa0c96be97dc1f67063a39c73d4a09e746c38206146d3971ae"}
Apr 16 14:01:46.889221 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:46.889166 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-29pd4" event={"ID":"e6284f77-08e3-4846-904d-6a21f10707ae","Type":"ContainerStarted","Data":"ccebbd61e25a249b0773c89fc1500a0419b7295da0810a6b9ea7d2d61fb06ef6"}
Apr 16 14:01:46.890892 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:46.890868 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-5785d4fcdd-qgcsx" event={"ID":"e39233e7-6f83-4e72-8e15-0f19ce865b49","Type":"ContainerStarted","Data":"780229e00a0aeb31a82bf1bf7c5fa6dc553f4560cb92116b51c9ae09fa3faf0d"}
Apr 16 14:01:46.916136 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:46.916066 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-29pd4" podStartSLOduration=131.368439831 podStartE2EDuration="2m12.916046941s" podCreationTimestamp="2026-04-16 13:59:34 +0000 UTC" firstStartedPulling="2026-04-16 14:01:44.489078807 +0000 UTC m=+130.694721965" lastFinishedPulling="2026-04-16 14:01:46.036685929 +0000 UTC m=+132.242329075" observedRunningTime="2026-04-16 14:01:46.913722211 +0000 UTC m=+133.119365374" watchObservedRunningTime="2026-04-16 14:01:46.916046941 +0000 UTC m=+133.121690141"
Apr 16 14:01:50.904285 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:50.904256 2570 generic.go:358] "Generic (PLEG): container finished" podID="234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb" containerID="e2116bfba95c00ac457b921051d604ff04c693de0750873d827bb1dc4890ac40" exitCode=0
Apr 16 14:01:50.904719 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:50.904310 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wskzx" event={"ID":"234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb","Type":"ContainerDied","Data":"e2116bfba95c00ac457b921051d604ff04c693de0750873d827bb1dc4890ac40"}
Apr 16 14:01:50.904719 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:50.904637 2570 scope.go:117] "RemoveContainer" containerID="e2116bfba95c00ac457b921051d604ff04c693de0750873d827bb1dc4890ac40"
Apr 16 14:01:51.908653 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:51.908615 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-756bb7d76f-wskzx" event={"ID":"234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb","Type":"ContainerStarted","Data":"e6b2592ede427c97fd22a3bada580f2b2e43a23c43de24c1513b053e41c8d749"}
Apr 16 14:01:58.788440 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:58.788403 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:01:58.792511 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:01:58.792491 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7695b9cb7-4mmkk"
Apr 16 14:02:20.736765 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:20.736661 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:02:20.801577 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:20.801545 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:02:21.010231 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:21.010149 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:02:38.845455 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:38.845406 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:02:38.846283 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:38.846069 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="kube-rbac-proxy" containerID="cri-o://094e37cbbb4518179c349b658c82c821f1c95580d26356e6da4e78665dd540ce" gracePeriod=600
Apr 16 14:02:38.846283 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:38.846104 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="thanos-sidecar" containerID="cri-o://d88fb0ee2c5350fa7bbe6746dd437e41f99f8dd1df09a05bec662df8bd4ae28b" gracePeriod=600
Apr 16 14:02:38.846283 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:38.846108 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="kube-rbac-proxy-thanos" containerID="cri-o://e922ff92096ab61d87e3cb6d75507d466c644d576457bffc1c5389f95248cacc" gracePeriod=600
Apr 16 14:02:38.846283 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:38.846163 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="kube-rbac-proxy-web" containerID="cri-o://d90d8565d8c55f2582db28471b7968df76e69504655ee2d1f51c40442becfe05" gracePeriod=600
Apr 16 14:02:38.846283 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:38.846064 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="prometheus" containerID="cri-o://830c4237528c3826bdf9cbf97f505aca987881738952c583a09481700dfd5913" gracePeriod=600
Apr 16 14:02:38.846283 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:38.846256 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="config-reloader" containerID="cri-o://7e30259b98ab981df47addeaa615b21559934fa92bbcb02d75685e69d67ce328" gracePeriod=600
Apr 16 14:02:39.069890 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.069852 2570 generic.go:358] "Generic (PLEG): container finished" podID="28203d08-f398-4104-9ecd-65a25e29ac81" containerID="e922ff92096ab61d87e3cb6d75507d466c644d576457bffc1c5389f95248cacc" exitCode=0
Apr 16 14:02:39.069890 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.069885 2570 generic.go:358] "Generic (PLEG): container finished" podID="28203d08-f398-4104-9ecd-65a25e29ac81" containerID="094e37cbbb4518179c349b658c82c821f1c95580d26356e6da4e78665dd540ce" exitCode=0
Apr 16 14:02:39.069890 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.069894 2570 generic.go:358] "Generic (PLEG): container finished" podID="28203d08-f398-4104-9ecd-65a25e29ac81" containerID="d90d8565d8c55f2582db28471b7968df76e69504655ee2d1f51c40442becfe05" exitCode=0
Apr 16 14:02:39.069890 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.069903 2570 generic.go:358] "Generic (PLEG): container finished" podID="28203d08-f398-4104-9ecd-65a25e29ac81" containerID="d88fb0ee2c5350fa7bbe6746dd437e41f99f8dd1df09a05bec662df8bd4ae28b" exitCode=0
Apr 16 14:02:39.069890 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.069911 2570 generic.go:358] "Generic (PLEG): container finished" podID="28203d08-f398-4104-9ecd-65a25e29ac81" containerID="7e30259b98ab981df47addeaa615b21559934fa92bbcb02d75685e69d67ce328" exitCode=0
Apr 16 14:02:39.070228 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.069920 2570 generic.go:358] "Generic (PLEG): container finished" podID="28203d08-f398-4104-9ecd-65a25e29ac81" containerID="830c4237528c3826bdf9cbf97f505aca987881738952c583a09481700dfd5913" exitCode=0
Apr 16 14:02:39.070228 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.069925 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28203d08-f398-4104-9ecd-65a25e29ac81","Type":"ContainerDied","Data":"e922ff92096ab61d87e3cb6d75507d466c644d576457bffc1c5389f95248cacc"}
Apr 16 14:02:39.070228 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.069970 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28203d08-f398-4104-9ecd-65a25e29ac81","Type":"ContainerDied","Data":"094e37cbbb4518179c349b658c82c821f1c95580d26356e6da4e78665dd540ce"}
Apr 16 14:02:39.070228 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.069986 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28203d08-f398-4104-9ecd-65a25e29ac81","Type":"ContainerDied","Data":"d90d8565d8c55f2582db28471b7968df76e69504655ee2d1f51c40442becfe05"}
Apr 16 14:02:39.070228 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.070002 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28203d08-f398-4104-9ecd-65a25e29ac81","Type":"ContainerDied","Data":"d88fb0ee2c5350fa7bbe6746dd437e41f99f8dd1df09a05bec662df8bd4ae28b"}
Apr 16 14:02:39.070228 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.070015 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28203d08-f398-4104-9ecd-65a25e29ac81","Type":"ContainerDied","Data":"7e30259b98ab981df47addeaa615b21559934fa92bbcb02d75685e69d67ce328"}
Apr 16 14:02:39.070228 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.070026 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28203d08-f398-4104-9ecd-65a25e29ac81","Type":"ContainerDied","Data":"830c4237528c3826bdf9cbf97f505aca987881738952c583a09481700dfd5913"}
Apr 16 14:02:39.096490 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.096408 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:02:39.191264 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.191231 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-configmap-kubelet-serving-ca-bundle\") pod \"28203d08-f398-4104-9ecd-65a25e29ac81\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") "
Apr 16 14:02:39.191441 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.191272 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/28203d08-f398-4104-9ecd-65a25e29ac81-prometheus-k8s-db\") pod \"28203d08-f398-4104-9ecd-65a25e29ac81\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") "
Apr 16 14:02:39.191441 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.191302 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-thanos-prometheus-http-client-file\") pod \"28203d08-f398-4104-9ecd-65a25e29ac81\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") "
Apr 16 14:02:39.191441 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.191335 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-prometheus-k8s-tls\") pod \"28203d08-f398-4104-9ecd-65a25e29ac81\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") "
Apr 16 14:02:39.191642 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.191458 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"28203d08-f398-4104-9ecd-65a25e29ac81\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") "
Apr 16 14:02:39.191642 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.191551 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-web-config\") pod \"28203d08-f398-4104-9ecd-65a25e29ac81\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") "
Apr 16 14:02:39.191642 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.191584 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-metrics-client-certs\") pod \"28203d08-f398-4104-9ecd-65a25e29ac81\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") "
Apr 16 14:02:39.191642 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.191612 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/28203d08-f398-4104-9ecd-65a25e29ac81-tls-assets\") pod \"28203d08-f398-4104-9ecd-65a25e29ac81\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") "
Apr 16 14:02:39.191642 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.191639 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-configmap-serving-certs-ca-bundle\") pod \"28203d08-f398-4104-9ecd-65a25e29ac81\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") "
Apr 16 14:02:39.191887 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.191677 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-prometheus-trusted-ca-bundle\") pod \"28203d08-f398-4104-9ecd-65a25e29ac81\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") "
Apr 16 14:02:39.191887 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.191703 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-kube-rbac-proxy\") pod \"28203d08-f398-4104-9ecd-65a25e29ac81\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") "
Apr 16 14:02:39.191887 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.191702 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "28203d08-f398-4104-9ecd-65a25e29ac81" (UID: "28203d08-f398-4104-9ecd-65a25e29ac81"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:02:39.191887 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.191741 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/28203d08-f398-4104-9ecd-65a25e29ac81-config-out\") pod \"28203d08-f398-4104-9ecd-65a25e29ac81\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") "
Apr 16 14:02:39.191887 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.191772 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-prometheus-k8s-rulefiles-0\") pod \"28203d08-f398-4104-9ecd-65a25e29ac81\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") "
Apr 16 14:02:39.191887 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.191807 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-configmap-metrics-client-ca\") pod \"28203d08-f398-4104-9ecd-65a25e29ac81\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") "
Apr 16 14:02:39.191887 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.191842 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-config\") pod \"28203d08-f398-4104-9ecd-65a25e29ac81\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") "
Apr 16 14:02:39.191887 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.191884 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rpqw4\" (UniqueName: \"kubernetes.io/projected/28203d08-f398-4104-9ecd-65a25e29ac81-kube-api-access-rpqw4\") pod \"28203d08-f398-4104-9ecd-65a25e29ac81\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") "
Apr 16 14:02:39.192279 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.192089 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "28203d08-f398-4104-9ecd-65a25e29ac81" (UID: "28203d08-f398-4104-9ecd-65a25e29ac81"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:02:39.192848 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.192430 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28203d08-f398-4104-9ecd-65a25e29ac81-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "28203d08-f398-4104-9ecd-65a25e29ac81" (UID: "28203d08-f398-4104-9ecd-65a25e29ac81"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:02:39.192848 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.192785 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "28203d08-f398-4104-9ecd-65a25e29ac81" (UID: "28203d08-f398-4104-9ecd-65a25e29ac81"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:02:39.194766 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.193041 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "28203d08-f398-4104-9ecd-65a25e29ac81" (UID: "28203d08-f398-4104-9ecd-65a25e29ac81"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:02:39.194766 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.194421 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "28203d08-f398-4104-9ecd-65a25e29ac81" (UID: "28203d08-f398-4104-9ecd-65a25e29ac81"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 16 14:02:39.194766 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.194481 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"28203d08-f398-4104-9ecd-65a25e29ac81\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") "
Apr 16 14:02:39.194766 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.194561 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-grpc-tls\") pod \"28203d08-f398-4104-9ecd-65a25e29ac81\" (UID: \"28203d08-f398-4104-9ecd-65a25e29ac81\") "
Apr 16 14:02:39.195046 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.194834 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "28203d08-f398-4104-9ecd-65a25e29ac81" (UID: "28203d08-f398-4104-9ecd-65a25e29ac81"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:02:39.195046 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.194862 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:02:39.195046 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.194883 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-prometheus-trusted-ca-bundle\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:02:39.195046 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.194898 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:02:39.195046 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.194916 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-configmap-metrics-client-ca\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:02:39.195046 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.194932 2570 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/28203d08-f398-4104-9ecd-65a25e29ac81-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:02:39.195046 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.194946 2570 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/28203d08-f398-4104-9ecd-65a25e29ac81-prometheus-k8s-db\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:02:39.195560 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.195502 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/28203d08-f398-4104-9ecd-65a25e29ac81-config-out" (OuterVolumeSpecName: "config-out") pod "28203d08-f398-4104-9ecd-65a25e29ac81" (UID: "28203d08-f398-4104-9ecd-65a25e29ac81"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:02:39.195664 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.195557 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28203d08-f398-4104-9ecd-65a25e29ac81-kube-api-access-rpqw4" (OuterVolumeSpecName: "kube-api-access-rpqw4") pod "28203d08-f398-4104-9ecd-65a25e29ac81" (UID: "28203d08-f398-4104-9ecd-65a25e29ac81"). InnerVolumeSpecName "kube-api-access-rpqw4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:02:39.195882 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.195801 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "28203d08-f398-4104-9ecd-65a25e29ac81" (UID: "28203d08-f398-4104-9ecd-65a25e29ac81"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:02:39.195882 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.195851 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "28203d08-f398-4104-9ecd-65a25e29ac81" (UID: "28203d08-f398-4104-9ecd-65a25e29ac81"). InnerVolumeSpecName "secret-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:02:39.196059 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.196037 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "28203d08-f398-4104-9ecd-65a25e29ac81" (UID: "28203d08-f398-4104-9ecd-65a25e29ac81"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:02:39.196059 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.196032 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/28203d08-f398-4104-9ecd-65a25e29ac81-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "28203d08-f398-4104-9ecd-65a25e29ac81" (UID: "28203d08-f398-4104-9ecd-65a25e29ac81"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:02:39.196384 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.196352 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "28203d08-f398-4104-9ecd-65a25e29ac81" (UID: "28203d08-f398-4104-9ecd-65a25e29ac81"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:02:39.198396 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.198358 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-config" (OuterVolumeSpecName: "config") pod "28203d08-f398-4104-9ecd-65a25e29ac81" (UID: "28203d08-f398-4104-9ecd-65a25e29ac81"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:02:39.198640 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.198610 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "28203d08-f398-4104-9ecd-65a25e29ac81" (UID: "28203d08-f398-4104-9ecd-65a25e29ac81"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:02:39.199908 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.199881 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "28203d08-f398-4104-9ecd-65a25e29ac81" (UID: "28203d08-f398-4104-9ecd-65a25e29ac81"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:02:39.212003 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.211957 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-web-config" (OuterVolumeSpecName: "web-config") pod "28203d08-f398-4104-9ecd-65a25e29ac81" (UID: "28203d08-f398-4104-9ecd-65a25e29ac81"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:02:39.296245 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.296201 2570 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-thanos-prometheus-http-client-file\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:02:39.296245 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.296237 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-prometheus-k8s-tls\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:02:39.296245 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.296249 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:02:39.296245 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.296261 2570 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-web-config\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:02:39.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.296270 2570 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-metrics-client-certs\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:02:39.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.296279 2570 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/28203d08-f398-4104-9ecd-65a25e29ac81-tls-assets\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:02:39.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.296288 2570 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-kube-rbac-proxy\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:02:39.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.296295 2570 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/28203d08-f398-4104-9ecd-65a25e29ac81-config-out\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:02:39.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.296304 2570 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-config\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:02:39.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.296313 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rpqw4\" (UniqueName: \"kubernetes.io/projected/28203d08-f398-4104-9ecd-65a25e29ac81-kube-api-access-rpqw4\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:02:39.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.296322 2570 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:02:39.296517 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:39.296331 2570 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/28203d08-f398-4104-9ecd-65a25e29ac81-secret-grpc-tls\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:02:40.075937 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.075903 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"28203d08-f398-4104-9ecd-65a25e29ac81","Type":"ContainerDied","Data":"e5cbe573737c1b0b21db08b05c79b75710e1b2f7d05fe934701042483c590623"}
Apr 16 14:02:40.076394 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.075966 2570 scope.go:117] "RemoveContainer" containerID="e922ff92096ab61d87e3cb6d75507d466c644d576457bffc1c5389f95248cacc"
Apr 16 14:02:40.076394 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.075978 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:02:40.084730 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.084710 2570 scope.go:117] "RemoveContainer" containerID="094e37cbbb4518179c349b658c82c821f1c95580d26356e6da4e78665dd540ce"
Apr 16 14:02:40.092150 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.092130 2570 scope.go:117] "RemoveContainer" containerID="d90d8565d8c55f2582db28471b7968df76e69504655ee2d1f51c40442becfe05"
Apr 16 14:02:40.099780 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.099755 2570 scope.go:117] "RemoveContainer" containerID="d88fb0ee2c5350fa7bbe6746dd437e41f99f8dd1df09a05bec662df8bd4ae28b"
Apr 16 14:02:40.102397 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.102372 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:02:40.107043 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.107011 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:02:40.109655 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.109632 2570 scope.go:117] "RemoveContainer" containerID="7e30259b98ab981df47addeaa615b21559934fa92bbcb02d75685e69d67ce328"
Apr 16 14:02:40.119077 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.119057 2570 scope.go:117] "RemoveContainer" containerID="830c4237528c3826bdf9cbf97f505aca987881738952c583a09481700dfd5913"
Apr 16 14:02:40.126945 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.126921 2570 scope.go:117] "RemoveContainer" containerID="853af0c652b79622d8a25fef7fb178b7ad12263faf26cb058a1bb0ee3a460edc"
Apr 16 14:02:40.147514 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.147476 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:02:40.147839 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.147824 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="kube-rbac-proxy"
Apr 16 14:02:40.147889 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.147840 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="kube-rbac-proxy"
Apr 16 14:02:40.147889 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.147855 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="config-reloader"
Apr 16 14:02:40.147889 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.147860 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="config-reloader"
Apr 16 14:02:40.147889 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.147870 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="kube-rbac-proxy-web"
Apr 16 14:02:40.147889 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.147875 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="kube-rbac-proxy-web"
Apr 16 14:02:40.147889 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.147882 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="thanos-sidecar"
Apr 16 14:02:40.147889 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.147887 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="thanos-sidecar"
Apr 16 14:02:40.148095 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.147894 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="prometheus"
Apr 16 14:02:40.148095 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.147899 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="prometheus"
Apr 16 14:02:40.148095 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.147908 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="init-config-reloader"
Apr 16 14:02:40.148095 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.147913 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="init-config-reloader"
Apr 16 14:02:40.148095 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.147921 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="kube-rbac-proxy-thanos"
Apr 16 14:02:40.148095 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.147926 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="kube-rbac-proxy-thanos"
Apr 16 14:02:40.148095 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.147969 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="thanos-sidecar"
Apr 16 14:02:40.148095 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.147976 2570 memory_manager.go:356] "RemoveStaleState
removing state" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="kube-rbac-proxy-web" Apr 16 14:02:40.148095 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.147983 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="config-reloader" Apr 16 14:02:40.148095 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.147990 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="prometheus" Apr 16 14:02:40.148095 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.147996 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="kube-rbac-proxy" Apr 16 14:02:40.148095 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.148004 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" containerName="kube-rbac-proxy-thanos" Apr 16 14:02:40.152100 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.152074 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.154977 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.154947 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 16 14:02:40.155206 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.155190 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 16 14:02:40.155454 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.155437 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 16 14:02:40.155613 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.155523 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 16 14:02:40.155685 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.155560 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 16 14:02:40.155895 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.155880 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 16 14:02:40.156250 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.156233 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-8l3jl34optn5s\"" Apr 16 14:02:40.156338 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.156322 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 16 14:02:40.156394 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.156351 2570 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 16 14:02:40.156447 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.156422 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-m66wh\"" Apr 16 14:02:40.158489 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.158435 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 16 14:02:40.160214 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.160190 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 16 14:02:40.161432 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.161252 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 16 14:02:40.165122 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.165091 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 16 14:02:40.180335 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.180295 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 16 14:02:40.303933 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.303886 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-config\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.304117 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.303946 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.304117 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.303973 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.304117 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.303993 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.304117 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.304019 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n2xxw\" (UniqueName: \"kubernetes.io/projected/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-kube-api-access-n2xxw\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.304117 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.304084 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.304291 
ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.304142 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.304291 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.304186 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.304291 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.304225 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.304291 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.304251 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.304291 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.304278 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.304460 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.304323 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-web-config\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.304460 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.304368 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.304460 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.304402 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.304460 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.304432 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.304679 ip-10-0-133-133 kubenswrapper[2570]: I0416 
14:02:40.304503 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.304679 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.304576 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-config-out\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.304679 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.304609 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.308908 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.308875 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="28203d08-f398-4104-9ecd-65a25e29ac81" path="/var/lib/kubelet/pods/28203d08-f398-4104-9ecd-65a25e29ac81/volumes" Apr 16 14:02:40.406062 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.406029 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-config\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.406230 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.406074 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.406230 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.406094 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.406230 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.406119 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.406230 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.406154 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n2xxw\" (UniqueName: \"kubernetes.io/projected/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-kube-api-access-n2xxw\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.406230 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.406181 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 
14:02:40.406230 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.406206 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.406499 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.406429 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.406499 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.406476 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.406648 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.406506 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.406648 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.406564 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" 
(UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.406648 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.406592 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-web-config\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.406648 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.406617 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.406845 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.406668 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.406845 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.406735 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.406845 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.406767 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.406845 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.406806 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-config-out\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.407107 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.406846 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.407351 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.407237 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.407351 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.407237 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.408262 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.407567 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.408262 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.408160 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.409030 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.409008 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.409816 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.409794 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.410881 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.410850 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.411592 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.411517 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-config\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.411757 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.411729 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.412296 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.411896 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.412296 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.412057 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.412296 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.412247 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 16 14:02:40.412630 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.412606 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-config-out\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:02:40.412710 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.412683 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:02:40.412755 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.412690 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:02:40.412931 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.412908 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:02:40.413740 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.413722 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-web-config\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:02:40.427208 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.427171 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n2xxw\" (UniqueName: \"kubernetes.io/projected/c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4-kube-api-access-n2xxw\") pod \"prometheus-k8s-0\" (UID: \"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:02:40.467988 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.467948 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:02:40.625422 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:40.625382 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 16 14:02:40.626159 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:02:40.626132 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc83f9d8e_3333_4559_9ebc_6c7d1fbfecf4.slice/crio-74ceaf8287a177a04c38c4afbbba8edb868068b5f75bbe08eedb8fd6ccd3c61f WatchSource:0}: Error finding container 74ceaf8287a177a04c38c4afbbba8edb868068b5f75bbe08eedb8fd6ccd3c61f: Status 404 returned error can't find the container with id 74ceaf8287a177a04c38c4afbbba8edb868068b5f75bbe08eedb8fd6ccd3c61f
Apr 16 14:02:41.080553 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:41.080438 2570 generic.go:358] "Generic (PLEG): container finished" podID="c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4" containerID="44c421fcfb392ed9c42a0b2c4b1f4b74c307de3fbc42c5f351593c28c2a08cbb" exitCode=0
Apr 16 14:02:41.080958 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:41.080546 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4","Type":"ContainerDied","Data":"44c421fcfb392ed9c42a0b2c4b1f4b74c307de3fbc42c5f351593c28c2a08cbb"}
Apr 16 14:02:41.080958 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:41.080586 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4","Type":"ContainerStarted","Data":"74ceaf8287a177a04c38c4afbbba8edb868068b5f75bbe08eedb8fd6ccd3c61f"}
Apr 16 14:02:42.088295 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:42.088258 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4","Type":"ContainerStarted","Data":"1f08cc01d849537e7416ed16106f5714ed424d4b6e36f53455a090c6c5612570"}
Apr 16 14:02:42.088295 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:42.088297 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4","Type":"ContainerStarted","Data":"e045dcaf5ea2f8325ce36c66503c4029018fa248b9f813c5751b7a14b5a851cb"}
Apr 16 14:02:42.088731 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:42.088306 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4","Type":"ContainerStarted","Data":"fc99b5e2824e0e6e4092804e7521c8b4256933c042545705ea3756dab2f402c9"}
Apr 16 14:02:42.088731 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:42.088315 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4","Type":"ContainerStarted","Data":"1d77fc9811fdb7cf04eae565c9da05fab193761e4a662882165618eaf12abc74"}
Apr 16 14:02:42.088731 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:42.088324 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4","Type":"ContainerStarted","Data":"93c902e0dca7b78c2688023861e8d570b61be74591a88a89dd3c04239a1704fd"}
Apr 16 14:02:42.088731 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:42.088335 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4","Type":"ContainerStarted","Data":"36504181cc6e45c95e1c4a2c05272e5fbfb670275353153744645a0b6a48c77b"}
Apr 16 14:02:42.122658 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:42.122585 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.12256639 podStartE2EDuration="2.12256639s" podCreationTimestamp="2026-04-16 14:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:02:42.120394717 +0000 UTC m=+188.326037881" watchObservedRunningTime="2026-04-16 14:02:42.12256639 +0000 UTC m=+188.328209554"
Apr 16 14:02:45.468673 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:02:45.468637 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:31.865670 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:03:31.865635 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-hztbp"]
Apr 16 14:03:31.868011 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:03:31.867993 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hztbp"
Apr 16 14:03:31.871273 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:03:31.871247 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 16 14:03:31.881308 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:03:31.881278 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hztbp"]
Apr 16 14:03:31.943055 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:03:31.942998 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c0a1d356-9c19-4e1d-9761-5d6982c13212-kubelet-config\") pod \"global-pull-secret-syncer-hztbp\" (UID: \"c0a1d356-9c19-4e1d-9761-5d6982c13212\") " pod="kube-system/global-pull-secret-syncer-hztbp"
Apr 16 14:03:31.943055 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:03:31.943060 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c0a1d356-9c19-4e1d-9761-5d6982c13212-original-pull-secret\") pod \"global-pull-secret-syncer-hztbp\" (UID: \"c0a1d356-9c19-4e1d-9761-5d6982c13212\") " pod="kube-system/global-pull-secret-syncer-hztbp"
Apr 16 14:03:31.943273 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:03:31.943196 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c0a1d356-9c19-4e1d-9761-5d6982c13212-dbus\") pod \"global-pull-secret-syncer-hztbp\" (UID: \"c0a1d356-9c19-4e1d-9761-5d6982c13212\") " pod="kube-system/global-pull-secret-syncer-hztbp"
Apr 16 14:03:32.043917 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:03:32.043876 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c0a1d356-9c19-4e1d-9761-5d6982c13212-dbus\") pod \"global-pull-secret-syncer-hztbp\" (UID: \"c0a1d356-9c19-4e1d-9761-5d6982c13212\") " pod="kube-system/global-pull-secret-syncer-hztbp"
Apr 16 14:03:32.044138 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:03:32.043944 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c0a1d356-9c19-4e1d-9761-5d6982c13212-kubelet-config\") pod \"global-pull-secret-syncer-hztbp\" (UID: \"c0a1d356-9c19-4e1d-9761-5d6982c13212\") " pod="kube-system/global-pull-secret-syncer-hztbp"
Apr 16 14:03:32.044138 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:03:32.044026 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c0a1d356-9c19-4e1d-9761-5d6982c13212-original-pull-secret\") pod \"global-pull-secret-syncer-hztbp\" (UID: \"c0a1d356-9c19-4e1d-9761-5d6982c13212\") " pod="kube-system/global-pull-secret-syncer-hztbp"
Apr 16 14:03:32.044448 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:03:32.044410 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/c0a1d356-9c19-4e1d-9761-5d6982c13212-kubelet-config\") pod \"global-pull-secret-syncer-hztbp\" (UID: \"c0a1d356-9c19-4e1d-9761-5d6982c13212\") " pod="kube-system/global-pull-secret-syncer-hztbp"
Apr 16 14:03:32.044600 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:03:32.044417 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/c0a1d356-9c19-4e1d-9761-5d6982c13212-dbus\") pod \"global-pull-secret-syncer-hztbp\" (UID: \"c0a1d356-9c19-4e1d-9761-5d6982c13212\") " pod="kube-system/global-pull-secret-syncer-hztbp"
Apr 16 14:03:32.046521 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:03:32.046497 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/c0a1d356-9c19-4e1d-9761-5d6982c13212-original-pull-secret\") pod \"global-pull-secret-syncer-hztbp\" (UID: \"c0a1d356-9c19-4e1d-9761-5d6982c13212\") " pod="kube-system/global-pull-secret-syncer-hztbp"
Apr 16 14:03:32.178739 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:03:32.178695 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hztbp"
Apr 16 14:03:32.353767 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:03:32.353732 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hztbp"]
Apr 16 14:03:32.357442 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:03:32.357413 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc0a1d356_9c19_4e1d_9761_5d6982c13212.slice/crio-5a6287d4e437b0ddeaf4307f9833c27d892e919fc7f8af23091a30d534bd49a0 WatchSource:0}: Error finding container 5a6287d4e437b0ddeaf4307f9833c27d892e919fc7f8af23091a30d534bd49a0: Status 404 returned error can't find the container with id 5a6287d4e437b0ddeaf4307f9833c27d892e919fc7f8af23091a30d534bd49a0
Apr 16 14:03:33.243166 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:03:33.243115 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hztbp" event={"ID":"c0a1d356-9c19-4e1d-9761-5d6982c13212","Type":"ContainerStarted","Data":"5a6287d4e437b0ddeaf4307f9833c27d892e919fc7f8af23091a30d534bd49a0"}
Apr 16 14:03:37.256119 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:03:37.256081 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hztbp" event={"ID":"c0a1d356-9c19-4e1d-9761-5d6982c13212","Type":"ContainerStarted","Data":"87b5a5690d36b489fc921e6ddf4182e599eb6b3af6065d19c5c012db0c39172c"}
Apr 16 14:03:40.468651 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:03:40.468605 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:40.484779 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:03:40.484755 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:03:40.530334 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:03:40.530278 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hztbp" podStartSLOduration=5.437447293 podStartE2EDuration="9.530261897s" podCreationTimestamp="2026-04-16 14:03:31 +0000 UTC" firstStartedPulling="2026-04-16 14:03:32.359181328 +0000 UTC m=+238.564824473" lastFinishedPulling="2026-04-16 14:03:36.451995932 +0000 UTC m=+242.657639077" observedRunningTime="2026-04-16 14:03:37.28367621 +0000 UTC m=+243.489319375" watchObservedRunningTime="2026-04-16 14:03:40.530261897 +0000 UTC m=+246.735905064"
Apr 16 14:03:41.285568 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:03:41.285512 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 16 14:04:34.208583 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:04:34.208553 2570 kubelet.go:1628] "Image garbage collection succeeded"
Apr 16 14:09:11.073496 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:11.073461 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-gbcsj"]
Apr 16 14:09:11.076807 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:11.076790 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-gbcsj"
Apr 16 14:09:11.078900 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:11.078876 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 16 14:09:11.078999 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:11.078889 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 16 14:09:11.078999 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:11.078930 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-bqvxc\""
Apr 16 14:09:11.083272 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:11.083248 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-gbcsj"]
Apr 16 14:09:11.111021 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:11.110990 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eea532fd-20c6-4b41-90f2-482935b6d867-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-gbcsj\" (UID: \"eea532fd-20c6-4b41-90f2-482935b6d867\") " pod="cert-manager/cert-manager-webhook-597b96b99b-gbcsj"
Apr 16 14:09:11.111138 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:11.111071 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnswn\" (UniqueName: \"kubernetes.io/projected/eea532fd-20c6-4b41-90f2-482935b6d867-kube-api-access-jnswn\") pod \"cert-manager-webhook-597b96b99b-gbcsj\" (UID: \"eea532fd-20c6-4b41-90f2-482935b6d867\") " pod="cert-manager/cert-manager-webhook-597b96b99b-gbcsj"
Apr 16 14:09:11.211745 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:11.211715 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jnswn\" (UniqueName: \"kubernetes.io/projected/eea532fd-20c6-4b41-90f2-482935b6d867-kube-api-access-jnswn\") pod \"cert-manager-webhook-597b96b99b-gbcsj\" (UID: \"eea532fd-20c6-4b41-90f2-482935b6d867\") " pod="cert-manager/cert-manager-webhook-597b96b99b-gbcsj"
Apr 16 14:09:11.211877 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:11.211758 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eea532fd-20c6-4b41-90f2-482935b6d867-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-gbcsj\" (UID: \"eea532fd-20c6-4b41-90f2-482935b6d867\") " pod="cert-manager/cert-manager-webhook-597b96b99b-gbcsj"
Apr 16 14:09:11.219905 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:11.219876 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eea532fd-20c6-4b41-90f2-482935b6d867-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-gbcsj\" (UID: \"eea532fd-20c6-4b41-90f2-482935b6d867\") " pod="cert-manager/cert-manager-webhook-597b96b99b-gbcsj"
Apr 16 14:09:11.220023 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:11.219980 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jnswn\" (UniqueName: \"kubernetes.io/projected/eea532fd-20c6-4b41-90f2-482935b6d867-kube-api-access-jnswn\") pod \"cert-manager-webhook-597b96b99b-gbcsj\" (UID: \"eea532fd-20c6-4b41-90f2-482935b6d867\") " pod="cert-manager/cert-manager-webhook-597b96b99b-gbcsj"
Apr 16 14:09:11.395261 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:11.395175 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-gbcsj"
Apr 16 14:09:11.522804 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:11.522773 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-gbcsj"]
Apr 16 14:09:11.525967 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:09:11.525927 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeea532fd_20c6_4b41_90f2_482935b6d867.slice/crio-43517dd0c1e00cfc046a511e3b09297e3908ad19cfb8cddf417e6780181de7bc WatchSource:0}: Error finding container 43517dd0c1e00cfc046a511e3b09297e3908ad19cfb8cddf417e6780181de7bc: Status 404 returned error can't find the container with id 43517dd0c1e00cfc046a511e3b09297e3908ad19cfb8cddf417e6780181de7bc
Apr 16 14:09:11.528184 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:11.528167 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 14:09:12.221145 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:12.221082 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-gbcsj" event={"ID":"eea532fd-20c6-4b41-90f2-482935b6d867","Type":"ContainerStarted","Data":"43517dd0c1e00cfc046a511e3b09297e3908ad19cfb8cddf417e6780181de7bc"}
Apr 16 14:09:15.233982 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:15.233948 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-gbcsj" event={"ID":"eea532fd-20c6-4b41-90f2-482935b6d867","Type":"ContainerStarted","Data":"eaaf4b91be2a3ece905abd3f6ab4d483f2bae9f8164b6cf23497eacb1a0da113"}
Apr 16 14:09:15.234357 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:15.234064 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-gbcsj"
Apr 16 14:09:15.249722 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:15.249678 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-gbcsj" podStartSLOduration=1.126563328 podStartE2EDuration="4.249663593s" podCreationTimestamp="2026-04-16 14:09:11 +0000 UTC" firstStartedPulling="2026-04-16 14:09:11.528293341 +0000 UTC m=+577.733936487" lastFinishedPulling="2026-04-16 14:09:14.651393604 +0000 UTC m=+580.857036752" observedRunningTime="2026-04-16 14:09:15.247475233 +0000 UTC m=+581.453118396" watchObservedRunningTime="2026-04-16 14:09:15.249663593 +0000 UTC m=+581.455306756"
Apr 16 14:09:19.769903 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:19.769864 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-759f64656b-wvm78"]
Apr 16 14:09:19.773227 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:19.773208 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-wvm78"
Apr 16 14:09:19.775158 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:19.775136 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-fvpdm\""
Apr 16 14:09:19.782394 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:19.782369 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-wvm78"]
Apr 16 14:09:19.875994 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:19.875965 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqcb9\" (UniqueName: \"kubernetes.io/projected/4d7ce487-4026-4d7a-9487-cf2b47915831-kube-api-access-hqcb9\") pod \"cert-manager-759f64656b-wvm78\" (UID: \"4d7ce487-4026-4d7a-9487-cf2b47915831\") " pod="cert-manager/cert-manager-759f64656b-wvm78"
Apr 16 14:09:19.876114 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:19.876042 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d7ce487-4026-4d7a-9487-cf2b47915831-bound-sa-token\") pod \"cert-manager-759f64656b-wvm78\" (UID: \"4d7ce487-4026-4d7a-9487-cf2b47915831\") " pod="cert-manager/cert-manager-759f64656b-wvm78"
Apr 16 14:09:19.976832 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:19.976797 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d7ce487-4026-4d7a-9487-cf2b47915831-bound-sa-token\") pod \"cert-manager-759f64656b-wvm78\" (UID: \"4d7ce487-4026-4d7a-9487-cf2b47915831\") " pod="cert-manager/cert-manager-759f64656b-wvm78"
Apr 16 14:09:19.976979 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:19.976845 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hqcb9\" (UniqueName: \"kubernetes.io/projected/4d7ce487-4026-4d7a-9487-cf2b47915831-kube-api-access-hqcb9\") pod \"cert-manager-759f64656b-wvm78\" (UID: \"4d7ce487-4026-4d7a-9487-cf2b47915831\") " pod="cert-manager/cert-manager-759f64656b-wvm78"
Apr 16 14:09:19.985580 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:19.985558 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4d7ce487-4026-4d7a-9487-cf2b47915831-bound-sa-token\") pod \"cert-manager-759f64656b-wvm78\" (UID: \"4d7ce487-4026-4d7a-9487-cf2b47915831\") " pod="cert-manager/cert-manager-759f64656b-wvm78"
Apr 16 14:09:19.985777 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:19.985760 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqcb9\" (UniqueName: \"kubernetes.io/projected/4d7ce487-4026-4d7a-9487-cf2b47915831-kube-api-access-hqcb9\") pod \"cert-manager-759f64656b-wvm78\" (UID: \"4d7ce487-4026-4d7a-9487-cf2b47915831\") " pod="cert-manager/cert-manager-759f64656b-wvm78"
Apr 16 14:09:20.083174 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:20.083090 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-759f64656b-wvm78"
Apr 16 14:09:20.204963 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:20.204935 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-759f64656b-wvm78"]
Apr 16 14:09:20.207702 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:09:20.207664 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d7ce487_4026_4d7a_9487_cf2b47915831.slice/crio-b713e4534fad00815b15fc55cfade9c200faec878a3a96e3f612849ff0b8b8f9 WatchSource:0}: Error finding container b713e4534fad00815b15fc55cfade9c200faec878a3a96e3f612849ff0b8b8f9: Status 404 returned error can't find the container with id b713e4534fad00815b15fc55cfade9c200faec878a3a96e3f612849ff0b8b8f9
Apr 16 14:09:20.250941 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:20.250919 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-wvm78" event={"ID":"4d7ce487-4026-4d7a-9487-cf2b47915831","Type":"ContainerStarted","Data":"b713e4534fad00815b15fc55cfade9c200faec878a3a96e3f612849ff0b8b8f9"}
Apr 16 14:09:21.239306 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:21.239271 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-gbcsj"
Apr 16 14:09:21.255408 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:21.255375 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-759f64656b-wvm78" event={"ID":"4d7ce487-4026-4d7a-9487-cf2b47915831","Type":"ContainerStarted","Data":"0fbac5b1e34107da097d8cb8f34c00f14a1107091ed75e07ba7c48900bf2882f"}
Apr 16 14:09:21.274065 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:21.274019 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-759f64656b-wvm78" podStartSLOduration=2.273999426 podStartE2EDuration="2.273999426s" podCreationTimestamp="2026-04-16 14:09:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:09:21.272753028 +0000 UTC m=+587.478396192" watchObservedRunningTime="2026-04-16 14:09:21.273999426 +0000 UTC m=+587.479642602"
Apr 16 14:09:55.725834 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:55.725760 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-dc77c844c-hn9vm"]
Apr 16 14:09:55.727859 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:55.727837 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-dc77c844c-hn9vm"
Apr 16 14:09:55.730486 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:55.730463 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 16 14:09:55.730486 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:55.730481 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 16 14:09:55.731245 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:55.731061 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-5tl54\""
Apr 16 14:09:55.731245 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:55.731119 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 16 14:09:55.731245 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:55.731145 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 16 14:09:55.731245 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:55.731235 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 16 14:09:55.738754 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:55.738735 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-dc77c844c-hn9vm"]
Apr 16 14:09:55.899080 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:55.899039 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/56973a88-a650-4c60-9f0d-df76e2dc41ae-manager-config\") pod \"lws-controller-manager-dc77c844c-hn9vm\" (UID: \"56973a88-a650-4c60-9f0d-df76e2dc41ae\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-hn9vm"
Apr 16 14:09:55.899080 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:55.899081 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pzlx\" (UniqueName: \"kubernetes.io/projected/56973a88-a650-4c60-9f0d-df76e2dc41ae-kube-api-access-6pzlx\") pod \"lws-controller-manager-dc77c844c-hn9vm\" (UID: \"56973a88-a650-4c60-9f0d-df76e2dc41ae\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-hn9vm"
Apr 16 14:09:55.899361 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:55.899134 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/56973a88-a650-4c60-9f0d-df76e2dc41ae-metrics-cert\") pod \"lws-controller-manager-dc77c844c-hn9vm\" (UID: \"56973a88-a650-4c60-9f0d-df76e2dc41ae\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-hn9vm"
Apr 16 14:09:55.899361 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:55.899200 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56973a88-a650-4c60-9f0d-df76e2dc41ae-cert\") pod \"lws-controller-manager-dc77c844c-hn9vm\" (UID: \"56973a88-a650-4c60-9f0d-df76e2dc41ae\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-hn9vm"
Apr 16 14:09:56.000377 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:56.000288 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56973a88-a650-4c60-9f0d-df76e2dc41ae-cert\") pod \"lws-controller-manager-dc77c844c-hn9vm\" (UID: \"56973a88-a650-4c60-9f0d-df76e2dc41ae\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-hn9vm"
Apr 16 14:09:56.000566 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:56.000418 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/56973a88-a650-4c60-9f0d-df76e2dc41ae-manager-config\") pod \"lws-controller-manager-dc77c844c-hn9vm\" (UID: \"56973a88-a650-4c60-9f0d-df76e2dc41ae\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-hn9vm"
Apr 16 14:09:56.000566 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:56.000454 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6pzlx\" (UniqueName: \"kubernetes.io/projected/56973a88-a650-4c60-9f0d-df76e2dc41ae-kube-api-access-6pzlx\") pod \"lws-controller-manager-dc77c844c-hn9vm\" (UID: \"56973a88-a650-4c60-9f0d-df76e2dc41ae\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-hn9vm"
Apr 16 14:09:56.000694 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:56.000522 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/56973a88-a650-4c60-9f0d-df76e2dc41ae-metrics-cert\") pod \"lws-controller-manager-dc77c844c-hn9vm\" (UID: \"56973a88-a650-4c60-9f0d-df76e2dc41ae\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-hn9vm"
Apr 16 14:09:56.001230 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:56.001205 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/56973a88-a650-4c60-9f0d-df76e2dc41ae-manager-config\") pod \"lws-controller-manager-dc77c844c-hn9vm\" (UID: \"56973a88-a650-4c60-9f0d-df76e2dc41ae\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-hn9vm"
Apr 16 14:09:56.003408 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:56.003384 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/56973a88-a650-4c60-9f0d-df76e2dc41ae-metrics-cert\") pod \"lws-controller-manager-dc77c844c-hn9vm\" (UID: \"56973a88-a650-4c60-9f0d-df76e2dc41ae\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-hn9vm"
Apr 16 14:09:56.003519 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:56.003497 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/56973a88-a650-4c60-9f0d-df76e2dc41ae-cert\") pod \"lws-controller-manager-dc77c844c-hn9vm\" (UID: \"56973a88-a650-4c60-9f0d-df76e2dc41ae\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-hn9vm"
Apr 16 14:09:56.013100 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:56.013064 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pzlx\" (UniqueName: \"kubernetes.io/projected/56973a88-a650-4c60-9f0d-df76e2dc41ae-kube-api-access-6pzlx\") pod \"lws-controller-manager-dc77c844c-hn9vm\" (UID: \"56973a88-a650-4c60-9f0d-df76e2dc41ae\") " pod="openshift-lws-operator/lws-controller-manager-dc77c844c-hn9vm"
Apr 16 14:09:56.037920 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:56.037900 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-dc77c844c-hn9vm"
Apr 16 14:09:56.182187 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:56.182166 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-dc77c844c-hn9vm"]
Apr 16 14:09:56.184271 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:09:56.184240 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56973a88_a650_4c60_9f0d_df76e2dc41ae.slice/crio-a86cfaff87f8a738cad93f9ba8fe697fd0729fcfe8ec4b51431d4d10c3a41a19 WatchSource:0}: Error finding container a86cfaff87f8a738cad93f9ba8fe697fd0729fcfe8ec4b51431d4d10c3a41a19: Status 404 returned error can't find the container with id a86cfaff87f8a738cad93f9ba8fe697fd0729fcfe8ec4b51431d4d10c3a41a19
Apr 16 14:09:56.357055 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:09:56.356980 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-dc77c844c-hn9vm" event={"ID":"56973a88-a650-4c60-9f0d-df76e2dc41ae","Type":"ContainerStarted","Data":"a86cfaff87f8a738cad93f9ba8fe697fd0729fcfe8ec4b51431d4d10c3a41a19"}
Apr 16 14:10:02.377361 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:02.377320 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-dc77c844c-hn9vm" event={"ID":"56973a88-a650-4c60-9f0d-df76e2dc41ae","Type":"ContainerStarted","Data":"d3647b876d6b3f5c96abaee3be2f5f3be979c2ac84311cd2fd1a256c9b6b3a59"}
Apr 16 14:10:02.377764 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:02.377449 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-dc77c844c-hn9vm"
Apr 16 14:10:02.394842 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:02.394801 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-dc77c844c-hn9vm" podStartSLOduration=1.722265744 podStartE2EDuration="7.394788123s" podCreationTimestamp="2026-04-16 14:09:55 +0000 UTC" firstStartedPulling="2026-04-16 14:09:56.186236962 +0000 UTC m=+622.391880104" lastFinishedPulling="2026-04-16 14:10:01.858759339 +0000 UTC m=+628.064402483" observedRunningTime="2026-04-16 14:10:02.3933196 +0000 UTC m=+628.598962763" watchObservedRunningTime="2026-04-16 14:10:02.394788123 +0000 UTC m=+628.600431287"
Apr 16 14:10:13.382744 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:13.382710 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-dc77c844c-hn9vm"
Apr 16 14:10:31.995242 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:31.995206 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq"]
Apr 16 14:10:32.001427 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.001402 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq"
Apr 16 14:10:32.003571 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.003545 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 16 14:10:32.003709 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.003662 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 16 14:10:32.003773 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.003545 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 16 14:10:32.003877 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.003859 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-cvszz\""
Apr 16 14:10:32.013181 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.013153 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq"]
Apr 16 14:10:32.095396 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.095367 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sprhk\" (UniqueName: \"kubernetes.io/projected/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-kube-api-access-sprhk\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq"
Apr 16 14:10:32.095572 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.095410 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-istio-token\")
pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.095572 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.095439 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.095572 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.095509 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.095572 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.095556 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.095744 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.095587 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-istio-envoy\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.095744 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.095604 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.095744 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.095625 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.095744 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.095674 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.197097 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.197067 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-workload-socket\") pod 
\"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.197097 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.197099 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.197328 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.197115 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.197328 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.197131 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.197328 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.197270 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: 
\"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.197328 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.197315 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.197520 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.197436 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sprhk\" (UniqueName: \"kubernetes.io/projected/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-kube-api-access-sprhk\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.197520 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.197477 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.197520 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.197508 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " 
pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.197758 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.197587 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.197816 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.197794 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.197880 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.197849 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.197880 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.197865 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.197989 ip-10-0-133-133 kubenswrapper[2570]: 
I0416 14:10:32.197971 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.199586 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.199559 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.199785 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.199768 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.205214 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.205186 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.205335 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.205317 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sprhk\" (UniqueName: 
\"kubernetes.io/projected/8d0b7d74-4c84-4b40-9ef3-5c1e6641a116-kube-api-access-sprhk\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-2l2lq\" (UID: \"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.314855 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.314802 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:32.441685 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.441654 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq"] Apr 16 14:10:32.462949 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:32.462920 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" event={"ID":"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116","Type":"ContainerStarted","Data":"be0f0111ca9e454a4c774c6f8247913b7f07927b385a833192811f8320e934b9"} Apr 16 14:10:34.778417 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:34.778381 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 16 14:10:34.778706 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:34.778461 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 16 14:10:34.778706 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:34.778491 2570 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"30892164Ki","pods":"250"} Apr 16 14:10:35.473819 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:35.473786 2570 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" event={"ID":"8d0b7d74-4c84-4b40-9ef3-5c1e6641a116","Type":"ContainerStarted","Data":"4e1fed849849a5d6089944940aacc00efbdeecc3e012d7ce5f0e31454d09ec66"} Apr 16 14:10:35.494816 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:35.494772 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" podStartSLOduration=2.16696756 podStartE2EDuration="4.494759842s" podCreationTimestamp="2026-04-16 14:10:31 +0000 UTC" firstStartedPulling="2026-04-16 14:10:32.450331541 +0000 UTC m=+658.655974686" lastFinishedPulling="2026-04-16 14:10:34.778123824 +0000 UTC m=+660.983766968" observedRunningTime="2026-04-16 14:10:35.49371387 +0000 UTC m=+661.699357046" watchObservedRunningTime="2026-04-16 14:10:35.494759842 +0000 UTC m=+661.700403005" Apr 16 14:10:36.315241 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:36.315211 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:36.319876 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:36.319852 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:36.477108 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:36.477072 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:36.478132 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:36.478112 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-2l2lq" Apr 16 14:10:53.350319 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:53.350283 
2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hltvm"] Apr 16 14:10:53.353644 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:53.353624 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hltvm" Apr 16 14:10:53.355950 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:53.355928 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 14:10:53.356068 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:53.356051 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-8cdjl\"" Apr 16 14:10:53.356599 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:53.356584 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 14:10:53.365479 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:53.365456 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hltvm"] Apr 16 14:10:53.388419 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:53.388375 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r7kk\" (UniqueName: \"kubernetes.io/projected/b04cc4a4-6e86-47ab-a3be-7800cdf133d0-kube-api-access-4r7kk\") pod \"limitador-operator-controller-manager-c7fb4c8d5-hltvm\" (UID: \"b04cc4a4-6e86-47ab-a3be-7800cdf133d0\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hltvm" Apr 16 14:10:53.489042 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:53.489010 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4r7kk\" (UniqueName: 
\"kubernetes.io/projected/b04cc4a4-6e86-47ab-a3be-7800cdf133d0-kube-api-access-4r7kk\") pod \"limitador-operator-controller-manager-c7fb4c8d5-hltvm\" (UID: \"b04cc4a4-6e86-47ab-a3be-7800cdf133d0\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hltvm" Apr 16 14:10:53.506802 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:53.506768 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r7kk\" (UniqueName: \"kubernetes.io/projected/b04cc4a4-6e86-47ab-a3be-7800cdf133d0-kube-api-access-4r7kk\") pod \"limitador-operator-controller-manager-c7fb4c8d5-hltvm\" (UID: \"b04cc4a4-6e86-47ab-a3be-7800cdf133d0\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hltvm" Apr 16 14:10:53.664168 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:53.664135 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hltvm" Apr 16 14:10:53.805061 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:53.805035 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hltvm"] Apr 16 14:10:53.807767 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:10:53.807734 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb04cc4a4_6e86_47ab_a3be_7800cdf133d0.slice/crio-02d0029b1b6637ee381516cad4062dc9f1d007a7342b78e5f5cf2b7019733180 WatchSource:0}: Error finding container 02d0029b1b6637ee381516cad4062dc9f1d007a7342b78e5f5cf2b7019733180: Status 404 returned error can't find the container with id 02d0029b1b6637ee381516cad4062dc9f1d007a7342b78e5f5cf2b7019733180 Apr 16 14:10:54.533090 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:54.533059 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hltvm" 
event={"ID":"b04cc4a4-6e86-47ab-a3be-7800cdf133d0","Type":"ContainerStarted","Data":"02d0029b1b6637ee381516cad4062dc9f1d007a7342b78e5f5cf2b7019733180"} Apr 16 14:10:56.541227 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:56.541187 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hltvm" event={"ID":"b04cc4a4-6e86-47ab-a3be-7800cdf133d0","Type":"ContainerStarted","Data":"72ab3114370f43a341e837534e9db2538b4d06d15e58754eb0f2e639cb7e5e90"} Apr 16 14:10:56.541590 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:56.541255 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hltvm" Apr 16 14:10:56.558952 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:10:56.558905 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hltvm" podStartSLOduration=0.960659686 podStartE2EDuration="3.558891205s" podCreationTimestamp="2026-04-16 14:10:53 +0000 UTC" firstStartedPulling="2026-04-16 14:10:53.809735158 +0000 UTC m=+680.015378302" lastFinishedPulling="2026-04-16 14:10:56.407966672 +0000 UTC m=+682.613609821" observedRunningTime="2026-04-16 14:10:56.556961941 +0000 UTC m=+682.762605105" watchObservedRunningTime="2026-04-16 14:10:56.558891205 +0000 UTC m=+682.764534450" Apr 16 14:11:07.546895 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:11:07.546864 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-hltvm" Apr 16 14:11:45.454721 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:11:45.454683 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-pkszh"] Apr 16 14:11:45.461465 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:11:45.461441 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-pkszh" Apr 16 14:11:45.463442 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:11:45.463416 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-9bhj5\"" Apr 16 14:11:45.463835 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:11:45.463811 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 14:11:45.464311 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:11:45.464292 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-pkszh"] Apr 16 14:11:45.487107 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:11:45.487075 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-pkszh"] Apr 16 14:11:45.542382 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:11:45.542344 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chcvz\" (UniqueName: \"kubernetes.io/projected/a3cf85e1-1240-4bae-af6b-e2a75f3bd779-kube-api-access-chcvz\") pod \"limitador-limitador-67566c68b4-pkszh\" (UID: \"a3cf85e1-1240-4bae-af6b-e2a75f3bd779\") " pod="kuadrant-system/limitador-limitador-67566c68b4-pkszh" Apr 16 14:11:45.542382 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:11:45.542387 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a3cf85e1-1240-4bae-af6b-e2a75f3bd779-config-file\") pod \"limitador-limitador-67566c68b4-pkszh\" (UID: \"a3cf85e1-1240-4bae-af6b-e2a75f3bd779\") " pod="kuadrant-system/limitador-limitador-67566c68b4-pkszh" Apr 16 14:11:45.643310 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:11:45.643278 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-chcvz\" (UniqueName: \"kubernetes.io/projected/a3cf85e1-1240-4bae-af6b-e2a75f3bd779-kube-api-access-chcvz\") pod \"limitador-limitador-67566c68b4-pkszh\" (UID: \"a3cf85e1-1240-4bae-af6b-e2a75f3bd779\") " pod="kuadrant-system/limitador-limitador-67566c68b4-pkszh" Apr 16 14:11:45.643449 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:11:45.643318 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a3cf85e1-1240-4bae-af6b-e2a75f3bd779-config-file\") pod \"limitador-limitador-67566c68b4-pkszh\" (UID: \"a3cf85e1-1240-4bae-af6b-e2a75f3bd779\") " pod="kuadrant-system/limitador-limitador-67566c68b4-pkszh" Apr 16 14:11:45.643956 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:11:45.643933 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/a3cf85e1-1240-4bae-af6b-e2a75f3bd779-config-file\") pod \"limitador-limitador-67566c68b4-pkszh\" (UID: \"a3cf85e1-1240-4bae-af6b-e2a75f3bd779\") " pod="kuadrant-system/limitador-limitador-67566c68b4-pkszh" Apr 16 14:11:45.651043 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:11:45.651023 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-chcvz\" (UniqueName: \"kubernetes.io/projected/a3cf85e1-1240-4bae-af6b-e2a75f3bd779-kube-api-access-chcvz\") pod \"limitador-limitador-67566c68b4-pkszh\" (UID: \"a3cf85e1-1240-4bae-af6b-e2a75f3bd779\") " pod="kuadrant-system/limitador-limitador-67566c68b4-pkszh" Apr 16 14:11:45.774127 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:11:45.774056 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-pkszh" Apr 16 14:11:45.903028 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:11:45.903008 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-pkszh"] Apr 16 14:11:45.905742 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:11:45.905712 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3cf85e1_1240_4bae_af6b_e2a75f3bd779.slice/crio-e808b4f9b09ee5423ede2397e831e60dffebc46d4841e1bd49a7f8a42c204c80 WatchSource:0}: Error finding container e808b4f9b09ee5423ede2397e831e60dffebc46d4841e1bd49a7f8a42c204c80: Status 404 returned error can't find the container with id e808b4f9b09ee5423ede2397e831e60dffebc46d4841e1bd49a7f8a42c204c80 Apr 16 14:11:46.708404 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:11:46.708366 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-pkszh" event={"ID":"a3cf85e1-1240-4bae-af6b-e2a75f3bd779","Type":"ContainerStarted","Data":"e808b4f9b09ee5423ede2397e831e60dffebc46d4841e1bd49a7f8a42c204c80"} Apr 16 14:11:50.723930 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:11:50.723890 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-pkszh" event={"ID":"a3cf85e1-1240-4bae-af6b-e2a75f3bd779","Type":"ContainerStarted","Data":"05d9d7ad878e73572605829f9efe6f01c5661a2cecdb952fa0a60f214e6f33cb"} Apr 16 14:11:50.724365 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:11:50.724007 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-pkszh" Apr 16 14:11:50.740570 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:11:50.740511 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-pkszh" podStartSLOduration=1.91575689 
podStartE2EDuration="5.740499882s" podCreationTimestamp="2026-04-16 14:11:45 +0000 UTC" firstStartedPulling="2026-04-16 14:11:45.907600587 +0000 UTC m=+732.113243729" lastFinishedPulling="2026-04-16 14:11:49.732343568 +0000 UTC m=+735.937986721" observedRunningTime="2026-04-16 14:11:50.73821295 +0000 UTC m=+736.943856111" watchObservedRunningTime="2026-04-16 14:11:50.740499882 +0000 UTC m=+736.946143046" Apr 16 14:12:01.728106 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:12:01.728076 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-pkszh" Apr 16 14:16:19.421850 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.421809 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2"] Apr 16 14:16:19.424431 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.424407 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:16:19.426811 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.426785 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 14:16:19.426918 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.426815 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 14:16:19.426918 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.426825 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 16 14:16:19.426918 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.426899 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xs8cb\"" Apr 16 
14:16:19.437576 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.437524 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2"] Apr 16 14:16:19.464021 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.463979 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgglh\" (UniqueName: \"kubernetes.io/projected/d3467049-6924-4132-8a62-5171aa6ce551-kube-api-access-sgglh\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:16:19.464149 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.464042 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d3467049-6924-4132-8a62-5171aa6ce551-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:16:19.464149 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.464103 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:16:19.464149 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.464143 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:16:19.464340 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.464192 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:16:19.464340 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.464221 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:16:19.565344 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.565315 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:16:19.565462 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.565353 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:16:19.565462 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.565382 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:16:19.565462 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.565400 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:16:19.565462 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.565452 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sgglh\" (UniqueName: \"kubernetes.io/projected/d3467049-6924-4132-8a62-5171aa6ce551-kube-api-access-sgglh\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:16:19.565718 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.565484 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/d3467049-6924-4132-8a62-5171aa6ce551-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:16:19.565825 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.565803 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:16:19.565887 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.565858 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:16:19.565950 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.565906 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:16:19.567788 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.567762 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-dshm\") pod 
\"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:16:19.568040 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.568023 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d3467049-6924-4132-8a62-5171aa6ce551-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:16:19.573963 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.573940 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgglh\" (UniqueName: \"kubernetes.io/projected/d3467049-6924-4132-8a62-5171aa6ce551-kube-api-access-sgglh\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:16:19.738196 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.738125 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:16:19.865125 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.864988 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2"] Apr 16 14:16:19.867997 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:16:19.867970 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd3467049_6924_4132_8a62_5171aa6ce551.slice/crio-784f85db7ba1f99837a66ba09e60780749b31eca2c363b1c2dcc18298c717634 WatchSource:0}: Error finding container 784f85db7ba1f99837a66ba09e60780749b31eca2c363b1c2dcc18298c717634: Status 404 returned error can't find the container with id 784f85db7ba1f99837a66ba09e60780749b31eca2c363b1c2dcc18298c717634 Apr 16 14:16:19.869666 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:19.869651 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:16:20.634390 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:20.634338 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" event={"ID":"d3467049-6924-4132-8a62-5171aa6ce551","Type":"ContainerStarted","Data":"784f85db7ba1f99837a66ba09e60780749b31eca2c363b1c2dcc18298c717634"} Apr 16 14:16:23.648885 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:23.648793 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" event={"ID":"d3467049-6924-4132-8a62-5171aa6ce551","Type":"ContainerStarted","Data":"4156b8e23fe84e1ff79cd9455cfa004ec7be828d6d101befdc67ff3481a899b2"} Apr 16 14:16:27.665911 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:27.665874 2570 generic.go:358] "Generic (PLEG): container finished" 
podID="d3467049-6924-4132-8a62-5171aa6ce551" containerID="4156b8e23fe84e1ff79cd9455cfa004ec7be828d6d101befdc67ff3481a899b2" exitCode=0 Apr 16 14:16:27.666307 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:27.665954 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" event={"ID":"d3467049-6924-4132-8a62-5171aa6ce551","Type":"ContainerDied","Data":"4156b8e23fe84e1ff79cd9455cfa004ec7be828d6d101befdc67ff3481a899b2"} Apr 16 14:16:51.251146 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:51.251109 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6"] Apr 16 14:16:51.544590 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:51.544490 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6"] Apr 16 14:16:51.545180 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:51.545158 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:16:51.547603 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:51.547582 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 16 14:16:51.692391 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:51.692345 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-576666b544-2p4k6\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:16:51.692596 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:51.692417 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/52b392df-9754-431e-81a0-e9e419bf88d9-tls-certs\") pod \"precise-prefix-cache-test-kserve-576666b544-2p4k6\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:16:51.692596 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:51.692474 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-model-cache\") pod \"precise-prefix-cache-test-kserve-576666b544-2p4k6\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:16:51.692596 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:51.692518 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkdpm\" (UniqueName: 
\"kubernetes.io/projected/52b392df-9754-431e-81a0-e9e419bf88d9-kube-api-access-tkdpm\") pod \"precise-prefix-cache-test-kserve-576666b544-2p4k6\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:16:51.692596 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:51.692578 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-home\") pod \"precise-prefix-cache-test-kserve-576666b544-2p4k6\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:16:51.692796 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:51.692654 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-dshm\") pod \"precise-prefix-cache-test-kserve-576666b544-2p4k6\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:16:51.793325 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:51.793278 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-576666b544-2p4k6\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:16:51.793526 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:51.793356 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/52b392df-9754-431e-81a0-e9e419bf88d9-tls-certs\") pod \"precise-prefix-cache-test-kserve-576666b544-2p4k6\" (UID: 
\"52b392df-9754-431e-81a0-e9e419bf88d9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:16:51.793526 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:51.793401 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-model-cache\") pod \"precise-prefix-cache-test-kserve-576666b544-2p4k6\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:16:51.793526 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:51.793424 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkdpm\" (UniqueName: \"kubernetes.io/projected/52b392df-9754-431e-81a0-e9e419bf88d9-kube-api-access-tkdpm\") pod \"precise-prefix-cache-test-kserve-576666b544-2p4k6\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:16:51.793526 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:51.793456 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-home\") pod \"precise-prefix-cache-test-kserve-576666b544-2p4k6\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:16:51.793526 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:51.793524 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-dshm\") pod \"precise-prefix-cache-test-kserve-576666b544-2p4k6\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:16:51.793927 ip-10-0-133-133 kubenswrapper[2570]: I0416 
14:16:51.793875 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-576666b544-2p4k6\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:16:51.794082 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:51.793924 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-model-cache\") pod \"precise-prefix-cache-test-kserve-576666b544-2p4k6\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:16:51.794170 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:51.794140 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-home\") pod \"precise-prefix-cache-test-kserve-576666b544-2p4k6\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:16:51.796426 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:51.796368 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-dshm\") pod \"precise-prefix-cache-test-kserve-576666b544-2p4k6\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:16:51.796614 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:51.796593 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/52b392df-9754-431e-81a0-e9e419bf88d9-tls-certs\") pod 
\"precise-prefix-cache-test-kserve-576666b544-2p4k6\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:16:51.801812 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:51.801793 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkdpm\" (UniqueName: \"kubernetes.io/projected/52b392df-9754-431e-81a0-e9e419bf88d9-kube-api-access-tkdpm\") pod \"precise-prefix-cache-test-kserve-576666b544-2p4k6\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:16:51.858701 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:51.858646 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:16:54.649673 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:54.649641 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6"] Apr 16 14:16:54.652888 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:16:54.652857 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52b392df_9754_431e_81a0_e9e419bf88d9.slice/crio-c847f1cbb74fa13c2a9b27ea53928ce7de1e8c4bdc579c06edef3577d541465c WatchSource:0}: Error finding container c847f1cbb74fa13c2a9b27ea53928ce7de1e8c4bdc579c06edef3577d541465c: Status 404 returned error can't find the container with id c847f1cbb74fa13c2a9b27ea53928ce7de1e8c4bdc579c06edef3577d541465c Apr 16 14:16:54.783291 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:54.783205 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" 
event={"ID":"52b392df-9754-431e-81a0-e9e419bf88d9","Type":"ContainerStarted","Data":"158ab00624a2ef45875c19cddc761384d3bbb2780135d97140f4ba355082c13e"} Apr 16 14:16:54.783291 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:54.783249 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" event={"ID":"52b392df-9754-431e-81a0-e9e419bf88d9","Type":"ContainerStarted","Data":"c847f1cbb74fa13c2a9b27ea53928ce7de1e8c4bdc579c06edef3577d541465c"} Apr 16 14:16:55.789103 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:55.789071 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" event={"ID":"d3467049-6924-4132-8a62-5171aa6ce551","Type":"ContainerStarted","Data":"b165e48a26a2cc423fae8f66da83ef1b09e72b77ac1c0bc6eaf28cbd2065a280"} Apr 16 14:16:55.808356 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:55.808285 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" podStartSLOduration=1.2834718999999999 podStartE2EDuration="36.808262408s" podCreationTimestamp="2026-04-16 14:16:19 +0000 UTC" firstStartedPulling="2026-04-16 14:16:19.869776298 +0000 UTC m=+1006.075419440" lastFinishedPulling="2026-04-16 14:16:55.394566801 +0000 UTC m=+1041.600209948" observedRunningTime="2026-04-16 14:16:55.807957048 +0000 UTC m=+1042.013600227" watchObservedRunningTime="2026-04-16 14:16:55.808262408 +0000 UTC m=+1042.013905573" Apr 16 14:16:59.738768 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:59.738731 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:16:59.739208 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:59.738878 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:16:59.740596 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:59.740520 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" podUID="d3467049-6924-4132-8a62-5171aa6ce551" containerName="main" probeResult="failure" output="Get \"https://10.134.0.31:8000/health\": dial tcp 10.134.0.31:8000: connect: connection refused" Apr 16 14:16:59.805339 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:59.805300 2570 generic.go:358] "Generic (PLEG): container finished" podID="52b392df-9754-431e-81a0-e9e419bf88d9" containerID="158ab00624a2ef45875c19cddc761384d3bbb2780135d97140f4ba355082c13e" exitCode=0 Apr 16 14:16:59.805506 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:16:59.805372 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" event={"ID":"52b392df-9754-431e-81a0-e9e419bf88d9","Type":"ContainerDied","Data":"158ab00624a2ef45875c19cddc761384d3bbb2780135d97140f4ba355082c13e"} Apr 16 14:17:01.815058 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:01.814975 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" event={"ID":"52b392df-9754-431e-81a0-e9e419bf88d9","Type":"ContainerStarted","Data":"43900619e2b878440cb914cf0644de5759bf3ee295ec71ec5ada61e57e7eee2e"} Apr 16 14:17:01.835306 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:01.835249 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" podStartSLOduration=9.122263529 podStartE2EDuration="10.835233765s" podCreationTimestamp="2026-04-16 14:16:51 +0000 UTC" firstStartedPulling="2026-04-16 14:16:59.806645825 +0000 UTC m=+1046.012288967" lastFinishedPulling="2026-04-16 14:17:01.519616062 +0000 
UTC m=+1047.725259203" observedRunningTime="2026-04-16 14:17:01.833122838 +0000 UTC m=+1048.038765999" watchObservedRunningTime="2026-04-16 14:17:01.835233765 +0000 UTC m=+1048.040876929" Apr 16 14:17:01.859113 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:01.859080 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:17:01.859286 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:01.859138 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:17:01.871692 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:01.871661 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:17:02.830677 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:02.830649 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" Apr 16 14:17:09.738952 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:09.738902 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" podUID="d3467049-6924-4132-8a62-5171aa6ce551" containerName="main" probeResult="failure" output="Get \"https://10.134.0.31:8000/health\": dial tcp 10.134.0.31:8000: connect: connection refused" Apr 16 14:17:19.739522 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:19.739429 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" podUID="d3467049-6924-4132-8a62-5171aa6ce551" containerName="main" probeResult="failure" output="Get \"https://10.134.0.31:8000/health\": dial tcp 10.134.0.31:8000: connect: connection refused" Apr 16 14:17:29.739197 ip-10-0-133-133 
kubenswrapper[2570]: I0416 14:17:29.739150 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" podUID="d3467049-6924-4132-8a62-5171aa6ce551" containerName="main" probeResult="failure" output="Get \"https://10.134.0.31:8000/health\": dial tcp 10.134.0.31:8000: connect: connection refused" Apr 16 14:17:37.381413 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:37.381381 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6"] Apr 16 14:17:37.381906 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:37.381779 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" podUID="52b392df-9754-431e-81a0-e9e419bf88d9" containerName="main" containerID="cri-o://43900619e2b878440cb914cf0644de5759bf3ee295ec71ec5ada61e57e7eee2e" gracePeriod=30 Apr 16 14:17:37.951732 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:37.951694 2570 generic.go:358] "Generic (PLEG): container finished" podID="52b392df-9754-431e-81a0-e9e419bf88d9" containerID="43900619e2b878440cb914cf0644de5759bf3ee295ec71ec5ada61e57e7eee2e" exitCode=0 Apr 16 14:17:37.951923 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:37.951778 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" event={"ID":"52b392df-9754-431e-81a0-e9e419bf88d9","Type":"ContainerDied","Data":"43900619e2b878440cb914cf0644de5759bf3ee295ec71ec5ada61e57e7eee2e"} Apr 16 14:17:38.158066 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.158043 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6"
Apr 16 14:17:38.237890 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.237805 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/52b392df-9754-431e-81a0-e9e419bf88d9-tls-certs\") pod \"52b392df-9754-431e-81a0-e9e419bf88d9\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") "
Apr 16 14:17:38.237890 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.237859 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-dshm\") pod \"52b392df-9754-431e-81a0-e9e419bf88d9\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") "
Apr 16 14:17:38.237890 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.237877 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-home\") pod \"52b392df-9754-431e-81a0-e9e419bf88d9\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") "
Apr 16 14:17:38.238165 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.237900 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tkdpm\" (UniqueName: \"kubernetes.io/projected/52b392df-9754-431e-81a0-e9e419bf88d9-kube-api-access-tkdpm\") pod \"52b392df-9754-431e-81a0-e9e419bf88d9\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") "
Apr 16 14:17:38.238165 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.237927 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-model-cache\") pod \"52b392df-9754-431e-81a0-e9e419bf88d9\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") "
Apr 16 14:17:38.238165 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.238020 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-kserve-provision-location\") pod \"52b392df-9754-431e-81a0-e9e419bf88d9\" (UID: \"52b392df-9754-431e-81a0-e9e419bf88d9\") "
Apr 16 14:17:38.238323 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.238182 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-home" (OuterVolumeSpecName: "home") pod "52b392df-9754-431e-81a0-e9e419bf88d9" (UID: "52b392df-9754-431e-81a0-e9e419bf88d9"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:17:38.238323 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.238220 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-model-cache" (OuterVolumeSpecName: "model-cache") pod "52b392df-9754-431e-81a0-e9e419bf88d9" (UID: "52b392df-9754-431e-81a0-e9e419bf88d9"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:17:38.238430 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.238407 2570 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-home\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:17:38.238489 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.238429 2570 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-model-cache\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:17:38.240324 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.240294 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-dshm" (OuterVolumeSpecName: "dshm") pod "52b392df-9754-431e-81a0-e9e419bf88d9" (UID: "52b392df-9754-431e-81a0-e9e419bf88d9"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:17:38.240454 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.240418 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52b392df-9754-431e-81a0-e9e419bf88d9-kube-api-access-tkdpm" (OuterVolumeSpecName: "kube-api-access-tkdpm") pod "52b392df-9754-431e-81a0-e9e419bf88d9" (UID: "52b392df-9754-431e-81a0-e9e419bf88d9"). InnerVolumeSpecName "kube-api-access-tkdpm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:17:38.240909 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.240881 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/52b392df-9754-431e-81a0-e9e419bf88d9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "52b392df-9754-431e-81a0-e9e419bf88d9" (UID: "52b392df-9754-431e-81a0-e9e419bf88d9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:17:38.295351 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.295305 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "52b392df-9754-431e-81a0-e9e419bf88d9" (UID: "52b392df-9754-431e-81a0-e9e419bf88d9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:17:38.339302 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.339268 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-kserve-provision-location\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:17:38.339302 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.339302 2570 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/52b392df-9754-431e-81a0-e9e419bf88d9-tls-certs\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:17:38.339468 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.339312 2570 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/52b392df-9754-431e-81a0-e9e419bf88d9-dshm\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:17:38.339468 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.339321 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tkdpm\" (UniqueName: \"kubernetes.io/projected/52b392df-9754-431e-81a0-e9e419bf88d9-kube-api-access-tkdpm\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:17:38.956819 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.956791 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6"
Apr 16 14:17:38.957245 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.956815 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6" event={"ID":"52b392df-9754-431e-81a0-e9e419bf88d9","Type":"ContainerDied","Data":"c847f1cbb74fa13c2a9b27ea53928ce7de1e8c4bdc579c06edef3577d541465c"}
Apr 16 14:17:38.957245 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.956858 2570 scope.go:117] "RemoveContainer" containerID="43900619e2b878440cb914cf0644de5759bf3ee295ec71ec5ada61e57e7eee2e"
Apr 16 14:17:38.965291 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:38.965273 2570 scope.go:117] "RemoveContainer" containerID="158ab00624a2ef45875c19cddc761384d3bbb2780135d97140f4ba355082c13e"
Apr 16 14:17:39.000257 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:39.000222 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6"]
Apr 16 14:17:39.002320 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:39.002291 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-576666b544-2p4k6"]
Apr 16 14:17:39.739517 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:39.739465 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" podUID="d3467049-6924-4132-8a62-5171aa6ce551" containerName="main" probeResult="failure" output="Get \"https://10.134.0.31:8000/health\": dial tcp 10.134.0.31:8000: connect: connection refused"
Apr 16 14:17:40.309395 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:40.309359 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52b392df-9754-431e-81a0-e9e419bf88d9" path="/var/lib/kubelet/pods/52b392df-9754-431e-81a0-e9e419bf88d9/volumes"
Apr 16 14:17:48.536993 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.536949 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"]
Apr 16 14:17:48.537601 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.537580 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52b392df-9754-431e-81a0-e9e419bf88d9" containerName="main"
Apr 16 14:17:48.537648 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.537606 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b392df-9754-431e-81a0-e9e419bf88d9" containerName="main"
Apr 16 14:17:48.537682 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.537652 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52b392df-9754-431e-81a0-e9e419bf88d9" containerName="storage-initializer"
Apr 16 14:17:48.537682 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.537662 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="52b392df-9754-431e-81a0-e9e419bf88d9" containerName="storage-initializer"
Apr 16 14:17:48.537767 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.537753 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="52b392df-9754-431e-81a0-e9e419bf88d9" containerName="main"
Apr 16 14:17:48.636700 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.636664 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"]
Apr 16 14:17:48.636864 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.636843 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"
Apr 16 14:17:48.639087 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.639066 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"conv-test-round-trip-kserve-self-signed-certs\""
Apr 16 14:17:48.732301 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.732269 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc42b\" (UniqueName: \"kubernetes.io/projected/f603e80d-6f0d-4b97-89fd-04da5e50db88-kube-api-access-zc42b\") pod \"conv-test-round-trip-kserve-5d48444fd8-hgbbb\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"
Apr 16 14:17:48.732473 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.732335 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f603e80d-6f0d-4b97-89fd-04da5e50db88-tls-certs\") pod \"conv-test-round-trip-kserve-5d48444fd8-hgbbb\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"
Apr 16 14:17:48.732473 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.732356 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-dshm\") pod \"conv-test-round-trip-kserve-5d48444fd8-hgbbb\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"
Apr 16 14:17:48.732558 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.732473 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-model-cache\") pod \"conv-test-round-trip-kserve-5d48444fd8-hgbbb\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"
Apr 16 14:17:48.732558 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.732526 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-kserve-provision-location\") pod \"conv-test-round-trip-kserve-5d48444fd8-hgbbb\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"
Apr 16 14:17:48.732629 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.732571 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-home\") pod \"conv-test-round-trip-kserve-5d48444fd8-hgbbb\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"
Apr 16 14:17:48.833760 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.833664 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f603e80d-6f0d-4b97-89fd-04da5e50db88-tls-certs\") pod \"conv-test-round-trip-kserve-5d48444fd8-hgbbb\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"
Apr 16 14:17:48.833760 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.833709 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-dshm\") pod \"conv-test-round-trip-kserve-5d48444fd8-hgbbb\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"
Apr 16 14:17:48.833963 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.833763 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-model-cache\") pod \"conv-test-round-trip-kserve-5d48444fd8-hgbbb\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"
Apr 16 14:17:48.833963 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.833817 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-kserve-provision-location\") pod \"conv-test-round-trip-kserve-5d48444fd8-hgbbb\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"
Apr 16 14:17:48.834065 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.833956 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-home\") pod \"conv-test-round-trip-kserve-5d48444fd8-hgbbb\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"
Apr 16 14:17:48.834065 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.834013 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zc42b\" (UniqueName: \"kubernetes.io/projected/f603e80d-6f0d-4b97-89fd-04da5e50db88-kube-api-access-zc42b\") pod \"conv-test-round-trip-kserve-5d48444fd8-hgbbb\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"
Apr 16 14:17:48.834300 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.834272 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-model-cache\") pod \"conv-test-round-trip-kserve-5d48444fd8-hgbbb\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"
Apr 16 14:17:48.834300 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.834288 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-kserve-provision-location\") pod \"conv-test-round-trip-kserve-5d48444fd8-hgbbb\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"
Apr 16 14:17:48.834490 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.834363 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-home\") pod \"conv-test-round-trip-kserve-5d48444fd8-hgbbb\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"
Apr 16 14:17:48.836259 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.836232 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-dshm\") pod \"conv-test-round-trip-kserve-5d48444fd8-hgbbb\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"
Apr 16 14:17:48.836864 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.836848 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f603e80d-6f0d-4b97-89fd-04da5e50db88-tls-certs\") pod \"conv-test-round-trip-kserve-5d48444fd8-hgbbb\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"
Apr 16 14:17:48.841870 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.841849 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc42b\" (UniqueName: \"kubernetes.io/projected/f603e80d-6f0d-4b97-89fd-04da5e50db88-kube-api-access-zc42b\") pod \"conv-test-round-trip-kserve-5d48444fd8-hgbbb\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"
Apr 16 14:17:48.948015 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:48.947975 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"
Apr 16 14:17:49.098657 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:49.098581 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"]
Apr 16 14:17:49.102788 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:17:49.102760 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf603e80d_6f0d_4b97_89fd_04da5e50db88.slice/crio-80f9bc5834916f1a9dd4cd46f58fe2ef2bb4cd8ec5ab84104672be514eff9207 WatchSource:0}: Error finding container 80f9bc5834916f1a9dd4cd46f58fe2ef2bb4cd8ec5ab84104672be514eff9207: Status 404 returned error can't find the container with id 80f9bc5834916f1a9dd4cd46f58fe2ef2bb4cd8ec5ab84104672be514eff9207
Apr 16 14:17:49.738917 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:49.738866 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" podUID="d3467049-6924-4132-8a62-5171aa6ce551" containerName="main" probeResult="failure" output="Get \"https://10.134.0.31:8000/health\": dial tcp 10.134.0.31:8000: connect: connection refused"
Apr 16 14:17:49.996983 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:49.996897 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb" event={"ID":"f603e80d-6f0d-4b97-89fd-04da5e50db88","Type":"ContainerStarted","Data":"c1ab5a24c848833cdd05162f89a49a01730dd5875cd471db86cd2b38524a91c8"}
Apr 16 14:17:49.996983 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:49.996937 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb" event={"ID":"f603e80d-6f0d-4b97-89fd-04da5e50db88","Type":"ContainerStarted","Data":"80f9bc5834916f1a9dd4cd46f58fe2ef2bb4cd8ec5ab84104672be514eff9207"}
Apr 16 14:17:51.427940 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.427881 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"]
Apr 16 14:17:51.474080 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.474021 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"]
Apr 16 14:17:51.474080 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.474059 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:17:51.476420 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.476395 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 16 14:17:51.561341 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.561276 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-model-cache\") pod \"stop-feature-test-kserve-85cf5c465f-l4dnl\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:17:51.561341 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.561321 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-kserve-provision-location\") pod \"stop-feature-test-kserve-85cf5c465f-l4dnl\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:17:51.561341 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.561343 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwmcz\" (UniqueName: \"kubernetes.io/projected/5b44e880-c138-4d91-8466-f3d0706d5171-kube-api-access-zwmcz\") pod \"stop-feature-test-kserve-85cf5c465f-l4dnl\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:17:51.561790 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.561488 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-dshm\") pod \"stop-feature-test-kserve-85cf5c465f-l4dnl\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:17:51.561790 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.561521 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5b44e880-c138-4d91-8466-f3d0706d5171-tls-certs\") pod \"stop-feature-test-kserve-85cf5c465f-l4dnl\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:17:51.561790 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.561594 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-home\") pod \"stop-feature-test-kserve-85cf5c465f-l4dnl\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:17:51.662858 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.662812 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-dshm\") pod \"stop-feature-test-kserve-85cf5c465f-l4dnl\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:17:51.662858 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.662860 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5b44e880-c138-4d91-8466-f3d0706d5171-tls-certs\") pod \"stop-feature-test-kserve-85cf5c465f-l4dnl\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:17:51.663119 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.662893 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-home\") pod \"stop-feature-test-kserve-85cf5c465f-l4dnl\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:17:51.663119 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.662927 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-model-cache\") pod \"stop-feature-test-kserve-85cf5c465f-l4dnl\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:17:51.663119 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.662944 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-kserve-provision-location\") pod \"stop-feature-test-kserve-85cf5c465f-l4dnl\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:17:51.663119 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.662971 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zwmcz\" (UniqueName: \"kubernetes.io/projected/5b44e880-c138-4d91-8466-f3d0706d5171-kube-api-access-zwmcz\") pod \"stop-feature-test-kserve-85cf5c465f-l4dnl\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:17:51.663469 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.663408 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-model-cache\") pod \"stop-feature-test-kserve-85cf5c465f-l4dnl\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:17:51.663469 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.663406 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-home\") pod \"stop-feature-test-kserve-85cf5c465f-l4dnl\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:17:51.663469 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.663459 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-kserve-provision-location\") pod \"stop-feature-test-kserve-85cf5c465f-l4dnl\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:17:51.665883 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.665851 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-dshm\") pod \"stop-feature-test-kserve-85cf5c465f-l4dnl\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:17:51.665883 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.665870 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5b44e880-c138-4d91-8466-f3d0706d5171-tls-certs\") pod \"stop-feature-test-kserve-85cf5c465f-l4dnl\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:17:51.673385 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.673361 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwmcz\" (UniqueName: \"kubernetes.io/projected/5b44e880-c138-4d91-8466-f3d0706d5171-kube-api-access-zwmcz\") pod \"stop-feature-test-kserve-85cf5c465f-l4dnl\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:17:51.791494 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.791390 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:17:51.954254 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:51.954196 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"]
Apr 16 14:17:51.958223 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:17:51.958191 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b44e880_c138_4d91_8466_f3d0706d5171.slice/crio-d4261ce25d810e0632a5051c693e5c8462939cbb4d2e321dd3a915db7a18b591 WatchSource:0}: Error finding container d4261ce25d810e0632a5051c693e5c8462939cbb4d2e321dd3a915db7a18b591: Status 404 returned error can't find the container with id d4261ce25d810e0632a5051c693e5c8462939cbb4d2e321dd3a915db7a18b591
Apr 16 14:17:52.005341 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:52.005307 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl" event={"ID":"5b44e880-c138-4d91-8466-f3d0706d5171","Type":"ContainerStarted","Data":"d4261ce25d810e0632a5051c693e5c8462939cbb4d2e321dd3a915db7a18b591"}
Apr 16 14:17:53.013978 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:53.013935 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl" event={"ID":"5b44e880-c138-4d91-8466-f3d0706d5171","Type":"ContainerStarted","Data":"f1b6be4fd1d00fcf8982e18453ac5b6f3b189c71e2acd40a829816c7ef687e82"}
Apr 16 14:17:54.019924 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:54.019887 2570 generic.go:358] "Generic (PLEG): container finished" podID="f603e80d-6f0d-4b97-89fd-04da5e50db88" containerID="c1ab5a24c848833cdd05162f89a49a01730dd5875cd471db86cd2b38524a91c8" exitCode=0
Apr 16 14:17:54.020303 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:54.019967 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb" event={"ID":"f603e80d-6f0d-4b97-89fd-04da5e50db88","Type":"ContainerDied","Data":"c1ab5a24c848833cdd05162f89a49a01730dd5875cd471db86cd2b38524a91c8"}
Apr 16 14:17:54.265559 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:54.265454 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"]
Apr 16 14:17:55.036971 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:55.036893 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb" event={"ID":"f603e80d-6f0d-4b97-89fd-04da5e50db88","Type":"ContainerStarted","Data":"af50df52d118d253f4b01b9e9e20ab9629919a05a97a66d30e98512a46db37c1"}
Apr 16 14:17:55.037466 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:55.037227 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb" podUID="f603e80d-6f0d-4b97-89fd-04da5e50db88" containerName="main" containerID="cri-o://af50df52d118d253f4b01b9e9e20ab9629919a05a97a66d30e98512a46db37c1" gracePeriod=30
Apr 16 14:17:55.060133 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:55.060060 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb" podStartSLOduration=7.060040706 podStartE2EDuration="7.060040706s" podCreationTimestamp="2026-04-16 14:17:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:17:55.055770209 +0000 UTC m=+1101.261413386" watchObservedRunningTime="2026-04-16 14:17:55.060040706 +0000 UTC m=+1101.265684039"
Apr 16 14:17:57.046716 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:57.046675 2570 generic.go:358] "Generic (PLEG): container finished" podID="5b44e880-c138-4d91-8466-f3d0706d5171" containerID="f1b6be4fd1d00fcf8982e18453ac5b6f3b189c71e2acd40a829816c7ef687e82" exitCode=0
Apr 16 14:17:57.047084 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:57.046753 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl" event={"ID":"5b44e880-c138-4d91-8466-f3d0706d5171","Type":"ContainerDied","Data":"f1b6be4fd1d00fcf8982e18453ac5b6f3b189c71e2acd40a829816c7ef687e82"}
Apr 16 14:17:58.053008 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:58.052969 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl" event={"ID":"5b44e880-c138-4d91-8466-f3d0706d5171","Type":"ContainerStarted","Data":"83eb8eb6ebe9d95b2d432645f85541cf1d47d4455833f4dd5e5580c4ce91f1d2"}
Apr 16 14:17:58.073753 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:58.073691 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl" podStartSLOduration=7.073673585 podStartE2EDuration="7.073673585s" podCreationTimestamp="2026-04-16 14:17:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:17:58.072359719 +0000 UTC m=+1104.278002883" watchObservedRunningTime="2026-04-16 14:17:58.073673585 +0000 UTC m=+1104.279316749"
Apr 16 14:17:58.948413 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:58.948374 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"
Apr 16 14:17:59.739573 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:17:59.739492 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" podUID="d3467049-6924-4132-8a62-5171aa6ce551" containerName="main" probeResult="failure" output="Get \"https://10.134.0.31:8000/health\": dial tcp 10.134.0.31:8000: connect: connection refused"
Apr 16 14:18:01.792419 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:01.792376 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:18:01.792419 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:01.792428 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:18:01.793957 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:01.793924 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl" podUID="5b44e880-c138-4d91-8466-f3d0706d5171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.34:8000/health\": dial tcp 10.134.0.34:8000: connect: connection refused"
Apr 16 14:18:09.739084 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:09.739034 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" podUID="d3467049-6924-4132-8a62-5171aa6ce551" containerName="main" probeResult="failure" output="Get \"https://10.134.0.31:8000/health\": dial tcp 10.134.0.31:8000: connect: connection refused"
Apr 16 14:18:11.792291 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:11.792239 2570
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl" podUID="5b44e880-c138-4d91-8466-f3d0706d5171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.34:8000/health\": dial tcp 10.134.0.34:8000: connect: connection refused" Apr 16 14:18:19.739198 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:19.739147 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" podUID="d3467049-6924-4132-8a62-5171aa6ce551" containerName="main" probeResult="failure" output="Get \"https://10.134.0.31:8000/health\": dial tcp 10.134.0.31:8000: connect: connection refused" Apr 16 14:18:21.792198 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:21.792147 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl" podUID="5b44e880-c138-4d91-8466-f3d0706d5171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.34:8000/health\": dial tcp 10.134.0.34:8000: connect: connection refused" Apr 16 14:18:25.163640 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.163605 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-5d48444fd8-hgbbb_f603e80d-6f0d-4b97-89fd-04da5e50db88/main/0.log" Apr 16 14:18:25.164073 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.164013 2570 generic.go:358] "Generic (PLEG): container finished" podID="f603e80d-6f0d-4b97-89fd-04da5e50db88" containerID="af50df52d118d253f4b01b9e9e20ab9629919a05a97a66d30e98512a46db37c1" exitCode=137 Apr 16 14:18:25.164223 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.164082 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb" 
event={"ID":"f603e80d-6f0d-4b97-89fd-04da5e50db88","Type":"ContainerDied","Data":"af50df52d118d253f4b01b9e9e20ab9629919a05a97a66d30e98512a46db37c1"} Apr 16 14:18:25.268366 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.268347 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-5d48444fd8-hgbbb_f603e80d-6f0d-4b97-89fd-04da5e50db88/main/0.log" Apr 16 14:18:25.268811 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.268792 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb" Apr 16 14:18:25.429204 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.429171 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-kserve-provision-location\") pod \"f603e80d-6f0d-4b97-89fd-04da5e50db88\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " Apr 16 14:18:25.429371 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.429238 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc42b\" (UniqueName: \"kubernetes.io/projected/f603e80d-6f0d-4b97-89fd-04da5e50db88-kube-api-access-zc42b\") pod \"f603e80d-6f0d-4b97-89fd-04da5e50db88\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " Apr 16 14:18:25.429371 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.429261 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-model-cache\") pod \"f603e80d-6f0d-4b97-89fd-04da5e50db88\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " Apr 16 14:18:25.429371 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.429346 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/f603e80d-6f0d-4b97-89fd-04da5e50db88-tls-certs\") pod \"f603e80d-6f0d-4b97-89fd-04da5e50db88\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " Apr 16 14:18:25.429549 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.429384 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-home\") pod \"f603e80d-6f0d-4b97-89fd-04da5e50db88\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " Apr 16 14:18:25.429549 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.429409 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-dshm\") pod \"f603e80d-6f0d-4b97-89fd-04da5e50db88\" (UID: \"f603e80d-6f0d-4b97-89fd-04da5e50db88\") " Apr 16 14:18:25.429946 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.429664 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-home" (OuterVolumeSpecName: "home") pod "f603e80d-6f0d-4b97-89fd-04da5e50db88" (UID: "f603e80d-6f0d-4b97-89fd-04da5e50db88"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:18:25.429946 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.429789 2570 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-home\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:18:25.429946 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.429889 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-model-cache" (OuterVolumeSpecName: "model-cache") pod "f603e80d-6f0d-4b97-89fd-04da5e50db88" (UID: "f603e80d-6f0d-4b97-89fd-04da5e50db88"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:18:25.431797 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.431770 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f603e80d-6f0d-4b97-89fd-04da5e50db88-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f603e80d-6f0d-4b97-89fd-04da5e50db88" (UID: "f603e80d-6f0d-4b97-89fd-04da5e50db88"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:18:25.432220 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.432200 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f603e80d-6f0d-4b97-89fd-04da5e50db88-kube-api-access-zc42b" (OuterVolumeSpecName: "kube-api-access-zc42b") pod "f603e80d-6f0d-4b97-89fd-04da5e50db88" (UID: "f603e80d-6f0d-4b97-89fd-04da5e50db88"). InnerVolumeSpecName "kube-api-access-zc42b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:18:25.432283 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.432258 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-dshm" (OuterVolumeSpecName: "dshm") pod "f603e80d-6f0d-4b97-89fd-04da5e50db88" (UID: "f603e80d-6f0d-4b97-89fd-04da5e50db88"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:18:25.491457 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.491421 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f603e80d-6f0d-4b97-89fd-04da5e50db88" (UID: "f603e80d-6f0d-4b97-89fd-04da5e50db88"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:18:25.530648 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.530612 2570 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f603e80d-6f0d-4b97-89fd-04da5e50db88-tls-certs\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:18:25.530648 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.530640 2570 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-dshm\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:18:25.530648 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.530650 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-kserve-provision-location\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:18:25.530648 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.530659 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zc42b\" (UniqueName: \"kubernetes.io/projected/f603e80d-6f0d-4b97-89fd-04da5e50db88-kube-api-access-zc42b\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:18:25.530929 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:25.530669 2570 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f603e80d-6f0d-4b97-89fd-04da5e50db88-model-cache\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:18:26.168727 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:26.168697 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-round-trip-kserve-5d48444fd8-hgbbb_f603e80d-6f0d-4b97-89fd-04da5e50db88/main/0.log" Apr 16 14:18:26.169189 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:26.169154 2570 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb" Apr 16 14:18:26.169255 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:26.169179 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb" event={"ID":"f603e80d-6f0d-4b97-89fd-04da5e50db88","Type":"ContainerDied","Data":"80f9bc5834916f1a9dd4cd46f58fe2ef2bb4cd8ec5ab84104672be514eff9207"} Apr 16 14:18:26.169255 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:26.169234 2570 scope.go:117] "RemoveContainer" containerID="af50df52d118d253f4b01b9e9e20ab9629919a05a97a66d30e98512a46db37c1" Apr 16 14:18:26.181405 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:26.181385 2570 scope.go:117] "RemoveContainer" containerID="c1ab5a24c848833cdd05162f89a49a01730dd5875cd471db86cd2b38524a91c8" Apr 16 14:18:26.198910 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:26.198882 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"] Apr 16 14:18:26.202205 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:26.202185 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/conv-test-round-trip-kserve-5d48444fd8-hgbbb"] Apr 16 14:18:26.312342 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:26.312303 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f603e80d-6f0d-4b97-89fd-04da5e50db88" path="/var/lib/kubelet/pods/f603e80d-6f0d-4b97-89fd-04da5e50db88/volumes" Apr 16 14:18:29.738941 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:29.738905 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" podUID="d3467049-6924-4132-8a62-5171aa6ce551" containerName="main" probeResult="failure" output="Get \"https://10.134.0.31:8000/health\": dial tcp 10.134.0.31:8000: connect: connection refused" Apr 16 14:18:31.792056 
ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:31.792004 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl" podUID="5b44e880-c138-4d91-8466-f3d0706d5171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.34:8000/health\": dial tcp 10.134.0.34:8000: connect: connection refused" Apr 16 14:18:39.748772 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:39.748739 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:18:39.756803 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:39.756778 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:18:41.792083 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:41.792039 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl" podUID="5b44e880-c138-4d91-8466-f3d0706d5171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.34:8000/health\": dial tcp 10.134.0.34:8000: connect: connection refused" Apr 16 14:18:45.550763 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:45.550726 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2"] Apr 16 14:18:45.551232 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:45.551095 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" podUID="d3467049-6924-4132-8a62-5171aa6ce551" containerName="main" containerID="cri-o://b165e48a26a2cc423fae8f66da83ef1b09e72b77ac1c0bc6eaf28cbd2065a280" gracePeriod=30 Apr 16 14:18:51.792576 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:51.792453 2570 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl" podUID="5b44e880-c138-4d91-8466-f3d0706d5171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.34:8000/health\": dial tcp 10.134.0.34:8000: connect: connection refused" Apr 16 14:18:54.935761 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:54.935730 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4"] Apr 16 14:18:54.936148 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:54.936096 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f603e80d-6f0d-4b97-89fd-04da5e50db88" containerName="storage-initializer" Apr 16 14:18:54.936148 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:54.936107 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="f603e80d-6f0d-4b97-89fd-04da5e50db88" containerName="storage-initializer" Apr 16 14:18:54.936148 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:54.936126 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f603e80d-6f0d-4b97-89fd-04da5e50db88" containerName="main" Apr 16 14:18:54.936148 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:54.936131 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="f603e80d-6f0d-4b97-89fd-04da5e50db88" containerName="main" Apr 16 14:18:54.936296 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:54.936193 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="f603e80d-6f0d-4b97-89fd-04da5e50db88" containerName="main" Apr 16 14:18:54.938172 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:54.938155 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" Apr 16 14:18:54.940493 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:54.940453 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 16 14:18:54.949503 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:54.949474 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4"] Apr 16 14:18:55.009035 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:55.008999 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-model-cache\") pod \"custom-route-timeout-test-kserve-7f6bcc656c-fqcz4\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" Apr 16 14:18:55.009180 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:55.009051 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-dshm\") pod \"custom-route-timeout-test-kserve-7f6bcc656c-fqcz4\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" Apr 16 14:18:55.009180 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:55.009076 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-home\") pod \"custom-route-timeout-test-kserve-7f6bcc656c-fqcz4\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" Apr 16 14:18:55.009180 ip-10-0-133-133 kubenswrapper[2570]: 
I0416 14:18:55.009123 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-7f6bcc656c-fqcz4\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" Apr 16 14:18:55.009180 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:55.009151 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a2384001-9a49-470f-bd91-e496110e5891-tls-certs\") pod \"custom-route-timeout-test-kserve-7f6bcc656c-fqcz4\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" Apr 16 14:18:55.009333 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:55.009200 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msn4w\" (UniqueName: \"kubernetes.io/projected/a2384001-9a49-470f-bd91-e496110e5891-kube-api-access-msn4w\") pod \"custom-route-timeout-test-kserve-7f6bcc656c-fqcz4\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" Apr 16 14:18:55.110012 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:55.109977 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-model-cache\") pod \"custom-route-timeout-test-kserve-7f6bcc656c-fqcz4\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" Apr 16 14:18:55.110168 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:55.110048 2570 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-dshm\") pod \"custom-route-timeout-test-kserve-7f6bcc656c-fqcz4\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" Apr 16 14:18:55.110168 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:55.110066 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-home\") pod \"custom-route-timeout-test-kserve-7f6bcc656c-fqcz4\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" Apr 16 14:18:55.110168 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:55.110084 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-7f6bcc656c-fqcz4\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" Apr 16 14:18:55.110168 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:55.110110 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a2384001-9a49-470f-bd91-e496110e5891-tls-certs\") pod \"custom-route-timeout-test-kserve-7f6bcc656c-fqcz4\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" Apr 16 14:18:55.110376 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:55.110169 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-msn4w\" (UniqueName: \"kubernetes.io/projected/a2384001-9a49-470f-bd91-e496110e5891-kube-api-access-msn4w\") pod 
\"custom-route-timeout-test-kserve-7f6bcc656c-fqcz4\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" Apr 16 14:18:55.110441 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:55.110417 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-model-cache\") pod \"custom-route-timeout-test-kserve-7f6bcc656c-fqcz4\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" Apr 16 14:18:55.110502 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:55.110482 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-7f6bcc656c-fqcz4\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" Apr 16 14:18:55.110579 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:55.110519 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-home\") pod \"custom-route-timeout-test-kserve-7f6bcc656c-fqcz4\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" Apr 16 14:18:55.112621 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:55.112590 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-dshm\") pod \"custom-route-timeout-test-kserve-7f6bcc656c-fqcz4\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" Apr 16 14:18:55.112816 ip-10-0-133-133 
kubenswrapper[2570]: I0416 14:18:55.112797 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a2384001-9a49-470f-bd91-e496110e5891-tls-certs\") pod \"custom-route-timeout-test-kserve-7f6bcc656c-fqcz4\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" Apr 16 14:18:55.119396 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:55.119373 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-msn4w\" (UniqueName: \"kubernetes.io/projected/a2384001-9a49-470f-bd91-e496110e5891-kube-api-access-msn4w\") pod \"custom-route-timeout-test-kserve-7f6bcc656c-fqcz4\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" Apr 16 14:18:55.255955 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:55.255884 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" Apr 16 14:18:55.398065 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:55.397972 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4"] Apr 16 14:18:55.400842 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:18:55.400815 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2384001_9a49_470f_bd91_e496110e5891.slice/crio-e012c9558655ee458c8c37dee4954301a625f0cb2530821ac651d575c484dcb0 WatchSource:0}: Error finding container e012c9558655ee458c8c37dee4954301a625f0cb2530821ac651d575c484dcb0: Status 404 returned error can't find the container with id e012c9558655ee458c8c37dee4954301a625f0cb2530821ac651d575c484dcb0 Apr 16 14:18:56.284143 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:56.284106 2570 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" event={"ID":"a2384001-9a49-470f-bd91-e496110e5891","Type":"ContainerStarted","Data":"8b905ef66a82b713f36f31963fb64cb126222107f5f631a03fa50b2baa1fc5a1"} Apr 16 14:18:56.284582 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:18:56.284150 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" event={"ID":"a2384001-9a49-470f-bd91-e496110e5891","Type":"ContainerStarted","Data":"e012c9558655ee458c8c37dee4954301a625f0cb2530821ac651d575c484dcb0"} Apr 16 14:19:00.301126 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:00.301092 2570 generic.go:358] "Generic (PLEG): container finished" podID="a2384001-9a49-470f-bd91-e496110e5891" containerID="8b905ef66a82b713f36f31963fb64cb126222107f5f631a03fa50b2baa1fc5a1" exitCode=0 Apr 16 14:19:00.301502 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:00.301172 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" event={"ID":"a2384001-9a49-470f-bd91-e496110e5891","Type":"ContainerDied","Data":"8b905ef66a82b713f36f31963fb64cb126222107f5f631a03fa50b2baa1fc5a1"} Apr 16 14:19:01.307151 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:01.307116 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" event={"ID":"a2384001-9a49-470f-bd91-e496110e5891","Type":"ContainerStarted","Data":"fb5e5991f349bcc3b8515322f96fbdfb355ae17d800871e4c74f2f4d36070220"} Apr 16 14:19:01.329959 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:01.329898 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" podStartSLOduration=7.329876187 podStartE2EDuration="7.329876187s" podCreationTimestamp="2026-04-16 14:18:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:19:01.326373196 +0000 UTC m=+1167.532016359" watchObservedRunningTime="2026-04-16 14:19:01.329876187 +0000 UTC m=+1167.535519352" Apr 16 14:19:01.792128 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:01.792083 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl" podUID="5b44e880-c138-4d91-8466-f3d0706d5171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.34:8000/health\": dial tcp 10.134.0.34:8000: connect: connection refused" Apr 16 14:19:05.256681 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:05.256644 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" Apr 16 14:19:05.257175 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:05.256984 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" Apr 16 14:19:05.258527 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:05.258501 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" podUID="a2384001-9a49-470f-bd91-e496110e5891" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 16 14:19:11.792797 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:11.792750 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl" podUID="5b44e880-c138-4d91-8466-f3d0706d5171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.34:8000/health\": dial tcp 10.134.0.34:8000: connect: connection refused" Apr 16 14:19:15.256556 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:15.256489 2570 prober.go:120] "Probe 
failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" podUID="a2384001-9a49-470f-bd91-e496110e5891" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 16 14:19:15.956053 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:15.956027 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2_d3467049-6924-4132-8a62-5171aa6ce551/main/0.log" Apr 16 14:19:15.956458 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:15.956440 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:19:16.114872 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.114778 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-kserve-provision-location\") pod \"d3467049-6924-4132-8a62-5171aa6ce551\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " Apr 16 14:19:16.114872 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.114834 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-model-cache\") pod \"d3467049-6924-4132-8a62-5171aa6ce551\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " Apr 16 14:19:16.115077 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.114925 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d3467049-6924-4132-8a62-5171aa6ce551-tls-certs\") pod \"d3467049-6924-4132-8a62-5171aa6ce551\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " Apr 16 14:19:16.115077 ip-10-0-133-133 
kubenswrapper[2570]: I0416 14:19:16.114985 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-home\") pod \"d3467049-6924-4132-8a62-5171aa6ce551\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " Apr 16 14:19:16.115077 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.115027 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sgglh\" (UniqueName: \"kubernetes.io/projected/d3467049-6924-4132-8a62-5171aa6ce551-kube-api-access-sgglh\") pod \"d3467049-6924-4132-8a62-5171aa6ce551\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " Apr 16 14:19:16.115077 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.115063 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-dshm\") pod \"d3467049-6924-4132-8a62-5171aa6ce551\" (UID: \"d3467049-6924-4132-8a62-5171aa6ce551\") " Apr 16 14:19:16.115285 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.115117 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-model-cache" (OuterVolumeSpecName: "model-cache") pod "d3467049-6924-4132-8a62-5171aa6ce551" (UID: "d3467049-6924-4132-8a62-5171aa6ce551"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:19:16.115388 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.115358 2570 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-model-cache\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:19:16.115472 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.115403 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-home" (OuterVolumeSpecName: "home") pod "d3467049-6924-4132-8a62-5171aa6ce551" (UID: "d3467049-6924-4132-8a62-5171aa6ce551"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:19:16.117496 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.117460 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3467049-6924-4132-8a62-5171aa6ce551-kube-api-access-sgglh" (OuterVolumeSpecName: "kube-api-access-sgglh") pod "d3467049-6924-4132-8a62-5171aa6ce551" (UID: "d3467049-6924-4132-8a62-5171aa6ce551"). InnerVolumeSpecName "kube-api-access-sgglh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:19:16.117891 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.117855 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-dshm" (OuterVolumeSpecName: "dshm") pod "d3467049-6924-4132-8a62-5171aa6ce551" (UID: "d3467049-6924-4132-8a62-5171aa6ce551"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:19:16.117891 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.117871 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d3467049-6924-4132-8a62-5171aa6ce551-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d3467049-6924-4132-8a62-5171aa6ce551" (UID: "d3467049-6924-4132-8a62-5171aa6ce551"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:19:16.172860 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.172815 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d3467049-6924-4132-8a62-5171aa6ce551" (UID: "d3467049-6924-4132-8a62-5171aa6ce551"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:19:16.216679 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.216643 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-kserve-provision-location\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:19:16.216679 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.216676 2570 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d3467049-6924-4132-8a62-5171aa6ce551-tls-certs\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:19:16.216679 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.216688 2570 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-home\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:19:16.216992 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.216698 
2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sgglh\" (UniqueName: \"kubernetes.io/projected/d3467049-6924-4132-8a62-5171aa6ce551-kube-api-access-sgglh\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:19:16.216992 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.216708 2570 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d3467049-6924-4132-8a62-5171aa6ce551-dshm\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:19:16.367308 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.367231 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2_d3467049-6924-4132-8a62-5171aa6ce551/main/0.log" Apr 16 14:19:16.367741 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.367588 2570 generic.go:358] "Generic (PLEG): container finished" podID="d3467049-6924-4132-8a62-5171aa6ce551" containerID="b165e48a26a2cc423fae8f66da83ef1b09e72b77ac1c0bc6eaf28cbd2065a280" exitCode=137 Apr 16 14:19:16.367741 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.367641 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" event={"ID":"d3467049-6924-4132-8a62-5171aa6ce551","Type":"ContainerDied","Data":"b165e48a26a2cc423fae8f66da83ef1b09e72b77ac1c0bc6eaf28cbd2065a280"} Apr 16 14:19:16.367741 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.367659 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" Apr 16 14:19:16.367741 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.367673 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2" event={"ID":"d3467049-6924-4132-8a62-5171aa6ce551","Type":"ContainerDied","Data":"784f85db7ba1f99837a66ba09e60780749b31eca2c363b1c2dcc18298c717634"} Apr 16 14:19:16.367741 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.367690 2570 scope.go:117] "RemoveContainer" containerID="b165e48a26a2cc423fae8f66da83ef1b09e72b77ac1c0bc6eaf28cbd2065a280" Apr 16 14:19:16.385930 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.385901 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2"] Apr 16 14:19:16.390648 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.390625 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-75bdbbc645dv6q2"] Apr 16 14:19:16.395658 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.395634 2570 scope.go:117] "RemoveContainer" containerID="4156b8e23fe84e1ff79cd9455cfa004ec7be828d6d101befdc67ff3481a899b2" Apr 16 14:19:16.471791 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.471757 2570 scope.go:117] "RemoveContainer" containerID="b165e48a26a2cc423fae8f66da83ef1b09e72b77ac1c0bc6eaf28cbd2065a280" Apr 16 14:19:16.472246 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:19:16.472177 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b165e48a26a2cc423fae8f66da83ef1b09e72b77ac1c0bc6eaf28cbd2065a280\": container with ID starting with b165e48a26a2cc423fae8f66da83ef1b09e72b77ac1c0bc6eaf28cbd2065a280 not found: ID does not exist" 
containerID="b165e48a26a2cc423fae8f66da83ef1b09e72b77ac1c0bc6eaf28cbd2065a280" Apr 16 14:19:16.472377 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.472230 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b165e48a26a2cc423fae8f66da83ef1b09e72b77ac1c0bc6eaf28cbd2065a280"} err="failed to get container status \"b165e48a26a2cc423fae8f66da83ef1b09e72b77ac1c0bc6eaf28cbd2065a280\": rpc error: code = NotFound desc = could not find container \"b165e48a26a2cc423fae8f66da83ef1b09e72b77ac1c0bc6eaf28cbd2065a280\": container with ID starting with b165e48a26a2cc423fae8f66da83ef1b09e72b77ac1c0bc6eaf28cbd2065a280 not found: ID does not exist" Apr 16 14:19:16.472377 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.472286 2570 scope.go:117] "RemoveContainer" containerID="4156b8e23fe84e1ff79cd9455cfa004ec7be828d6d101befdc67ff3481a899b2" Apr 16 14:19:16.472641 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:19:16.472617 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4156b8e23fe84e1ff79cd9455cfa004ec7be828d6d101befdc67ff3481a899b2\": container with ID starting with 4156b8e23fe84e1ff79cd9455cfa004ec7be828d6d101befdc67ff3481a899b2 not found: ID does not exist" containerID="4156b8e23fe84e1ff79cd9455cfa004ec7be828d6d101befdc67ff3481a899b2" Apr 16 14:19:16.472722 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:16.472650 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4156b8e23fe84e1ff79cd9455cfa004ec7be828d6d101befdc67ff3481a899b2"} err="failed to get container status \"4156b8e23fe84e1ff79cd9455cfa004ec7be828d6d101befdc67ff3481a899b2\": rpc error: code = NotFound desc = could not find container \"4156b8e23fe84e1ff79cd9455cfa004ec7be828d6d101befdc67ff3481a899b2\": container with ID starting with 4156b8e23fe84e1ff79cd9455cfa004ec7be828d6d101befdc67ff3481a899b2 not found: ID does not exist" Apr 16 
14:19:18.312238 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:18.312203 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3467049-6924-4132-8a62-5171aa6ce551" path="/var/lib/kubelet/pods/d3467049-6924-4132-8a62-5171aa6ce551/volumes" Apr 16 14:19:21.791946 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:21.791906 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl" podUID="5b44e880-c138-4d91-8466-f3d0706d5171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.34:8000/health\": dial tcp 10.134.0.34:8000: connect: connection refused" Apr 16 14:19:25.256658 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:25.256614 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" podUID="a2384001-9a49-470f-bd91-e496110e5891" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 16 14:19:31.802444 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:31.802416 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl" Apr 16 14:19:31.810658 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:31.810630 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl" Apr 16 14:19:32.714784 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:32.714754 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"] Apr 16 14:19:33.431837 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:33.431795 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl" podUID="5b44e880-c138-4d91-8466-f3d0706d5171" 
containerName="main" containerID="cri-o://83eb8eb6ebe9d95b2d432645f85541cf1d47d4455833f4dd5e5580c4ce91f1d2" gracePeriod=30 Apr 16 14:19:35.256333 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:35.256291 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" podUID="a2384001-9a49-470f-bd91-e496110e5891" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 16 14:19:45.256952 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:45.256911 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" podUID="a2384001-9a49-470f-bd91-e496110e5891" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 16 14:19:46.238552 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.238497 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d"] Apr 16 14:19:46.239113 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.239095 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3467049-6924-4132-8a62-5171aa6ce551" containerName="storage-initializer" Apr 16 14:19:46.239172 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.239117 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3467049-6924-4132-8a62-5171aa6ce551" containerName="storage-initializer" Apr 16 14:19:46.239172 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.239137 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d3467049-6924-4132-8a62-5171aa6ce551" containerName="main" Apr 16 14:19:46.239172 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.239146 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3467049-6924-4132-8a62-5171aa6ce551" 
containerName="main" Apr 16 14:19:46.239269 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.239230 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="d3467049-6924-4132-8a62-5171aa6ce551" containerName="main" Apr 16 14:19:46.242302 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.242280 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:19:46.251207 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.251166 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d"] Apr 16 14:19:46.299593 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.299566 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1957e25b-feda-40b5-ba8b-403f314bba3c-tls-certs\") pod \"stop-feature-test-kserve-85cf5c465f-2jt7d\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:19:46.299983 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.299600 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-model-cache\") pod \"stop-feature-test-kserve-85cf5c465f-2jt7d\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:19:46.299983 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.299624 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-kserve-provision-location\") pod \"stop-feature-test-kserve-85cf5c465f-2jt7d\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:19:46.299983 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.299647 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-home\") pod \"stop-feature-test-kserve-85cf5c465f-2jt7d\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:19:46.299983 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.299716 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84snm\" (UniqueName: \"kubernetes.io/projected/1957e25b-feda-40b5-ba8b-403f314bba3c-kube-api-access-84snm\") pod \"stop-feature-test-kserve-85cf5c465f-2jt7d\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:19:46.299983 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.299768 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-dshm\") pod \"stop-feature-test-kserve-85cf5c465f-2jt7d\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:19:46.401057 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.401026 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1957e25b-feda-40b5-ba8b-403f314bba3c-tls-certs\") pod \"stop-feature-test-kserve-85cf5c465f-2jt7d\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:19:46.401244 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.401071 2570 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-model-cache\") pod \"stop-feature-test-kserve-85cf5c465f-2jt7d\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:19:46.401244 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.401096 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-kserve-provision-location\") pod \"stop-feature-test-kserve-85cf5c465f-2jt7d\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:19:46.401244 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.401187 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-home\") pod \"stop-feature-test-kserve-85cf5c465f-2jt7d\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:19:46.401470 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.401287 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-84snm\" (UniqueName: \"kubernetes.io/projected/1957e25b-feda-40b5-ba8b-403f314bba3c-kube-api-access-84snm\") pod \"stop-feature-test-kserve-85cf5c465f-2jt7d\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:19:46.401470 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.401389 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-dshm\") pod \"stop-feature-test-kserve-85cf5c465f-2jt7d\" (UID: 
\"1957e25b-feda-40b5-ba8b-403f314bba3c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:19:46.401470 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.401466 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-kserve-provision-location\") pod \"stop-feature-test-kserve-85cf5c465f-2jt7d\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:19:46.401781 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.401521 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-model-cache\") pod \"stop-feature-test-kserve-85cf5c465f-2jt7d\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:19:46.401853 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.401797 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-home\") pod \"stop-feature-test-kserve-85cf5c465f-2jt7d\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:19:46.403901 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.403881 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-dshm\") pod \"stop-feature-test-kserve-85cf5c465f-2jt7d\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:19:46.404454 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.404437 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" 
(UniqueName: \"kubernetes.io/secret/1957e25b-feda-40b5-ba8b-403f314bba3c-tls-certs\") pod \"stop-feature-test-kserve-85cf5c465f-2jt7d\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:19:46.409253 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.409230 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-84snm\" (UniqueName: \"kubernetes.io/projected/1957e25b-feda-40b5-ba8b-403f314bba3c-kube-api-access-84snm\") pod \"stop-feature-test-kserve-85cf5c465f-2jt7d\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:19:46.556606 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.556502 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:19:46.699370 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:46.699345 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d"] Apr 16 14:19:46.701702 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:19:46.701676 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1957e25b_feda_40b5_ba8b_403f314bba3c.slice/crio-d1e1ee3c14c3035fa927c6acd3064698101c77e70755fbfb14b94f5e75ae0b7e WatchSource:0}: Error finding container d1e1ee3c14c3035fa927c6acd3064698101c77e70755fbfb14b94f5e75ae0b7e: Status 404 returned error can't find the container with id d1e1ee3c14c3035fa927c6acd3064698101c77e70755fbfb14b94f5e75ae0b7e Apr 16 14:19:47.481600 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:47.481565 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" 
event={"ID":"1957e25b-feda-40b5-ba8b-403f314bba3c","Type":"ContainerStarted","Data":"2c64d9b3180f7e466e9ff742f8af049b329b3e3f2a634df9399f3aa46601b52f"} Apr 16 14:19:47.482024 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:47.481606 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" event={"ID":"1957e25b-feda-40b5-ba8b-403f314bba3c","Type":"ContainerStarted","Data":"d1e1ee3c14c3035fa927c6acd3064698101c77e70755fbfb14b94f5e75ae0b7e"} Apr 16 14:19:51.497366 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:51.497336 2570 generic.go:358] "Generic (PLEG): container finished" podID="1957e25b-feda-40b5-ba8b-403f314bba3c" containerID="2c64d9b3180f7e466e9ff742f8af049b329b3e3f2a634df9399f3aa46601b52f" exitCode=0 Apr 16 14:19:51.497366 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:51.497371 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" event={"ID":"1957e25b-feda-40b5-ba8b-403f314bba3c","Type":"ContainerDied","Data":"2c64d9b3180f7e466e9ff742f8af049b329b3e3f2a634df9399f3aa46601b52f"} Apr 16 14:19:52.503056 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:52.503014 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" event={"ID":"1957e25b-feda-40b5-ba8b-403f314bba3c","Type":"ContainerStarted","Data":"7b70c6342830f02c69e0a23726ccbf42a193e362a2aefc32fdacd91adb57580b"} Apr 16 14:19:52.524449 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:52.524400 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" podStartSLOduration=6.524385958 podStartE2EDuration="6.524385958s" podCreationTimestamp="2026-04-16 14:19:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:19:52.521630266 +0000 UTC 
m=+1218.727273430" watchObservedRunningTime="2026-04-16 14:19:52.524385958 +0000 UTC m=+1218.730029121" Apr 16 14:19:55.257058 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:55.257006 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" podUID="a2384001-9a49-470f-bd91-e496110e5891" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused" Apr 16 14:19:56.557462 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:56.557424 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:19:56.557462 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:56.557468 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:19:56.559131 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:19:56.559102 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" podUID="1957e25b-feda-40b5-ba8b-403f314bba3c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused" Apr 16 14:20:03.758977 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:03.758949 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-85cf5c465f-l4dnl_5b44e880-c138-4d91-8466-f3d0706d5171/main/0.log" Apr 16 14:20:03.759419 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:03.759401 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:20:03.774581 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:03.774557 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-home\") pod \"5b44e880-c138-4d91-8466-f3d0706d5171\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") "
Apr 16 14:20:03.774697 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:03.774599 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5b44e880-c138-4d91-8466-f3d0706d5171-tls-certs\") pod \"5b44e880-c138-4d91-8466-f3d0706d5171\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") "
Apr 16 14:20:03.774697 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:03.774641 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-dshm\") pod \"5b44e880-c138-4d91-8466-f3d0706d5171\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") "
Apr 16 14:20:03.774697 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:03.774684 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zwmcz\" (UniqueName: \"kubernetes.io/projected/5b44e880-c138-4d91-8466-f3d0706d5171-kube-api-access-zwmcz\") pod \"5b44e880-c138-4d91-8466-f3d0706d5171\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") "
Apr 16 14:20:03.774851 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:03.774735 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-model-cache\") pod \"5b44e880-c138-4d91-8466-f3d0706d5171\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") "
Apr 16 14:20:03.774851 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:03.774761 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-kserve-provision-location\") pod \"5b44e880-c138-4d91-8466-f3d0706d5171\" (UID: \"5b44e880-c138-4d91-8466-f3d0706d5171\") "
Apr 16 14:20:03.774949 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:03.774929 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-home" (OuterVolumeSpecName: "home") pod "5b44e880-c138-4d91-8466-f3d0706d5171" (UID: "5b44e880-c138-4d91-8466-f3d0706d5171"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:20:03.775206 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:03.775057 2570 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-home\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:20:03.775362 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:03.775250 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-model-cache" (OuterVolumeSpecName: "model-cache") pod "5b44e880-c138-4d91-8466-f3d0706d5171" (UID: "5b44e880-c138-4d91-8466-f3d0706d5171"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:20:03.777165 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:03.777133 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b44e880-c138-4d91-8466-f3d0706d5171-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5b44e880-c138-4d91-8466-f3d0706d5171" (UID: "5b44e880-c138-4d91-8466-f3d0706d5171"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:20:03.777619 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:03.777593 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-dshm" (OuterVolumeSpecName: "dshm") pod "5b44e880-c138-4d91-8466-f3d0706d5171" (UID: "5b44e880-c138-4d91-8466-f3d0706d5171"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:20:03.777744 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:03.777648 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b44e880-c138-4d91-8466-f3d0706d5171-kube-api-access-zwmcz" (OuterVolumeSpecName: "kube-api-access-zwmcz") pod "5b44e880-c138-4d91-8466-f3d0706d5171" (UID: "5b44e880-c138-4d91-8466-f3d0706d5171"). InnerVolumeSpecName "kube-api-access-zwmcz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:20:03.854552 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:03.854479 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5b44e880-c138-4d91-8466-f3d0706d5171" (UID: "5b44e880-c138-4d91-8466-f3d0706d5171"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:20:03.876455 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:03.876415 2570 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5b44e880-c138-4d91-8466-f3d0706d5171-tls-certs\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:20:03.876455 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:03.876443 2570 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-dshm\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:20:03.876455 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:03.876452 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zwmcz\" (UniqueName: \"kubernetes.io/projected/5b44e880-c138-4d91-8466-f3d0706d5171-kube-api-access-zwmcz\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:20:03.876455 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:03.876462 2570 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-model-cache\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:20:03.876842 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:03.876473 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5b44e880-c138-4d91-8466-f3d0706d5171-kserve-provision-location\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:20:04.548271 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:04.548241 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-85cf5c465f-l4dnl_5b44e880-c138-4d91-8466-f3d0706d5171/main/0.log"
Apr 16 14:20:04.548636 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:04.548606 2570 generic.go:358] "Generic (PLEG): container finished" podID="5b44e880-c138-4d91-8466-f3d0706d5171" containerID="83eb8eb6ebe9d95b2d432645f85541cf1d47d4455833f4dd5e5580c4ce91f1d2" exitCode=137
Apr 16 14:20:04.548791 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:04.548680 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"
Apr 16 14:20:04.548791 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:04.548686 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl" event={"ID":"5b44e880-c138-4d91-8466-f3d0706d5171","Type":"ContainerDied","Data":"83eb8eb6ebe9d95b2d432645f85541cf1d47d4455833f4dd5e5580c4ce91f1d2"}
Apr 16 14:20:04.548791 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:04.548728 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl" event={"ID":"5b44e880-c138-4d91-8466-f3d0706d5171","Type":"ContainerDied","Data":"d4261ce25d810e0632a5051c693e5c8462939cbb4d2e321dd3a915db7a18b591"}
Apr 16 14:20:04.548791 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:04.548748 2570 scope.go:117] "RemoveContainer" containerID="83eb8eb6ebe9d95b2d432645f85541cf1d47d4455833f4dd5e5580c4ce91f1d2"
Apr 16 14:20:04.568116 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:04.568075 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"]
Apr 16 14:20:04.569563 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:04.569520 2570 scope.go:117] "RemoveContainer" containerID="f1b6be4fd1d00fcf8982e18453ac5b6f3b189c71e2acd40a829816c7ef687e82"
Apr 16 14:20:04.571232 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:04.571212 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-l4dnl"]
Apr 16 14:20:04.633842 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:04.633824 2570 scope.go:117] "RemoveContainer" containerID="83eb8eb6ebe9d95b2d432645f85541cf1d47d4455833f4dd5e5580c4ce91f1d2"
Apr 16 14:20:04.634185 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:20:04.634164 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83eb8eb6ebe9d95b2d432645f85541cf1d47d4455833f4dd5e5580c4ce91f1d2\": container with ID starting with 83eb8eb6ebe9d95b2d432645f85541cf1d47d4455833f4dd5e5580c4ce91f1d2 not found: ID does not exist" containerID="83eb8eb6ebe9d95b2d432645f85541cf1d47d4455833f4dd5e5580c4ce91f1d2"
Apr 16 14:20:04.634246 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:04.634194 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83eb8eb6ebe9d95b2d432645f85541cf1d47d4455833f4dd5e5580c4ce91f1d2"} err="failed to get container status \"83eb8eb6ebe9d95b2d432645f85541cf1d47d4455833f4dd5e5580c4ce91f1d2\": rpc error: code = NotFound desc = could not find container \"83eb8eb6ebe9d95b2d432645f85541cf1d47d4455833f4dd5e5580c4ce91f1d2\": container with ID starting with 83eb8eb6ebe9d95b2d432645f85541cf1d47d4455833f4dd5e5580c4ce91f1d2 not found: ID does not exist"
Apr 16 14:20:04.634246 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:04.634212 2570 scope.go:117] "RemoveContainer" containerID="f1b6be4fd1d00fcf8982e18453ac5b6f3b189c71e2acd40a829816c7ef687e82"
Apr 16 14:20:04.634483 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:20:04.634466 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1b6be4fd1d00fcf8982e18453ac5b6f3b189c71e2acd40a829816c7ef687e82\": container with ID starting with f1b6be4fd1d00fcf8982e18453ac5b6f3b189c71e2acd40a829816c7ef687e82 not found: ID does not exist" containerID="f1b6be4fd1d00fcf8982e18453ac5b6f3b189c71e2acd40a829816c7ef687e82"
Apr 16 14:20:04.634562 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:04.634488 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1b6be4fd1d00fcf8982e18453ac5b6f3b189c71e2acd40a829816c7ef687e82"} err="failed to get container status \"f1b6be4fd1d00fcf8982e18453ac5b6f3b189c71e2acd40a829816c7ef687e82\": rpc error: code = NotFound desc = could not find container \"f1b6be4fd1d00fcf8982e18453ac5b6f3b189c71e2acd40a829816c7ef687e82\": container with ID starting with f1b6be4fd1d00fcf8982e18453ac5b6f3b189c71e2acd40a829816c7ef687e82 not found: ID does not exist"
Apr 16 14:20:05.256595 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:05.256545 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" podUID="a2384001-9a49-470f-bd91-e496110e5891" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused"
Apr 16 14:20:06.309843 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:06.309813 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b44e880-c138-4d91-8466-f3d0706d5171" path="/var/lib/kubelet/pods/5b44e880-c138-4d91-8466-f3d0706d5171/volumes"
Apr 16 14:20:06.557143 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:06.557104 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" podUID="1957e25b-feda-40b5-ba8b-403f314bba3c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 16 14:20:15.257289 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:15.257226 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" podUID="a2384001-9a49-470f-bd91-e496110e5891" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused"
Apr 16 14:20:16.558131 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:16.558022 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" podUID="1957e25b-feda-40b5-ba8b-403f314bba3c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 16 14:20:25.257155 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:25.257106 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" podUID="a2384001-9a49-470f-bd91-e496110e5891" containerName="main" probeResult="failure" output="Get \"https://10.134.0.35:8000/health\": dial tcp 10.134.0.35:8000: connect: connection refused"
Apr 16 14:20:26.557769 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:26.557723 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" podUID="1957e25b-feda-40b5-ba8b-403f314bba3c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 16 14:20:35.267425 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:35.267385 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4"
Apr 16 14:20:35.275473 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:35.275442 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4"
Apr 16 14:20:36.557314 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:36.557268 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" podUID="1957e25b-feda-40b5-ba8b-403f314bba3c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 16 14:20:41.918999 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:41.918962 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4"]
Apr 16 14:20:41.919518 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:41.919330 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" podUID="a2384001-9a49-470f-bd91-e496110e5891" containerName="main" containerID="cri-o://fb5e5991f349bcc3b8515322f96fbdfb355ae17d800871e4c74f2f4d36070220" gracePeriod=30
Apr 16 14:20:46.557013 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:46.556960 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" podUID="1957e25b-feda-40b5-ba8b-403f314bba3c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 16 14:20:47.234661 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.234628 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"]
Apr 16 14:20:47.235007 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.234990 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b44e880-c138-4d91-8466-f3d0706d5171" containerName="main"
Apr 16 14:20:47.235105 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.235009 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b44e880-c138-4d91-8466-f3d0706d5171" containerName="main"
Apr 16 14:20:47.235105 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.235025 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5b44e880-c138-4d91-8466-f3d0706d5171" containerName="storage-initializer"
Apr 16 14:20:47.235105 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.235031 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b44e880-c138-4d91-8466-f3d0706d5171" containerName="storage-initializer"
Apr 16 14:20:47.235216 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.235124 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="5b44e880-c138-4d91-8466-f3d0706d5171" containerName="main"
Apr 16 14:20:47.238658 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.238637 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:20:47.240824 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.240804 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\""
Apr 16 14:20:47.247635 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.247366 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"]
Apr 16 14:20:47.294441 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.294405 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-model-cache\") pod \"router-with-refs-test-kserve-558b4c5bc9-sshfb\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:20:47.294441 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.294442 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-home\") pod \"router-with-refs-test-kserve-558b4c5bc9-sshfb\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:20:47.294655 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.294467 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ppmb\" (UniqueName: \"kubernetes.io/projected/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-kube-api-access-5ppmb\") pod \"router-with-refs-test-kserve-558b4c5bc9-sshfb\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:20:47.294655 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.294571 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-dshm\") pod \"router-with-refs-test-kserve-558b4c5bc9-sshfb\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:20:47.294655 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.294630 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-kserve-provision-location\") pod \"router-with-refs-test-kserve-558b4c5bc9-sshfb\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:20:47.294810 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.294677 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-tls-certs\") pod \"router-with-refs-test-kserve-558b4c5bc9-sshfb\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:20:47.395878 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.395841 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-kserve-provision-location\") pod \"router-with-refs-test-kserve-558b4c5bc9-sshfb\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:20:47.396075 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.395901 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-tls-certs\") pod \"router-with-refs-test-kserve-558b4c5bc9-sshfb\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:20:47.396075 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.395953 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-model-cache\") pod \"router-with-refs-test-kserve-558b4c5bc9-sshfb\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:20:47.396075 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.395971 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-home\") pod \"router-with-refs-test-kserve-558b4c5bc9-sshfb\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:20:47.396075 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.395989 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5ppmb\" (UniqueName: \"kubernetes.io/projected/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-kube-api-access-5ppmb\") pod \"router-with-refs-test-kserve-558b4c5bc9-sshfb\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:20:47.396075 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.396013 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-dshm\") pod \"router-with-refs-test-kserve-558b4c5bc9-sshfb\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:20:47.396345 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.396319 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-kserve-provision-location\") pod \"router-with-refs-test-kserve-558b4c5bc9-sshfb\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:20:47.396413 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.396358 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-home\") pod \"router-with-refs-test-kserve-558b4c5bc9-sshfb\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:20:47.396763 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.396683 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-model-cache\") pod \"router-with-refs-test-kserve-558b4c5bc9-sshfb\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:20:47.398560 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.398515 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-dshm\") pod \"router-with-refs-test-kserve-558b4c5bc9-sshfb\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:20:47.398795 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.398774 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-tls-certs\") pod \"router-with-refs-test-kserve-558b4c5bc9-sshfb\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:20:47.404686 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.404666 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ppmb\" (UniqueName: \"kubernetes.io/projected/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-kube-api-access-5ppmb\") pod \"router-with-refs-test-kserve-558b4c5bc9-sshfb\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:20:47.552518 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.552430 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:20:47.686634 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.686572 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"]
Apr 16 14:20:47.690871 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:20:47.690841 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod587e56a3_9a65_42c1_a3f3_43f1172aaf6f.slice/crio-f74fce501e42ab05fc9f093e443df740119f884add05227a0fbc7a5b27e8b726 WatchSource:0}: Error finding container f74fce501e42ab05fc9f093e443df740119f884add05227a0fbc7a5b27e8b726: Status 404 returned error can't find the container with id f74fce501e42ab05fc9f093e443df740119f884add05227a0fbc7a5b27e8b726
Apr 16 14:20:47.719714 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:47.719685 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb" event={"ID":"587e56a3-9a65-42c1-a3f3-43f1172aaf6f","Type":"ContainerStarted","Data":"f74fce501e42ab05fc9f093e443df740119f884add05227a0fbc7a5b27e8b726"}
Apr 16 14:20:48.725360 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:48.725320 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb" event={"ID":"587e56a3-9a65-42c1-a3f3-43f1172aaf6f","Type":"ContainerStarted","Data":"94b48f9423209b36d7427ac8bc9c3b8bd3ad2776d1cd2159c5df316bd544ae64"}
Apr 16 14:20:51.739133 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:51.739088 2570 generic.go:358] "Generic (PLEG): container finished" podID="587e56a3-9a65-42c1-a3f3-43f1172aaf6f" containerID="94b48f9423209b36d7427ac8bc9c3b8bd3ad2776d1cd2159c5df316bd544ae64" exitCode=0
Apr 16 14:20:51.739133 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:51.739133 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb" event={"ID":"587e56a3-9a65-42c1-a3f3-43f1172aaf6f","Type":"ContainerDied","Data":"94b48f9423209b36d7427ac8bc9c3b8bd3ad2776d1cd2159c5df316bd544ae64"}
Apr 16 14:20:52.745585 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:52.745551 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb" event={"ID":"587e56a3-9a65-42c1-a3f3-43f1172aaf6f","Type":"ContainerStarted","Data":"049fd2adbdc5c33d82c9bb73e1fcc03fb1ba11275c0d73dc623c40493d26f4fd"}
Apr 16 14:20:52.765441 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:52.765387 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb" podStartSLOduration=5.765373951 podStartE2EDuration="5.765373951s" podCreationTimestamp="2026-04-16 14:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:20:52.763850065 +0000 UTC m=+1278.969493228" watchObservedRunningTime="2026-04-16 14:20:52.765373951 +0000 UTC m=+1278.971017114"
Apr 16 14:20:56.557407 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:56.557351 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" podUID="1957e25b-feda-40b5-ba8b-403f314bba3c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 16 14:20:57.553493 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:57.553457 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:20:57.553493 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:57.553502 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:20:57.554897 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:20:57.554863 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb" podUID="587e56a3-9a65-42c1-a3f3-43f1172aaf6f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused"
Apr 16 14:21:06.557760 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:06.557706 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" podUID="1957e25b-feda-40b5-ba8b-403f314bba3c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused"
Apr 16 14:21:07.552987 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:07.552943 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb" podUID="587e56a3-9a65-42c1-a3f3-43f1172aaf6f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused"
Apr 16 14:21:12.266877 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.266777 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-7f6bcc656c-fqcz4_a2384001-9a49-470f-bd91-e496110e5891/main/0.log"
Apr 16 14:21:12.268134 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.267734 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4"
Apr 16 14:21:12.448349 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.448295 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-model-cache\") pod \"a2384001-9a49-470f-bd91-e496110e5891\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") "
Apr 16 14:21:12.448607 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.448361 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-kserve-provision-location\") pod \"a2384001-9a49-470f-bd91-e496110e5891\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") "
Apr 16 14:21:12.448607 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.448413 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-home\") pod \"a2384001-9a49-470f-bd91-e496110e5891\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") "
Apr 16 14:21:12.448607 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.448442 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-dshm\") pod \"a2384001-9a49-470f-bd91-e496110e5891\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") "
Apr 16 14:21:12.448607 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.448593 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a2384001-9a49-470f-bd91-e496110e5891-tls-certs\") pod \"a2384001-9a49-470f-bd91-e496110e5891\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") "
Apr 16 14:21:12.448849 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.448626 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msn4w\" (UniqueName: \"kubernetes.io/projected/a2384001-9a49-470f-bd91-e496110e5891-kube-api-access-msn4w\") pod \"a2384001-9a49-470f-bd91-e496110e5891\" (UID: \"a2384001-9a49-470f-bd91-e496110e5891\") "
Apr 16 14:21:12.449867 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.449830 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-home" (OuterVolumeSpecName: "home") pod "a2384001-9a49-470f-bd91-e496110e5891" (UID: "a2384001-9a49-470f-bd91-e496110e5891"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:21:12.449983 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.449927 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-model-cache" (OuterVolumeSpecName: "model-cache") pod "a2384001-9a49-470f-bd91-e496110e5891" (UID: "a2384001-9a49-470f-bd91-e496110e5891"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:21:12.452991 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.452948 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2384001-9a49-470f-bd91-e496110e5891-kube-api-access-msn4w" (OuterVolumeSpecName: "kube-api-access-msn4w") pod "a2384001-9a49-470f-bd91-e496110e5891" (UID: "a2384001-9a49-470f-bd91-e496110e5891"). InnerVolumeSpecName "kube-api-access-msn4w". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:21:12.457092 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.457059 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-dshm" (OuterVolumeSpecName: "dshm") pod "a2384001-9a49-470f-bd91-e496110e5891" (UID: "a2384001-9a49-470f-bd91-e496110e5891"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:21:12.459653 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.459620 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2384001-9a49-470f-bd91-e496110e5891-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a2384001-9a49-470f-bd91-e496110e5891" (UID: "a2384001-9a49-470f-bd91-e496110e5891"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:21:12.526364 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.526222 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a2384001-9a49-470f-bd91-e496110e5891" (UID: "a2384001-9a49-470f-bd91-e496110e5891"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:21:12.550234 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.550196 2570 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-model-cache\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:21:12.550234 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.550237 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-kserve-provision-location\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:21:12.550411 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.550252 2570 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-home\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:21:12.550411 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.550265 2570 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a2384001-9a49-470f-bd91-e496110e5891-dshm\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:21:12.550411 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.550278 2570 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a2384001-9a49-470f-bd91-e496110e5891-tls-certs\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:21:12.550411 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.550291 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-msn4w\" (UniqueName: \"kubernetes.io/projected/a2384001-9a49-470f-bd91-e496110e5891-kube-api-access-msn4w\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:21:12.833404 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.833321 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-7f6bcc656c-fqcz4_a2384001-9a49-470f-bd91-e496110e5891/main/0.log"
Apr 16 14:21:12.833761 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.833734 2570 generic.go:358] "Generic (PLEG): container finished" podID="a2384001-9a49-470f-bd91-e496110e5891" containerID="fb5e5991f349bcc3b8515322f96fbdfb355ae17d800871e4c74f2f4d36070220" exitCode=137
Apr 16 14:21:12.833859 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.833829 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" event={"ID":"a2384001-9a49-470f-bd91-e496110e5891","Type":"ContainerDied","Data":"fb5e5991f349bcc3b8515322f96fbdfb355ae17d800871e4c74f2f4d36070220"}
Apr 16 14:21:12.833900 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.833855 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4"
Apr 16 14:21:12.833900 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.833884 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4" event={"ID":"a2384001-9a49-470f-bd91-e496110e5891","Type":"ContainerDied","Data":"e012c9558655ee458c8c37dee4954301a625f0cb2530821ac651d575c484dcb0"}
Apr 16 14:21:12.833966 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.833906 2570 scope.go:117] "RemoveContainer" containerID="fb5e5991f349bcc3b8515322f96fbdfb355ae17d800871e4c74f2f4d36070220"
Apr 16 14:21:12.860463 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.860427 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4"]
Apr 16 14:21:12.863602 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.863291 2570 scope.go:117] "RemoveContainer" 
containerID="8b905ef66a82b713f36f31963fb64cb126222107f5f631a03fa50b2baa1fc5a1" Apr 16 14:21:12.864744 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.864715 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-7f6bcc656c-fqcz4"] Apr 16 14:21:12.951483 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.951286 2570 scope.go:117] "RemoveContainer" containerID="fb5e5991f349bcc3b8515322f96fbdfb355ae17d800871e4c74f2f4d36070220" Apr 16 14:21:12.951729 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:21:12.951706 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb5e5991f349bcc3b8515322f96fbdfb355ae17d800871e4c74f2f4d36070220\": container with ID starting with fb5e5991f349bcc3b8515322f96fbdfb355ae17d800871e4c74f2f4d36070220 not found: ID does not exist" containerID="fb5e5991f349bcc3b8515322f96fbdfb355ae17d800871e4c74f2f4d36070220" Apr 16 14:21:12.951806 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.951761 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb5e5991f349bcc3b8515322f96fbdfb355ae17d800871e4c74f2f4d36070220"} err="failed to get container status \"fb5e5991f349bcc3b8515322f96fbdfb355ae17d800871e4c74f2f4d36070220\": rpc error: code = NotFound desc = could not find container \"fb5e5991f349bcc3b8515322f96fbdfb355ae17d800871e4c74f2f4d36070220\": container with ID starting with fb5e5991f349bcc3b8515322f96fbdfb355ae17d800871e4c74f2f4d36070220 not found: ID does not exist" Apr 16 14:21:12.951806 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.951783 2570 scope.go:117] "RemoveContainer" containerID="8b905ef66a82b713f36f31963fb64cb126222107f5f631a03fa50b2baa1fc5a1" Apr 16 14:21:12.952133 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:21:12.952106 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"8b905ef66a82b713f36f31963fb64cb126222107f5f631a03fa50b2baa1fc5a1\": container with ID starting with 8b905ef66a82b713f36f31963fb64cb126222107f5f631a03fa50b2baa1fc5a1 not found: ID does not exist" containerID="8b905ef66a82b713f36f31963fb64cb126222107f5f631a03fa50b2baa1fc5a1" Apr 16 14:21:12.952219 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:12.952145 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b905ef66a82b713f36f31963fb64cb126222107f5f631a03fa50b2baa1fc5a1"} err="failed to get container status \"8b905ef66a82b713f36f31963fb64cb126222107f5f631a03fa50b2baa1fc5a1\": rpc error: code = NotFound desc = could not find container \"8b905ef66a82b713f36f31963fb64cb126222107f5f631a03fa50b2baa1fc5a1\": container with ID starting with 8b905ef66a82b713f36f31963fb64cb126222107f5f631a03fa50b2baa1fc5a1 not found: ID does not exist" Apr 16 14:21:14.311033 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:14.310999 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2384001-9a49-470f-bd91-e496110e5891" path="/var/lib/kubelet/pods/a2384001-9a49-470f-bd91-e496110e5891/volumes" Apr 16 14:21:16.557897 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:16.557853 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" podUID="1957e25b-feda-40b5-ba8b-403f314bba3c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused" Apr 16 14:21:17.553214 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:17.553168 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb" podUID="587e56a3-9a65-42c1-a3f3-43f1172aaf6f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 16 14:21:26.557939 ip-10-0-133-133 
kubenswrapper[2570]: I0416 14:21:26.557887 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" podUID="1957e25b-feda-40b5-ba8b-403f314bba3c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.36:8000/health\": dial tcp 10.134.0.36:8000: connect: connection refused" Apr 16 14:21:27.553769 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:27.553727 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb" podUID="587e56a3-9a65-42c1-a3f3-43f1172aaf6f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 16 14:21:36.567215 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:36.567182 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:21:36.576033 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:36.575999 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:21:37.552949 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:37.552904 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb" podUID="587e56a3-9a65-42c1-a3f3-43f1172aaf6f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 16 14:21:37.939527 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:37.939490 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d"] Apr 16 14:21:37.940087 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:37.939825 2570 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" podUID="1957e25b-feda-40b5-ba8b-403f314bba3c" containerName="main" containerID="cri-o://7b70c6342830f02c69e0a23726ccbf42a193e362a2aefc32fdacd91adb57580b" gracePeriod=30 Apr 16 14:21:47.553229 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:47.553112 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb" podUID="587e56a3-9a65-42c1-a3f3-43f1172aaf6f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 16 14:21:57.553805 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:21:57.553764 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb" podUID="587e56a3-9a65-42c1-a3f3-43f1172aaf6f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 16 14:22:07.553031 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:07.552970 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb" podUID="587e56a3-9a65-42c1-a3f3-43f1172aaf6f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 16 14:22:08.349543 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:08.349513 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-85cf5c465f-2jt7d_1957e25b-feda-40b5-ba8b-403f314bba3c/main/0.log" Apr 16 14:22:08.349943 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:08.349924 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:22:08.373701 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:08.373664 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1957e25b-feda-40b5-ba8b-403f314bba3c-tls-certs\") pod \"1957e25b-feda-40b5-ba8b-403f314bba3c\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " Apr 16 14:22:08.373883 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:08.373738 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-dshm\") pod \"1957e25b-feda-40b5-ba8b-403f314bba3c\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " Apr 16 14:22:08.373883 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:08.373779 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84snm\" (UniqueName: \"kubernetes.io/projected/1957e25b-feda-40b5-ba8b-403f314bba3c-kube-api-access-84snm\") pod \"1957e25b-feda-40b5-ba8b-403f314bba3c\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " Apr 16 14:22:08.373883 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:08.373818 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-kserve-provision-location\") pod \"1957e25b-feda-40b5-ba8b-403f314bba3c\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " Apr 16 14:22:08.373883 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:08.373876 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-model-cache\") pod \"1957e25b-feda-40b5-ba8b-403f314bba3c\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " Apr 16 14:22:08.374096 ip-10-0-133-133 
kubenswrapper[2570]: I0416 14:22:08.373907 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-home\") pod \"1957e25b-feda-40b5-ba8b-403f314bba3c\" (UID: \"1957e25b-feda-40b5-ba8b-403f314bba3c\") " Apr 16 14:22:08.374598 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:08.374572 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-home" (OuterVolumeSpecName: "home") pod "1957e25b-feda-40b5-ba8b-403f314bba3c" (UID: "1957e25b-feda-40b5-ba8b-403f314bba3c"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:22:08.374598 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:08.374582 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-model-cache" (OuterVolumeSpecName: "model-cache") pod "1957e25b-feda-40b5-ba8b-403f314bba3c" (UID: "1957e25b-feda-40b5-ba8b-403f314bba3c"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:22:08.376652 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:08.376607 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1957e25b-feda-40b5-ba8b-403f314bba3c-kube-api-access-84snm" (OuterVolumeSpecName: "kube-api-access-84snm") pod "1957e25b-feda-40b5-ba8b-403f314bba3c" (UID: "1957e25b-feda-40b5-ba8b-403f314bba3c"). InnerVolumeSpecName "kube-api-access-84snm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:22:08.377719 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:08.377691 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1957e25b-feda-40b5-ba8b-403f314bba3c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1957e25b-feda-40b5-ba8b-403f314bba3c" (UID: "1957e25b-feda-40b5-ba8b-403f314bba3c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:22:08.377847 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:08.377827 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-dshm" (OuterVolumeSpecName: "dshm") pod "1957e25b-feda-40b5-ba8b-403f314bba3c" (UID: "1957e25b-feda-40b5-ba8b-403f314bba3c"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:22:08.432548 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:08.432487 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1957e25b-feda-40b5-ba8b-403f314bba3c" (UID: "1957e25b-feda-40b5-ba8b-403f314bba3c"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:22:08.475510 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:08.475414 2570 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-model-cache\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:22:08.475510 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:08.475447 2570 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-home\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:22:08.475510 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:08.475456 2570 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1957e25b-feda-40b5-ba8b-403f314bba3c-tls-certs\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:22:08.475510 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:08.475464 2570 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-dshm\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:22:08.475510 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:08.475473 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-84snm\" (UniqueName: \"kubernetes.io/projected/1957e25b-feda-40b5-ba8b-403f314bba3c-kube-api-access-84snm\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:22:08.475510 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:08.475482 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1957e25b-feda-40b5-ba8b-403f314bba3c-kserve-provision-location\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:22:09.049161 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:09.049129 2570 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-85cf5c465f-2jt7d_1957e25b-feda-40b5-ba8b-403f314bba3c/main/0.log" Apr 16 14:22:09.049615 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:09.049432 2570 generic.go:358] "Generic (PLEG): container finished" podID="1957e25b-feda-40b5-ba8b-403f314bba3c" containerID="7b70c6342830f02c69e0a23726ccbf42a193e362a2aefc32fdacd91adb57580b" exitCode=137 Apr 16 14:22:09.049615 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:09.049508 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" event={"ID":"1957e25b-feda-40b5-ba8b-403f314bba3c","Type":"ContainerDied","Data":"7b70c6342830f02c69e0a23726ccbf42a193e362a2aefc32fdacd91adb57580b"} Apr 16 14:22:09.049615 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:09.049588 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" event={"ID":"1957e25b-feda-40b5-ba8b-403f314bba3c","Type":"ContainerDied","Data":"d1e1ee3c14c3035fa927c6acd3064698101c77e70755fbfb14b94f5e75ae0b7e"} Apr 16 14:22:09.049615 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:09.049612 2570 scope.go:117] "RemoveContainer" containerID="7b70c6342830f02c69e0a23726ccbf42a193e362a2aefc32fdacd91adb57580b" Apr 16 14:22:09.049834 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:09.049548 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d" Apr 16 14:22:09.072783 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:09.072750 2570 scope.go:117] "RemoveContainer" containerID="2c64d9b3180f7e466e9ff742f8af049b329b3e3f2a634df9399f3aa46601b52f" Apr 16 14:22:09.074386 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:09.074241 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d"] Apr 16 14:22:09.080043 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:09.080013 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-85cf5c465f-2jt7d"] Apr 16 14:22:09.139387 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:09.139367 2570 scope.go:117] "RemoveContainer" containerID="7b70c6342830f02c69e0a23726ccbf42a193e362a2aefc32fdacd91adb57580b" Apr 16 14:22:09.139773 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:22:09.139745 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b70c6342830f02c69e0a23726ccbf42a193e362a2aefc32fdacd91adb57580b\": container with ID starting with 7b70c6342830f02c69e0a23726ccbf42a193e362a2aefc32fdacd91adb57580b not found: ID does not exist" containerID="7b70c6342830f02c69e0a23726ccbf42a193e362a2aefc32fdacd91adb57580b" Apr 16 14:22:09.139831 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:09.139790 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b70c6342830f02c69e0a23726ccbf42a193e362a2aefc32fdacd91adb57580b"} err="failed to get container status \"7b70c6342830f02c69e0a23726ccbf42a193e362a2aefc32fdacd91adb57580b\": rpc error: code = NotFound desc = could not find container \"7b70c6342830f02c69e0a23726ccbf42a193e362a2aefc32fdacd91adb57580b\": container with ID starting with 7b70c6342830f02c69e0a23726ccbf42a193e362a2aefc32fdacd91adb57580b not found: ID does not exist" Apr 
16 14:22:09.139831 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:09.139825 2570 scope.go:117] "RemoveContainer" containerID="2c64d9b3180f7e466e9ff742f8af049b329b3e3f2a634df9399f3aa46601b52f" Apr 16 14:22:09.140164 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:22:09.140135 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c64d9b3180f7e466e9ff742f8af049b329b3e3f2a634df9399f3aa46601b52f\": container with ID starting with 2c64d9b3180f7e466e9ff742f8af049b329b3e3f2a634df9399f3aa46601b52f not found: ID does not exist" containerID="2c64d9b3180f7e466e9ff742f8af049b329b3e3f2a634df9399f3aa46601b52f" Apr 16 14:22:09.140261 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:09.140169 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c64d9b3180f7e466e9ff742f8af049b329b3e3f2a634df9399f3aa46601b52f"} err="failed to get container status \"2c64d9b3180f7e466e9ff742f8af049b329b3e3f2a634df9399f3aa46601b52f\": rpc error: code = NotFound desc = could not find container \"2c64d9b3180f7e466e9ff742f8af049b329b3e3f2a634df9399f3aa46601b52f\": container with ID starting with 2c64d9b3180f7e466e9ff742f8af049b329b3e3f2a634df9399f3aa46601b52f not found: ID does not exist" Apr 16 14:22:10.309460 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:10.309423 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1957e25b-feda-40b5-ba8b-403f314bba3c" path="/var/lib/kubelet/pods/1957e25b-feda-40b5-ba8b-403f314bba3c/volumes" Apr 16 14:22:17.553318 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:17.553271 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb" podUID="587e56a3-9a65-42c1-a3f3-43f1172aaf6f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 16 14:22:27.553809 ip-10-0-133-133 
kubenswrapper[2570]: I0416 14:22:27.553769 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb" podUID="587e56a3-9a65-42c1-a3f3-43f1172aaf6f" containerName="main" probeResult="failure" output="Get \"https://10.134.0.37:8000/health\": dial tcp 10.134.0.37:8000: connect: connection refused" Apr 16 14:22:37.563099 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:37.563061 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb" Apr 16 14:22:37.570959 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:37.570924 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb" Apr 16 14:22:45.264699 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:45.264658 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"] Apr 16 14:22:45.265285 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:22:45.264961 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb" podUID="587e56a3-9a65-42c1-a3f3-43f1172aaf6f" containerName="main" containerID="cri-o://049fd2adbdc5c33d82c9bb73e1fcc03fb1ba11275c0d73dc623c40493d26f4fd" gracePeriod=30 Apr 16 14:23:01.209398 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.209354 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"] Apr 16 14:23:01.209803 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.209787 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2384001-9a49-470f-bd91-e496110e5891" containerName="main" Apr 16 14:23:01.209803 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.209800 2570 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="a2384001-9a49-470f-bd91-e496110e5891" containerName="main" Apr 16 14:23:01.209872 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.209809 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1957e25b-feda-40b5-ba8b-403f314bba3c" containerName="storage-initializer" Apr 16 14:23:01.209872 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.209815 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="1957e25b-feda-40b5-ba8b-403f314bba3c" containerName="storage-initializer" Apr 16 14:23:01.209872 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.209831 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a2384001-9a49-470f-bd91-e496110e5891" containerName="storage-initializer" Apr 16 14:23:01.209872 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.209837 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2384001-9a49-470f-bd91-e496110e5891" containerName="storage-initializer" Apr 16 14:23:01.209872 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.209846 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1957e25b-feda-40b5-ba8b-403f314bba3c" containerName="main" Apr 16 14:23:01.209872 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.209852 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="1957e25b-feda-40b5-ba8b-403f314bba3c" containerName="main" Apr 16 14:23:01.210043 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.209914 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="a2384001-9a49-470f-bd91-e496110e5891" containerName="main" Apr 16 14:23:01.210043 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.209925 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="1957e25b-feda-40b5-ba8b-403f314bba3c" containerName="main" Apr 16 14:23:01.212040 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.212021 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" Apr 16 14:23:01.217652 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.217622 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 16 14:23:01.217983 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.217962 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-x7c2g\"" Apr 16 14:23:01.226837 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.226805 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"] Apr 16 14:23:01.228790 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.228764 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"] Apr 16 14:23:01.231825 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.231791 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:23:01.242984 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.242938 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"]
Apr 16 14:23:01.278012 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.277979 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4859b\" (UniqueName: \"kubernetes.io/projected/cddae74e-b494-48c5-9040-d9c6af603171-kube-api-access-4859b\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:23:01.278187 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.278100 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:23:01.278187 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.278133 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cddae74e-b494-48c5-9040-d9c6af603171-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:23:01.278309 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.278239 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:23:01.278309 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.278277 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:23:01.278401 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.278309 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:23:01.379391 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.379356 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:23:01.379590 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.379414 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:23:01.379590 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.379473 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cddae74e-b494-48c5-9040-d9c6af603171-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:23:01.379590 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.379516 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:23:01.379760 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.379589 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:23:01.379760 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.379675 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:23:01.379760 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.379719 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9jrf\" (UniqueName: \"kubernetes.io/projected/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-kube-api-access-k9jrf\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:23:01.379923 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.379761 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:23:01.379923 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.379790 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:23:01.379923 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.379831 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:23:01.379923 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.379915 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:23:01.380120 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.379951 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4859b\" (UniqueName: \"kubernetes.io/projected/cddae74e-b494-48c5-9040-d9c6af603171-kube-api-access-4859b\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:23:01.380170 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.380122 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:23:01.380170 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.380135 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:23:01.380445 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.380425 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:23:01.382614 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.382587 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:23:01.382843 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.382826 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cddae74e-b494-48c5-9040-d9c6af603171-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:23:01.389012 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.388981 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4859b\" (UniqueName: \"kubernetes.io/projected/cddae74e-b494-48c5-9040-d9c6af603171-kube-api-access-4859b\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:23:01.481485 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.481389 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:23:01.481485 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.481480 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:23:01.481714 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.481508 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:23:01.481714 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.481564 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:23:01.481714 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.481582 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k9jrf\" (UniqueName: \"kubernetes.io/projected/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-kube-api-access-k9jrf\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:23:01.481714 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.481623 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:23:01.481915 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.481886 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:23:01.481971 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.481938 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:23:01.482013 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.481968 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:23:01.484074 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.484051 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:23:01.484446 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.484423 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:23:01.503453 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.500745 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9jrf\" (UniqueName: \"kubernetes.io/projected/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-kube-api-access-k9jrf\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:23:01.523092 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.523047 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:23:01.544479 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.544443 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:23:01.675227 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.675194 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"]
Apr 16 14:23:01.678901 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:23:01.678864 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcddae74e_b494_48c5_9040_d9c6af603171.slice/crio-cee8356c6960013342a702d7bc2c11d97887b9f2a420da200e4cda30960fa248 WatchSource:0}: Error finding container cee8356c6960013342a702d7bc2c11d97887b9f2a420da200e4cda30960fa248: Status 404 returned error can't find the container with id cee8356c6960013342a702d7bc2c11d97887b9f2a420da200e4cda30960fa248
Apr 16 14:23:01.680689 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.680673 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 16 14:23:01.694316 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:01.694280 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"]
Apr 16 14:23:01.697865 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:23:01.697822 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ee896e4_e422_4e33_a8e5_ab3c278a06e8.slice/crio-403b4b9169d55094fb8630fac10eb728e84f03cd5f40b0bf3b320f3fc42ddf39 WatchSource:0}: Error finding container 403b4b9169d55094fb8630fac10eb728e84f03cd5f40b0bf3b320f3fc42ddf39: Status 404 returned error can't find the container with id 403b4b9169d55094fb8630fac10eb728e84f03cd5f40b0bf3b320f3fc42ddf39
Apr 16 14:23:02.233486 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:02.233431 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" event={"ID":"cddae74e-b494-48c5-9040-d9c6af603171","Type":"ContainerStarted","Data":"cee8356c6960013342a702d7bc2c11d97887b9f2a420da200e4cda30960fa248"}
Apr 16 14:23:02.235066 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:02.235029 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" event={"ID":"9ee896e4-e422-4e33-a8e5-ab3c278a06e8","Type":"ContainerStarted","Data":"c15886af513c92af2a01223d26cae7372d76c845a35ab7c7e5e0dc176f29e1b7"}
Apr 16 14:23:02.235194 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:02.235070 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" event={"ID":"9ee896e4-e422-4e33-a8e5-ab3c278a06e8","Type":"ContainerStarted","Data":"403b4b9169d55094fb8630fac10eb728e84f03cd5f40b0bf3b320f3fc42ddf39"}
Apr 16 14:23:03.241628 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:03.241510 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" event={"ID":"cddae74e-b494-48c5-9040-d9c6af603171","Type":"ContainerStarted","Data":"6e699e3cea9d1e5d7caafe4a8735b5b4e009bdf77df431723aae629ce503f0bf"}
Apr 16 14:23:03.242084 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:03.241819 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:23:04.247826 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:04.247780 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" event={"ID":"cddae74e-b494-48c5-9040-d9c6af603171","Type":"ContainerStarted","Data":"5c46ad9b0fab6715debfbabc3987d65d925368065c9f42cef52cf70f70be8649"}
Apr 16 14:23:08.264062 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:08.264019 2570 generic.go:358] "Generic (PLEG): container finished" podID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerID="c15886af513c92af2a01223d26cae7372d76c845a35ab7c7e5e0dc176f29e1b7" exitCode=0
Apr 16 14:23:08.264443 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:08.264129 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" event={"ID":"9ee896e4-e422-4e33-a8e5-ab3c278a06e8","Type":"ContainerDied","Data":"c15886af513c92af2a01223d26cae7372d76c845a35ab7c7e5e0dc176f29e1b7"}
Apr 16 14:23:09.269037 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:09.268996 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" event={"ID":"9ee896e4-e422-4e33-a8e5-ab3c278a06e8","Type":"ContainerStarted","Data":"dd51d114aeec71208281c559039a611bbf2c5e0380eecc96f45921bed93d7902"}
Apr 16 14:23:09.270632 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:09.270603 2570 generic.go:358] "Generic (PLEG): container finished" podID="cddae74e-b494-48c5-9040-d9c6af603171" containerID="5c46ad9b0fab6715debfbabc3987d65d925368065c9f42cef52cf70f70be8649" exitCode=0
Apr 16 14:23:09.270774 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:09.270666 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" event={"ID":"cddae74e-b494-48c5-9040-d9c6af603171","Type":"ContainerDied","Data":"5c46ad9b0fab6715debfbabc3987d65d925368065c9f42cef52cf70f70be8649"}
Apr 16 14:23:09.292764 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:09.292699 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" podStartSLOduration=8.292678945 podStartE2EDuration="8.292678945s" podCreationTimestamp="2026-04-16 14:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:23:09.288811322 +0000 UTC m=+1415.494454487" watchObservedRunningTime="2026-04-16 14:23:09.292678945 +0000 UTC m=+1415.498322110"
Apr 16 14:23:10.276093 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:10.276050 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" event={"ID":"cddae74e-b494-48c5-9040-d9c6af603171","Type":"ContainerStarted","Data":"2df00991c33a68879e4223661fbff31481e4787a8f75c4397a7ffab73340eac0"}
Apr 16 14:23:10.300208 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:10.300146 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" podStartSLOduration=8.360716473 podStartE2EDuration="9.300125329s" podCreationTimestamp="2026-04-16 14:23:01 +0000 UTC" firstStartedPulling="2026-04-16 14:23:01.680796239 +0000 UTC m=+1407.886439382" lastFinishedPulling="2026-04-16 14:23:02.620205092 +0000 UTC m=+1408.825848238" observedRunningTime="2026-04-16 14:23:10.299158217 +0000 UTC m=+1416.504801381" watchObservedRunningTime="2026-04-16 14:23:10.300125329 +0000 UTC m=+1416.505768495"
Apr 16 14:23:11.523596 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:11.523558 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:23:11.524039 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:11.523611 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:23:11.525054 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:11.525028 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8001/health\": dial tcp 10.134.0.38:8001: connect: connection refused"
Apr 16 14:23:11.544748 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:11.544711 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:23:11.544748 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:11.544755 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:23:11.546333 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:11.546304 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused"
Apr 16 14:23:15.588728 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:15.588695 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-558b4c5bc9-sshfb_587e56a3-9a65-42c1-a3f3-43f1172aaf6f/main/0.log"
Apr 16 14:23:15.589195 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:15.589176 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:23:15.628884 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:15.628835 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-model-cache\") pod \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") "
Apr 16 14:23:15.629071 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:15.628895 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ppmb\" (UniqueName: \"kubernetes.io/projected/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-kube-api-access-5ppmb\") pod \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") "
Apr 16 14:23:15.629071 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:15.628934 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-kserve-provision-location\") pod \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") "
Apr 16 14:23:15.629071 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:15.628986 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-dshm\") pod \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") "
Apr 16 14:23:15.629243 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:15.629121 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-model-cache" (OuterVolumeSpecName: "model-cache") pod "587e56a3-9a65-42c1-a3f3-43f1172aaf6f" (UID: "587e56a3-9a65-42c1-a3f3-43f1172aaf6f"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:23:15.629486 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:15.629438 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-home\") pod \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") "
Apr 16 14:23:15.629593 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:15.629549 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-home" (OuterVolumeSpecName: "home") pod "587e56a3-9a65-42c1-a3f3-43f1172aaf6f" (UID: "587e56a3-9a65-42c1-a3f3-43f1172aaf6f"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:23:15.629850 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:15.629827 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-tls-certs\") pod \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\" (UID: \"587e56a3-9a65-42c1-a3f3-43f1172aaf6f\") "
Apr 16 14:23:15.630354 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:15.630330 2570 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-home\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:23:15.630488 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:15.630471 2570 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-model-cache\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:23:15.631878 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:15.631842 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-kube-api-access-5ppmb" (OuterVolumeSpecName: "kube-api-access-5ppmb") pod "587e56a3-9a65-42c1-a3f3-43f1172aaf6f" (UID: "587e56a3-9a65-42c1-a3f3-43f1172aaf6f"). InnerVolumeSpecName "kube-api-access-5ppmb". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:23:15.632674 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:15.632591 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "587e56a3-9a65-42c1-a3f3-43f1172aaf6f" (UID: "587e56a3-9a65-42c1-a3f3-43f1172aaf6f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:23:15.633328 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:15.633298 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-dshm" (OuterVolumeSpecName: "dshm") pod "587e56a3-9a65-42c1-a3f3-43f1172aaf6f" (UID: "587e56a3-9a65-42c1-a3f3-43f1172aaf6f"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:23:15.692490 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:15.692423 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "587e56a3-9a65-42c1-a3f3-43f1172aaf6f" (UID: "587e56a3-9a65-42c1-a3f3-43f1172aaf6f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:23:15.731558 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:15.731485 2570 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-dshm\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:23:15.731558 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:15.731526 2570 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-tls-certs\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:23:15.731772 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:15.731574 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5ppmb\" (UniqueName: \"kubernetes.io/projected/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-kube-api-access-5ppmb\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:23:15.731772 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:15.731589 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/587e56a3-9a65-42c1-a3f3-43f1172aaf6f-kserve-provision-location\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:23:16.304866 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:16.304787 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-558b4c5bc9-sshfb_587e56a3-9a65-42c1-a3f3-43f1172aaf6f/main/0.log"
Apr 16 14:23:16.305227 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:16.305200 2570 generic.go:358] "Generic (PLEG): container finished" podID="587e56a3-9a65-42c1-a3f3-43f1172aaf6f" containerID="049fd2adbdc5c33d82c9bb73e1fcc03fb1ba11275c0d73dc623c40493d26f4fd" exitCode=137
Apr 16 14:23:16.305403 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:16.305380 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"
Apr 16 14:23:16.310071 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:16.310027 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb" event={"ID":"587e56a3-9a65-42c1-a3f3-43f1172aaf6f","Type":"ContainerDied","Data":"049fd2adbdc5c33d82c9bb73e1fcc03fb1ba11275c0d73dc623c40493d26f4fd"}
Apr 16 14:23:16.310071 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:16.310068 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb" event={"ID":"587e56a3-9a65-42c1-a3f3-43f1172aaf6f","Type":"ContainerDied","Data":"f74fce501e42ab05fc9f093e443df740119f884add05227a0fbc7a5b27e8b726"}
Apr 16 14:23:16.310314 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:16.310095 2570 scope.go:117] "RemoveContainer" containerID="049fd2adbdc5c33d82c9bb73e1fcc03fb1ba11275c0d73dc623c40493d26f4fd"
Apr 16 14:23:16.335805 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:16.335769 2570 scope.go:117] "RemoveContainer" containerID="94b48f9423209b36d7427ac8bc9c3b8bd3ad2776d1cd2159c5df316bd544ae64"
Apr 16 14:23:16.336723 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:16.336559 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"]
Apr 16 14:23:16.340752 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:16.340724 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-558b4c5bc9-sshfb"]
Apr 16 14:23:16.420405 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:16.420381 2570 scope.go:117] "RemoveContainer" containerID="049fd2adbdc5c33d82c9bb73e1fcc03fb1ba11275c0d73dc623c40493d26f4fd"
Apr 16 14:23:16.420852 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:23:16.420821 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"049fd2adbdc5c33d82c9bb73e1fcc03fb1ba11275c0d73dc623c40493d26f4fd\": container with ID starting with 049fd2adbdc5c33d82c9bb73e1fcc03fb1ba11275c0d73dc623c40493d26f4fd not found: ID does not exist" containerID="049fd2adbdc5c33d82c9bb73e1fcc03fb1ba11275c0d73dc623c40493d26f4fd"
Apr 16 14:23:16.420982 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:16.420866 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"049fd2adbdc5c33d82c9bb73e1fcc03fb1ba11275c0d73dc623c40493d26f4fd"} err="failed to get container status \"049fd2adbdc5c33d82c9bb73e1fcc03fb1ba11275c0d73dc623c40493d26f4fd\": rpc error: code = NotFound desc = could not find container \"049fd2adbdc5c33d82c9bb73e1fcc03fb1ba11275c0d73dc623c40493d26f4fd\": container with ID starting with 049fd2adbdc5c33d82c9bb73e1fcc03fb1ba11275c0d73dc623c40493d26f4fd not found: ID does not exist"
Apr 16 14:23:16.420982 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:16.420896 2570 scope.go:117] "RemoveContainer" containerID="94b48f9423209b36d7427ac8bc9c3b8bd3ad2776d1cd2159c5df316bd544ae64"
Apr 16 14:23:16.421218 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:23:16.421193 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"94b48f9423209b36d7427ac8bc9c3b8bd3ad2776d1cd2159c5df316bd544ae64\": container with ID starting with 94b48f9423209b36d7427ac8bc9c3b8bd3ad2776d1cd2159c5df316bd544ae64 not found: ID does not exist" containerID="94b48f9423209b36d7427ac8bc9c3b8bd3ad2776d1cd2159c5df316bd544ae64"
Apr 16 14:23:16.421292 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:16.421227 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"94b48f9423209b36d7427ac8bc9c3b8bd3ad2776d1cd2159c5df316bd544ae64"} err="failed to get container status \"94b48f9423209b36d7427ac8bc9c3b8bd3ad2776d1cd2159c5df316bd544ae64\": rpc error: code = NotFound desc = could not find container \"94b48f9423209b36d7427ac8bc9c3b8bd3ad2776d1cd2159c5df316bd544ae64\": container with ID starting with 94b48f9423209b36d7427ac8bc9c3b8bd3ad2776d1cd2159c5df316bd544ae64 not found: ID does not exist"
Apr 16 14:23:18.310374 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.310317 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="587e56a3-9a65-42c1-a3f3-43f1172aaf6f" path="/var/lib/kubelet/pods/587e56a3-9a65-42c1-a3f3-43f1172aaf6f/volumes"
Apr 16 14:23:18.462945 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.462901 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c"]
Apr 16 14:23:18.463771 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.463741 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="587e56a3-9a65-42c1-a3f3-43f1172aaf6f" containerName="storage-initializer"
Apr 16 14:23:18.463972 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.463956 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="587e56a3-9a65-42c1-a3f3-43f1172aaf6f" containerName="storage-initializer"
Apr 16 14:23:18.464112 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.464099 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="587e56a3-9a65-42c1-a3f3-43f1172aaf6f" containerName="main"
Apr 16 14:23:18.464204 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.464192 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="587e56a3-9a65-42c1-a3f3-43f1172aaf6f" containerName="main"
Apr 16 14:23:18.464434 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.464417 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="587e56a3-9a65-42c1-a3f3-43f1172aaf6f" containerName="main"
Apr 16 14:23:18.498625 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.498582 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api"
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c"] Apr 16 14:23:18.498835 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.498770 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:23:18.501795 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.501762 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\"" Apr 16 14:23:18.558454 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.558408 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c\" (UID: \"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:23:18.558454 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.558454 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/517ef72a-d8c6-4729-b4fc-037c8b19a357-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c\" (UID: \"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:23:18.558738 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.558489 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c\" (UID: 
\"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:23:18.558738 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.558598 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c\" (UID: \"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:23:18.558738 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.558651 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c\" (UID: \"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:23:18.558738 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.558708 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpfr5\" (UniqueName: \"kubernetes.io/projected/517ef72a-d8c6-4729-b4fc-037c8b19a357-kube-api-access-zpfr5\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c\" (UID: \"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:23:18.659863 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.659819 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c\" (UID: 
\"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:23:18.659863 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.659857 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/517ef72a-d8c6-4729-b4fc-037c8b19a357-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c\" (UID: \"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:23:18.660130 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.659890 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c\" (UID: \"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:23:18.660130 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.659930 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c\" (UID: \"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:23:18.660130 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.659956 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c\" (UID: \"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:23:18.660130 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.660004 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zpfr5\" (UniqueName: \"kubernetes.io/projected/517ef72a-d8c6-4729-b4fc-037c8b19a357-kube-api-access-zpfr5\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c\" (UID: \"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:23:18.660353 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.660306 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c\" (UID: \"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:23:18.660408 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.660341 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c\" (UID: \"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:23:18.660607 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.660569 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c\" (UID: \"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:23:18.662567 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.662512 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c\" (UID: \"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:23:18.662681 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.662654 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/517ef72a-d8c6-4729-b4fc-037c8b19a357-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c\" (UID: \"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:23:18.668958 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.668928 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpfr5\" (UniqueName: \"kubernetes.io/projected/517ef72a-d8c6-4729-b4fc-037c8b19a357-kube-api-access-zpfr5\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c\" (UID: \"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:23:18.814313 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.814276 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:23:18.968280 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:18.968231 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c"] Apr 16 14:23:18.973184 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:23:18.973152 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod517ef72a_d8c6_4729_b4fc_037c8b19a357.slice/crio-2bd7c3aa25c6cd4da19116bc8b45ecc97fd49859a780d5ca982aaaeeadbb0270 WatchSource:0}: Error finding container 2bd7c3aa25c6cd4da19116bc8b45ecc97fd49859a780d5ca982aaaeeadbb0270: Status 404 returned error can't find the container with id 2bd7c3aa25c6cd4da19116bc8b45ecc97fd49859a780d5ca982aaaeeadbb0270 Apr 16 14:23:19.320857 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:19.320721 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" event={"ID":"517ef72a-d8c6-4729-b4fc-037c8b19a357","Type":"ContainerStarted","Data":"8352b473e6ccd1a4355b876e3586b348e4cc0ba83aa221fa57a01658f4f29ec1"} Apr 16 14:23:19.320857 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:19.320773 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" event={"ID":"517ef72a-d8c6-4729-b4fc-037c8b19a357","Type":"ContainerStarted","Data":"2bd7c3aa25c6cd4da19116bc8b45ecc97fd49859a780d5ca982aaaeeadbb0270"} Apr 16 14:23:21.525054 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:21.524647 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.38:8001/health\": dial tcp 10.134.0.38:8001: connect: connection refused" Apr 16 14:23:21.545499 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:21.545073 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 14:23:21.547189 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:21.547154 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" Apr 16 14:23:24.344222 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:24.344180 2570 generic.go:358] "Generic (PLEG): container finished" podID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerID="8352b473e6ccd1a4355b876e3586b348e4cc0ba83aa221fa57a01658f4f29ec1" exitCode=0 Apr 16 14:23:24.344702 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:24.344259 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" event={"ID":"517ef72a-d8c6-4729-b4fc-037c8b19a357","Type":"ContainerDied","Data":"8352b473e6ccd1a4355b876e3586b348e4cc0ba83aa221fa57a01658f4f29ec1"} Apr 16 14:23:25.350983 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:25.350947 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" event={"ID":"517ef72a-d8c6-4729-b4fc-037c8b19a357","Type":"ContainerStarted","Data":"e6d1a8dcc23c286893c349d14d993a1fdbce5305640aede3bcb6b9778f5f188e"} Apr 16 14:23:25.374712 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:25.374637 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" podStartSLOduration=7.374613533 podStartE2EDuration="7.374613533s" podCreationTimestamp="2026-04-16 14:23:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:23:25.373001823 +0000 UTC m=+1431.578644989" watchObservedRunningTime="2026-04-16 14:23:25.374613533 +0000 UTC m=+1431.580256699" Apr 16 14:23:28.814944 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:28.814902 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:23:28.814944 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:28.814958 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:23:28.816842 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:28.816805 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 14:23:31.523744 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:31.523694 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8001/health\": dial tcp 10.134.0.38:8001: connect: connection refused" Apr 16 14:23:31.544943 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:31.544896 2570 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 14:23:38.815280 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:38.815230 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 14:23:41.523670 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:41.523582 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8001/health\": dial tcp 10.134.0.38:8001: connect: connection refused" Apr 16 14:23:41.545907 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:41.545860 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 14:23:48.815288 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:48.815234 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 14:23:51.523628 
ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:51.523564 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8001/health\": dial tcp 10.134.0.38:8001: connect: connection refused" Apr 16 14:23:51.545962 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:51.545910 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 14:23:58.815599 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:23:58.815550 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 14:24:01.524069 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:24:01.524011 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8001/health\": dial tcp 10.134.0.38:8001: connect: connection refused" Apr 16 14:24:01.545494 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:24:01.545449 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 14:24:08.815290 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:24:08.815244 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 14:24:11.524564 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:24:11.524115 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8001/health\": dial tcp 10.134.0.38:8001: connect: connection refused" Apr 16 14:24:11.545497 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:24:11.545451 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 14:24:18.815361 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:24:18.815308 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 14:24:21.523881 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:24:21.523821 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" 
podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8001/health\": dial tcp 10.134.0.38:8001: connect: connection refused" Apr 16 14:24:21.545242 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:24:21.545196 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 14:24:28.815007 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:24:28.814955 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 14:24:31.523738 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:24:31.523684 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8001/health\": dial tcp 10.134.0.38:8001: connect: connection refused" Apr 16 14:24:31.544851 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:24:31.544809 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 14:24:38.815684 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:24:38.815640 2570 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 14:24:41.523612 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:24:41.523564 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8001/health\": dial tcp 10.134.0.38:8001: connect: connection refused" Apr 16 14:24:41.544850 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:24:41.544814 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 14:24:48.815418 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:24:48.815322 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 14:24:51.523922 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:24:51.523869 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8001/health\": dial tcp 10.134.0.38:8001: connect: connection refused" Apr 16 
14:24:51.545212 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:24:51.545169 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 14:24:58.814938 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:24:58.814888 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 14:25:01.524387 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:25:01.524338 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8001/health\": dial tcp 10.134.0.38:8001: connect: connection refused" Apr 16 14:25:01.545253 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:25:01.545208 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 14:25:08.814764 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:25:08.814716 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 14:25:11.523442 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:25:11.523401 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8001/health\": dial tcp 10.134.0.38:8001: connect: connection refused" Apr 16 14:25:11.545230 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:25:11.545184 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 14:25:18.815684 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:25:18.815641 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 14:25:21.524145 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:25:21.524098 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8001/health\": dial tcp 10.134.0.38:8001: connect: connection refused" Apr 16 14:25:21.545123 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:25:21.545082 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" 
podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 14:25:28.815160 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:25:28.815112 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 14:25:31.524169 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:25:31.524119 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8001/health\": dial tcp 10.134.0.38:8001: connect: connection refused" Apr 16 14:25:31.545283 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:25:31.545241 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 14:25:38.815113 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:25:38.815066 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 14:25:41.523581 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:25:41.523512 2570 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8001/health\": dial tcp 10.134.0.38:8001: connect: connection refused" Apr 16 14:25:41.545038 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:25:41.545001 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 14:25:48.815692 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:25:48.815646 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 14:25:51.523670 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:25:51.523623 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8001/health\": dial tcp 10.134.0.38:8001: connect: connection refused" Apr 16 14:25:51.545736 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:25:51.545692 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 
14:25:58.815451 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:25:58.815403 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 14:26:01.523845 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:01.523796 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main" probeResult="failure" output="Get \"https://10.134.0.38:8001/health\": dial tcp 10.134.0.38:8001: connect: connection refused" Apr 16 14:26:01.545046 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:01.544997 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="main" probeResult="failure" output="Get \"https://10.134.0.39:8000/health\": dial tcp 10.134.0.39:8000: connect: connection refused" Apr 16 14:26:08.815726 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:08.815677 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 14:26:11.523965 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:11.523910 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.38:8001/health\": dial tcp 10.134.0.38:8001: connect: connection refused" Apr 16 14:26:11.554664 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:11.554631 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" Apr 16 14:26:11.566613 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:11.566562 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" Apr 16 14:26:18.814819 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:18.814716 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="main" probeResult="failure" output="Get \"https://10.134.0.40:8000/health\": dial tcp 10.134.0.40:8000: connect: connection refused" Apr 16 14:26:21.533258 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:21.533225 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" Apr 16 14:26:21.551850 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:21.551815 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" Apr 16 14:26:28.824948 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:28.824905 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:26:28.833357 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:28.833324 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 
14:26:33.997060 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:33.996998 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"] Apr 16 14:26:33.997699 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:33.997597 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main" containerID="cri-o://2df00991c33a68879e4223661fbff31481e4787a8f75c4397a7ffab73340eac0" gracePeriod=30 Apr 16 14:26:34.003080 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:34.003054 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"] Apr 16 14:26:34.003418 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:34.003390 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="main" containerID="cri-o://dd51d114aeec71208281c559039a611bbf2c5e0380eecc96f45921bed93d7902" gracePeriod=30 Apr 16 14:26:53.986642 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:53.986594 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5"] Apr 16 14:26:53.992571 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:53.992522 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" Apr 16 14:26:53.993217 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:53.993188 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l"] Apr 16 14:26:53.994700 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:53.994677 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-zw787\"" Apr 16 14:26:53.994812 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:53.994778 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 16 14:26:53.997395 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:53.997374 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" Apr 16 14:26:54.001567 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.001508 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5"] Apr 16 14:26:54.007353 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.007319 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l"] Apr 16 14:26:54.086706 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.086669 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhjxm\" (UniqueName: \"kubernetes.io/projected/7d1606b0-5579-4e9f-aac0-c69e69108b32-kube-api-access-jhjxm\") pod \"custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" 
Apr 16 14:26:54.086706 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.086713 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7d1606b0-5579-4e9f-aac0-c69e69108b32-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" Apr 16 14:26:54.086947 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.086742 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" Apr 16 14:26:54.086947 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.086802 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-dshm\") pod \"custom-route-timeout-pd-test-kserve-586598877f-mdss5\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" Apr 16 14:26:54.086947 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.086838 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" Apr 16 14:26:54.086947 ip-10-0-133-133 
kubenswrapper[2570]: I0416 14:26:54.086861 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" Apr 16 14:26:54.086947 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.086877 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-home\") pod \"custom-route-timeout-pd-test-kserve-586598877f-mdss5\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" Apr 16 14:26:54.086947 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.086912 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c24e0dd2-e96f-4029-9a40-9c9ba988128a-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-586598877f-mdss5\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" Apr 16 14:26:54.087139 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.086951 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" Apr 16 14:26:54.087139 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.086975 2570 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-586598877f-mdss5\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" Apr 16 14:26:54.087139 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.087031 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-model-cache\") pod \"custom-route-timeout-pd-test-kserve-586598877f-mdss5\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" Apr 16 14:26:54.087139 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.087059 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dskjs\" (UniqueName: \"kubernetes.io/projected/c24e0dd2-e96f-4029-9a40-9c9ba988128a-kube-api-access-dskjs\") pod \"custom-route-timeout-pd-test-kserve-586598877f-mdss5\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" Apr 16 14:26:54.188291 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.188239 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" Apr 16 14:26:54.188291 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.188298 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-586598877f-mdss5\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" Apr 16 14:26:54.188569 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.188335 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-model-cache\") pod \"custom-route-timeout-pd-test-kserve-586598877f-mdss5\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" Apr 16 14:26:54.188569 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.188358 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dskjs\" (UniqueName: \"kubernetes.io/projected/c24e0dd2-e96f-4029-9a40-9c9ba988128a-kube-api-access-dskjs\") pod \"custom-route-timeout-pd-test-kserve-586598877f-mdss5\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" Apr 16 14:26:54.188569 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.188390 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jhjxm\" (UniqueName: \"kubernetes.io/projected/7d1606b0-5579-4e9f-aac0-c69e69108b32-kube-api-access-jhjxm\") pod \"custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" Apr 16 14:26:54.188569 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.188423 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/7d1606b0-5579-4e9f-aac0-c69e69108b32-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" Apr 16 14:26:54.188569 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.188473 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" Apr 16 14:26:54.188569 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.188505 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-dshm\") pod \"custom-route-timeout-pd-test-kserve-586598877f-mdss5\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" Apr 16 14:26:54.188569 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.188564 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" Apr 16 14:26:54.189171 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.188590 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-home\") pod 
\"custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" Apr 16 14:26:54.189171 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.188613 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-home\") pod \"custom-route-timeout-pd-test-kserve-586598877f-mdss5\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" Apr 16 14:26:54.189171 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.188651 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c24e0dd2-e96f-4029-9a40-9c9ba988128a-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-586598877f-mdss5\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" Apr 16 14:26:54.189171 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.188800 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-586598877f-mdss5\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" Apr 16 14:26:54.189171 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.188828 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-model-cache\") pod \"custom-route-timeout-pd-test-kserve-586598877f-mdss5\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" Apr 16 14:26:54.189171 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.189061 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" Apr 16 14:26:54.189435 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.189282 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-home\") pod \"custom-route-timeout-pd-test-kserve-586598877f-mdss5\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" Apr 16 14:26:54.189435 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.189295 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" Apr 16 14:26:54.189435 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.189318 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" Apr 16 14:26:54.191583 ip-10-0-133-133 kubenswrapper[2570]: I0416 
14:26:54.191526 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" Apr 16 14:26:54.191711 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.191563 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-dshm\") pod \"custom-route-timeout-pd-test-kserve-586598877f-mdss5\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" Apr 16 14:26:54.191775 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.191743 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c24e0dd2-e96f-4029-9a40-9c9ba988128a-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-586598877f-mdss5\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" Apr 16 14:26:54.191848 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.191817 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7d1606b0-5579-4e9f-aac0-c69e69108b32-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" Apr 16 14:26:54.202697 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.202668 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dskjs\" (UniqueName: 
\"kubernetes.io/projected/c24e0dd2-e96f-4029-9a40-9c9ba988128a-kube-api-access-dskjs\") pod \"custom-route-timeout-pd-test-kserve-586598877f-mdss5\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" Apr 16 14:26:54.202820 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.202762 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhjxm\" (UniqueName: \"kubernetes.io/projected/7d1606b0-5579-4e9f-aac0-c69e69108b32-kube-api-access-jhjxm\") pod \"custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" Apr 16 14:26:54.307224 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.307136 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" Apr 16 14:26:54.315010 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.314985 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l"
Apr 16 14:26:54.452016 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.451983 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5"]
Apr 16 14:26:54.454280 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:26:54.454234 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc24e0dd2_e96f_4029_9a40_9c9ba988128a.slice/crio-161d419164ac78efc6470ad53b16ae2bfe6375f7fc5febecba8bce7105bfcfc7 WatchSource:0}: Error finding container 161d419164ac78efc6470ad53b16ae2bfe6375f7fc5febecba8bce7105bfcfc7: Status 404 returned error can't find the container with id 161d419164ac78efc6470ad53b16ae2bfe6375f7fc5febecba8bce7105bfcfc7
Apr 16 14:26:54.471608 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:54.471586 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l"]
Apr 16 14:26:54.473568 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:26:54.473509 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d1606b0_5579_4e9f_aac0_c69e69108b32.slice/crio-a137ea943e9f2732b3dcd22a96a487b64f9f635a965004c372cf06dc1fa40568 WatchSource:0}: Error finding container a137ea943e9f2732b3dcd22a96a487b64f9f635a965004c372cf06dc1fa40568: Status 404 returned error can't find the container with id a137ea943e9f2732b3dcd22a96a487b64f9f635a965004c372cf06dc1fa40568
Apr 16 14:26:55.157981 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:55.157940 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" event={"ID":"c24e0dd2-e96f-4029-9a40-9c9ba988128a","Type":"ContainerStarted","Data":"fe805b4e21947f6173981c2d4fa727a41d75d77bdbc298adfabe75cec43e4649"}
Apr 16 14:26:55.158469 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:55.157993 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" event={"ID":"c24e0dd2-e96f-4029-9a40-9c9ba988128a","Type":"ContainerStarted","Data":"161d419164ac78efc6470ad53b16ae2bfe6375f7fc5febecba8bce7105bfcfc7"}
Apr 16 14:26:55.158469 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:55.158044 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5"
Apr 16 14:26:55.159468 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:55.159442 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" event={"ID":"7d1606b0-5579-4e9f-aac0-c69e69108b32","Type":"ContainerStarted","Data":"10501ae393c8fe7c03da5565911bbd0258850ee7b06833210a87d04e38d406f9"}
Apr 16 14:26:55.159634 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:55.159472 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" event={"ID":"7d1606b0-5579-4e9f-aac0-c69e69108b32","Type":"ContainerStarted","Data":"a137ea943e9f2732b3dcd22a96a487b64f9f635a965004c372cf06dc1fa40568"}
Apr 16 14:26:55.825146 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:55.825109 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c"]
Apr 16 14:26:55.825557 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:55.825467 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="main" containerID="cri-o://e6d1a8dcc23c286893c349d14d993a1fdbce5305640aede3bcb6b9778f5f188e" gracePeriod=30
Apr 16 14:26:56.166084 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:56.166039 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" event={"ID":"c24e0dd2-e96f-4029-9a40-9c9ba988128a","Type":"ContainerStarted","Data":"35cfccdaf0e50fdc1749118ec5aaf2a847caf7b574706589be4484ebda3238a7"}
Apr 16 14:26:59.178087 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:59.178054 2570 generic.go:358] "Generic (PLEG): container finished" podID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerID="10501ae393c8fe7c03da5565911bbd0258850ee7b06833210a87d04e38d406f9" exitCode=0
Apr 16 14:26:59.178521 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:26:59.178130 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" event={"ID":"7d1606b0-5579-4e9f-aac0-c69e69108b32","Type":"ContainerDied","Data":"10501ae393c8fe7c03da5565911bbd0258850ee7b06833210a87d04e38d406f9"}
Apr 16 14:27:00.183219 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:00.183183 2570 generic.go:358] "Generic (PLEG): container finished" podID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerID="35cfccdaf0e50fdc1749118ec5aaf2a847caf7b574706589be4484ebda3238a7" exitCode=0
Apr 16 14:27:00.183757 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:00.183256 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" event={"ID":"c24e0dd2-e96f-4029-9a40-9c9ba988128a","Type":"ContainerDied","Data":"35cfccdaf0e50fdc1749118ec5aaf2a847caf7b574706589be4484ebda3238a7"}
Apr 16 14:27:00.185146 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:00.185125 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" event={"ID":"7d1606b0-5579-4e9f-aac0-c69e69108b32","Type":"ContainerStarted","Data":"9c994075b22b70af73ff84f4d08333a4295bceb508ef63db003b8134d6268dd2"}
Apr 16 14:27:00.226312 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:00.226257 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" podStartSLOduration=7.22623802 podStartE2EDuration="7.22623802s" podCreationTimestamp="2026-04-16 14:26:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:27:00.224506775 +0000 UTC m=+1646.430149940" watchObservedRunningTime="2026-04-16 14:27:00.22623802 +0000 UTC m=+1646.431881226"
Apr 16 14:27:01.192017 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:01.191977 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" event={"ID":"c24e0dd2-e96f-4029-9a40-9c9ba988128a","Type":"ContainerStarted","Data":"bacfaed84451c05d067bf574f595f51ff50b825d6f20bb97ec2e258c029276e6"}
Apr 16 14:27:01.213553 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:01.213494 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" podStartSLOduration=8.213477613 podStartE2EDuration="8.213477613s" podCreationTimestamp="2026-04-16 14:26:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:27:01.21271564 +0000 UTC m=+1647.418358809" watchObservedRunningTime="2026-04-16 14:27:01.213477613 +0000 UTC m=+1647.419120778"
Apr 16 14:27:03.998634 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:03.998592 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="llm-d-routing-sidecar" containerID="cri-o://6e699e3cea9d1e5d7caafe4a8735b5b4e009bdf77df431723aae629ce503f0bf" gracePeriod=2
Apr 16 14:27:04.207171 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.206206 2570 generic.go:358] "Generic (PLEG): container finished" podID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerID="dd51d114aeec71208281c559039a611bbf2c5e0380eecc96f45921bed93d7902" exitCode=137
Apr 16 14:27:04.207171 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.206377 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" event={"ID":"9ee896e4-e422-4e33-a8e5-ab3c278a06e8","Type":"ContainerDied","Data":"dd51d114aeec71208281c559039a611bbf2c5e0380eecc96f45921bed93d7902"}
Apr 16 14:27:04.208286 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.208264 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz_cddae74e-b494-48c5-9040-d9c6af603171/main/0.log"
Apr 16 14:27:04.209145 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.209013 2570 generic.go:358] "Generic (PLEG): container finished" podID="cddae74e-b494-48c5-9040-d9c6af603171" containerID="2df00991c33a68879e4223661fbff31481e4787a8f75c4397a7ffab73340eac0" exitCode=137
Apr 16 14:27:04.209145 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.209033 2570 generic.go:358] "Generic (PLEG): container finished" podID="cddae74e-b494-48c5-9040-d9c6af603171" containerID="6e699e3cea9d1e5d7caafe4a8735b5b4e009bdf77df431723aae629ce503f0bf" exitCode=0
Apr 16 14:27:04.209145 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.209073 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" event={"ID":"cddae74e-b494-48c5-9040-d9c6af603171","Type":"ContainerDied","Data":"2df00991c33a68879e4223661fbff31481e4787a8f75c4397a7ffab73340eac0"}
Apr 16 14:27:04.209145 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.209094 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" event={"ID":"cddae74e-b494-48c5-9040-d9c6af603171","Type":"ContainerDied","Data":"6e699e3cea9d1e5d7caafe4a8735b5b4e009bdf77df431723aae629ce503f0bf"}
Apr 16 14:27:04.309166 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.309123 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused"
Apr 16 14:27:04.317617 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.317579 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 16 14:27:04.318741 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.318515 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5"
Apr 16 14:27:04.318741 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.318576 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5"
Apr 16 14:27:04.318741 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.318592 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l"
Apr 16 14:27:04.318741 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.318604 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l"
Apr 16 14:27:04.334390 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.334369 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5"
Apr 16 14:27:04.353446 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.353409 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz_cddae74e-b494-48c5-9040-d9c6af603171/main/0.log"
Apr 16 14:27:04.354290 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.354268 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:27:04.358272 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.358176 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:27:04.408512 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.408467 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-model-cache\") pod \"cddae74e-b494-48c5-9040-d9c6af603171\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") "
Apr 16 14:27:04.408721 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.408574 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-home\") pod \"cddae74e-b494-48c5-9040-d9c6af603171\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") "
Apr 16 14:27:04.408721 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.408640 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-kserve-provision-location\") pod \"cddae74e-b494-48c5-9040-d9c6af603171\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") "
Apr 16 14:27:04.408721 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.408683 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4859b\" (UniqueName: \"kubernetes.io/projected/cddae74e-b494-48c5-9040-d9c6af603171-kube-api-access-4859b\") pod \"cddae74e-b494-48c5-9040-d9c6af603171\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") "
Apr 16 14:27:04.408721 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.408709 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cddae74e-b494-48c5-9040-d9c6af603171-tls-certs\") pod \"cddae74e-b494-48c5-9040-d9c6af603171\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") "
Apr 16 14:27:04.408960 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.408788 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-dshm\") pod \"cddae74e-b494-48c5-9040-d9c6af603171\" (UID: \"cddae74e-b494-48c5-9040-d9c6af603171\") "
Apr 16 14:27:04.409917 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.409889 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-model-cache" (OuterVolumeSpecName: "model-cache") pod "cddae74e-b494-48c5-9040-d9c6af603171" (UID: "cddae74e-b494-48c5-9040-d9c6af603171"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:27:04.412234 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.412196 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-home" (OuterVolumeSpecName: "home") pod "cddae74e-b494-48c5-9040-d9c6af603171" (UID: "cddae74e-b494-48c5-9040-d9c6af603171"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:27:04.412764 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.412735 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cddae74e-b494-48c5-9040-d9c6af603171-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "cddae74e-b494-48c5-9040-d9c6af603171" (UID: "cddae74e-b494-48c5-9040-d9c6af603171"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:27:04.412882 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.412829 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cddae74e-b494-48c5-9040-d9c6af603171-kube-api-access-4859b" (OuterVolumeSpecName: "kube-api-access-4859b") pod "cddae74e-b494-48c5-9040-d9c6af603171" (UID: "cddae74e-b494-48c5-9040-d9c6af603171"). InnerVolumeSpecName "kube-api-access-4859b". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:27:04.414024 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.413989 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-dshm" (OuterVolumeSpecName: "dshm") pod "cddae74e-b494-48c5-9040-d9c6af603171" (UID: "cddae74e-b494-48c5-9040-d9c6af603171"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:27:04.440649 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.440584 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "cddae74e-b494-48c5-9040-d9c6af603171" (UID: "cddae74e-b494-48c5-9040-d9c6af603171"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:27:04.510397 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.510308 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-tls-certs\") pod \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") "
Apr 16 14:27:04.510397 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.510357 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-dshm\") pod \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") "
Apr 16 14:27:04.510397 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.510397 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-home\") pod \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") "
Apr 16 14:27:04.510767 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.510461 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-kserve-provision-location\") pod \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") "
Apr 16 14:27:04.510767 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.510574 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k9jrf\" (UniqueName: \"kubernetes.io/projected/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-kube-api-access-k9jrf\") pod \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") "
Apr 16 14:27:04.510767 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.510620 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-model-cache\") pod \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\" (UID: \"9ee896e4-e422-4e33-a8e5-ab3c278a06e8\") "
Apr 16 14:27:04.510943 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.510887 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-home" (OuterVolumeSpecName: "home") pod "9ee896e4-e422-4e33-a8e5-ab3c278a06e8" (UID: "9ee896e4-e422-4e33-a8e5-ab3c278a06e8"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:27:04.511014 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.510994 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-kserve-provision-location\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:27:04.511067 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.511020 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4859b\" (UniqueName: \"kubernetes.io/projected/cddae74e-b494-48c5-9040-d9c6af603171-kube-api-access-4859b\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:27:04.511067 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.511036 2570 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/cddae74e-b494-48c5-9040-d9c6af603171-tls-certs\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:27:04.511172 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.511123 2570 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-home\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:27:04.511172 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.511138 2570 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-dshm\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:27:04.511172 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.511151 2570 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-model-cache\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:27:04.511172 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.511164 2570 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/cddae74e-b494-48c5-9040-d9c6af603171-home\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:27:04.511397 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.511181 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-model-cache" (OuterVolumeSpecName: "model-cache") pod "9ee896e4-e422-4e33-a8e5-ab3c278a06e8" (UID: "9ee896e4-e422-4e33-a8e5-ab3c278a06e8"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:27:04.512934 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.512903 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9ee896e4-e422-4e33-a8e5-ab3c278a06e8" (UID: "9ee896e4-e422-4e33-a8e5-ab3c278a06e8"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:27:04.513315 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.513293 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-dshm" (OuterVolumeSpecName: "dshm") pod "9ee896e4-e422-4e33-a8e5-ab3c278a06e8" (UID: "9ee896e4-e422-4e33-a8e5-ab3c278a06e8"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:27:04.513574 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.513496 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-kube-api-access-k9jrf" (OuterVolumeSpecName: "kube-api-access-k9jrf") pod "9ee896e4-e422-4e33-a8e5-ab3c278a06e8" (UID: "9ee896e4-e422-4e33-a8e5-ab3c278a06e8"). InnerVolumeSpecName "kube-api-access-k9jrf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:27:04.539162 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.539118 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9ee896e4-e422-4e33-a8e5-ab3c278a06e8" (UID: "9ee896e4-e422-4e33-a8e5-ab3c278a06e8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:27:04.612472 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.612430 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k9jrf\" (UniqueName: \"kubernetes.io/projected/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-kube-api-access-k9jrf\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:27:04.612472 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.612471 2570 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-model-cache\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:27:04.612687 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.612489 2570 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-tls-certs\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:27:04.612687 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.612503 2570 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-dshm\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:27:04.612687 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:04.612517 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9ee896e4-e422-4e33-a8e5-ab3c278a06e8-kserve-provision-location\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:27:05.214443 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:05.214411 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"
Apr 16 14:27:05.214910 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:05.214409 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566" event={"ID":"9ee896e4-e422-4e33-a8e5-ab3c278a06e8","Type":"ContainerDied","Data":"403b4b9169d55094fb8630fac10eb728e84f03cd5f40b0bf3b320f3fc42ddf39"}
Apr 16 14:27:05.214910 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:05.214575 2570 scope.go:117] "RemoveContainer" containerID="dd51d114aeec71208281c559039a611bbf2c5e0380eecc96f45921bed93d7902"
Apr 16 14:27:05.216020 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:05.215977 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz_cddae74e-b494-48c5-9040-d9c6af603171/main/0.log"
Apr 16 14:27:05.216960 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:05.216938 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"
Apr 16 14:27:05.217093 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:05.216952 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz" event={"ID":"cddae74e-b494-48c5-9040-d9c6af603171","Type":"ContainerDied","Data":"cee8356c6960013342a702d7bc2c11d97887b9f2a420da200e4cda30960fa248"}
Apr 16 14:27:05.243163 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:05.243112 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"]
Apr 16 14:27:05.245995 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:05.245966 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-6bxp566"]
Apr 16 14:27:05.246091 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:05.246084 2570 scope.go:117] "RemoveContainer" containerID="c15886af513c92af2a01223d26cae7372d76c845a35ab7c7e5e0dc176f29e1b7"
Apr 16 14:27:05.256882 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:05.256856 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"]
Apr 16 14:27:05.262176 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:05.262141 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5bd6f8cb66gzbtz"]
Apr 16 14:27:05.286127 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:05.286093 2570 scope.go:117] "RemoveContainer" containerID="2df00991c33a68879e4223661fbff31481e4787a8f75c4397a7ffab73340eac0"
Apr 16 14:27:05.315589 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:05.315565 2570 scope.go:117] "RemoveContainer" containerID="5c46ad9b0fab6715debfbabc3987d65d925368065c9f42cef52cf70f70be8649"
Apr 16 14:27:05.353216 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:05.353185 2570 scope.go:117] "RemoveContainer" containerID="6e699e3cea9d1e5d7caafe4a8735b5b4e009bdf77df431723aae629ce503f0bf"
Apr 16 14:27:06.310947 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:06.310918 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" path="/var/lib/kubelet/pods/9ee896e4-e422-4e33-a8e5-ab3c278a06e8/volumes"
Apr 16 14:27:06.311351 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:06.311337 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cddae74e-b494-48c5-9040-d9c6af603171" path="/var/lib/kubelet/pods/cddae74e-b494-48c5-9040-d9c6af603171/volumes"
Apr 16 14:27:07.671052 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.671016 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 14:27:07.671642 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.671622 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="storage-initializer"
Apr 16 14:27:07.671642 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.671643 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="storage-initializer"
Apr 16 14:27:07.671821 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.671656 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="main"
Apr 16 14:27:07.671821 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.671663 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="main"
Apr 16 14:27:07.671821 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.671677 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="llm-d-routing-sidecar"
Apr 16 14:27:07.671821 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.671685 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="llm-d-routing-sidecar"
Apr 16 14:27:07.671821 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.671695 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="storage-initializer"
Apr 16 14:27:07.671821 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.671705 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="storage-initializer"
Apr 16 14:27:07.671821 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.671722 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main"
Apr 16 14:27:07.671821 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.671729 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main"
Apr 16 14:27:07.672219 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.671853 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="llm-d-routing-sidecar"
Apr 16 14:27:07.672219 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.671866 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="9ee896e4-e422-4e33-a8e5-ab3c278a06e8" containerName="main"
Apr 16 14:27:07.672219 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.671880 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="cddae74e-b494-48c5-9040-d9c6af603171" containerName="main"
Apr 16 14:27:07.676736 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.676711 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 14:27:07.680464 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.680435 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-xjkzv\""
Apr 16 14:27:07.680614 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.680476 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\""
Apr 16 14:27:07.690785 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.690754 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 14:27:07.740714 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.740678 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ndfx\" (UniqueName: \"kubernetes.io/projected/f055b746-2dc2-4bde-94af-40023ed29112-kube-api-access-8ndfx\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 14:27:07.740895 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.740734 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 14:27:07.740895 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.740813 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f055b746-2dc2-4bde-94af-40023ed29112-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 14:27:07.740895 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.740865 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 14:27:07.741067 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.741011 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 14:27:07.741067 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.741048 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 14:27:07.842199 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.842159 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8ndfx\" (UniqueName: \"kubernetes.io/projected/f055b746-2dc2-4bde-94af-40023ed29112-kube-api-access-8ndfx\") pod
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:27:07.842199 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.842214 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:27:07.842454 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.842247 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f055b746-2dc2-4bde-94af-40023ed29112-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:27:07.842454 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.842289 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:27:07.842454 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.842368 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:27:07.842454 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.842398 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:27:07.842698 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.842673 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:27:07.842754 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.842701 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:27:07.842809 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.842752 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:27:07.844954 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.844918 2570 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f055b746-2dc2-4bde-94af-40023ed29112-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:27:07.844954 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.844949 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:27:07.850363 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.850331 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ndfx\" (UniqueName: \"kubernetes.io/projected/f055b746-2dc2-4bde-94af-40023ed29112-kube-api-access-8ndfx\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:27:07.988762 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:07.988669 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:27:08.139895 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:08.139861 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 16 14:27:08.141912 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:27:08.141876 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf055b746_2dc2_4bde_94af_40023ed29112.slice/crio-1a6bfff83355fe77dd8629f8b7352a0b53276f064586b64c1a02a010a8d13948 WatchSource:0}: Error finding container 1a6bfff83355fe77dd8629f8b7352a0b53276f064586b64c1a02a010a8d13948: Status 404 returned error can't find the container with id 1a6bfff83355fe77dd8629f8b7352a0b53276f064586b64c1a02a010a8d13948 Apr 16 14:27:08.232422 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:08.232377 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"f055b746-2dc2-4bde-94af-40023ed29112","Type":"ContainerStarted","Data":"46958227a546f9484c6f804b9ee65df09023aef2e383f185cb5b204c6098c94e"} Apr 16 14:27:08.232422 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:08.232424 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"f055b746-2dc2-4bde-94af-40023ed29112","Type":"ContainerStarted","Data":"1a6bfff83355fe77dd8629f8b7352a0b53276f064586b64c1a02a010a8d13948"} Apr 16 14:27:13.262037 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:13.261999 2570 generic.go:358] "Generic (PLEG): container finished" podID="f055b746-2dc2-4bde-94af-40023ed29112" containerID="46958227a546f9484c6f804b9ee65df09023aef2e383f185cb5b204c6098c94e" exitCode=0 Apr 16 14:27:13.262516 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:13.262059 2570 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"f055b746-2dc2-4bde-94af-40023ed29112","Type":"ContainerDied","Data":"46958227a546f9484c6f804b9ee65df09023aef2e383f185cb5b204c6098c94e"} Apr 16 14:27:14.269544 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:14.269479 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"f055b746-2dc2-4bde-94af-40023ed29112","Type":"ContainerStarted","Data":"6a4cb93bc66602fec0b187437ec9c54dba0188044c5ca4b92804ce9ea45ec831"} Apr 16 14:27:14.289895 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:14.289832 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=7.289817213 podStartE2EDuration="7.289817213s" podCreationTimestamp="2026-04-16 14:27:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:27:14.28877822 +0000 UTC m=+1660.494421384" watchObservedRunningTime="2026-04-16 14:27:14.289817213 +0000 UTC m=+1660.495460377" Apr 16 14:27:14.307732 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:14.307692 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused" Apr 16 14:27:14.316054 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:14.316024 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused" Apr 16 14:27:17.989285 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:17.989231 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:27:17.990698 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:17.990669 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f055b746-2dc2-4bde-94af-40023ed29112" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused" Apr 16 14:27:24.307824 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:24.307769 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused" Apr 16 14:27:24.316117 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:24.316079 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused" Apr 16 14:27:26.272051 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.272019 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c_517ef72a-d8c6-4729-b4fc-037c8b19a357/main/0.log" Apr 16 14:27:26.272507 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.272486 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:27:26.319138 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.319109 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c_517ef72a-d8c6-4729-b4fc-037c8b19a357/main/0.log" Apr 16 14:27:26.319506 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.319473 2570 generic.go:358] "Generic (PLEG): container finished" podID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerID="e6d1a8dcc23c286893c349d14d993a1fdbce5305640aede3bcb6b9778f5f188e" exitCode=137 Apr 16 14:27:26.319677 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.319563 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" event={"ID":"517ef72a-d8c6-4729-b4fc-037c8b19a357","Type":"ContainerDied","Data":"e6d1a8dcc23c286893c349d14d993a1fdbce5305640aede3bcb6b9778f5f188e"} Apr 16 14:27:26.319677 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.319588 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" Apr 16 14:27:26.319677 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.319606 2570 scope.go:117] "RemoveContainer" containerID="e6d1a8dcc23c286893c349d14d993a1fdbce5305640aede3bcb6b9778f5f188e" Apr 16 14:27:26.319868 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.319594 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c" event={"ID":"517ef72a-d8c6-4729-b4fc-037c8b19a357","Type":"ContainerDied","Data":"2bd7c3aa25c6cd4da19116bc8b45ecc97fd49859a780d5ca982aaaeeadbb0270"} Apr 16 14:27:26.347044 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.347013 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/517ef72a-d8c6-4729-b4fc-037c8b19a357-tls-certs\") pod \"517ef72a-d8c6-4729-b4fc-037c8b19a357\" (UID: \"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " Apr 16 14:27:26.347217 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.347069 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-kserve-provision-location\") pod \"517ef72a-d8c6-4729-b4fc-037c8b19a357\" (UID: \"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " Apr 16 14:27:26.347217 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.347106 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-dshm\") pod \"517ef72a-d8c6-4729-b4fc-037c8b19a357\" (UID: \"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " Apr 16 14:27:26.347627 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.347599 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpfr5\" (UniqueName: 
\"kubernetes.io/projected/517ef72a-d8c6-4729-b4fc-037c8b19a357-kube-api-access-zpfr5\") pod \"517ef72a-d8c6-4729-b4fc-037c8b19a357\" (UID: \"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " Apr 16 14:27:26.348066 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.348039 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-home\") pod \"517ef72a-d8c6-4729-b4fc-037c8b19a357\" (UID: \"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " Apr 16 14:27:26.348182 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.348084 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-model-cache\") pod \"517ef72a-d8c6-4729-b4fc-037c8b19a357\" (UID: \"517ef72a-d8c6-4729-b4fc-037c8b19a357\") " Apr 16 14:27:26.348673 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.348627 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-home" (OuterVolumeSpecName: "home") pod "517ef72a-d8c6-4729-b4fc-037c8b19a357" (UID: "517ef72a-d8c6-4729-b4fc-037c8b19a357"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:27:26.348868 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.348734 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-model-cache" (OuterVolumeSpecName: "model-cache") pod "517ef72a-d8c6-4729-b4fc-037c8b19a357" (UID: "517ef72a-d8c6-4729-b4fc-037c8b19a357"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:27:26.351499 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.350586 2570 scope.go:117] "RemoveContainer" containerID="8352b473e6ccd1a4355b876e3586b348e4cc0ba83aa221fa57a01658f4f29ec1" Apr 16 14:27:26.351862 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.351801 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-dshm" (OuterVolumeSpecName: "dshm") pod "517ef72a-d8c6-4729-b4fc-037c8b19a357" (UID: "517ef72a-d8c6-4729-b4fc-037c8b19a357"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:27:26.351862 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.351802 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/517ef72a-d8c6-4729-b4fc-037c8b19a357-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "517ef72a-d8c6-4729-b4fc-037c8b19a357" (UID: "517ef72a-d8c6-4729-b4fc-037c8b19a357"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:27:26.352431 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.352189 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/517ef72a-d8c6-4729-b4fc-037c8b19a357-kube-api-access-zpfr5" (OuterVolumeSpecName: "kube-api-access-zpfr5") pod "517ef72a-d8c6-4729-b4fc-037c8b19a357" (UID: "517ef72a-d8c6-4729-b4fc-037c8b19a357"). InnerVolumeSpecName "kube-api-access-zpfr5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:27:26.415860 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.415828 2570 scope.go:117] "RemoveContainer" containerID="e6d1a8dcc23c286893c349d14d993a1fdbce5305640aede3bcb6b9778f5f188e" Apr 16 14:27:26.416319 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:27:26.416291 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6d1a8dcc23c286893c349d14d993a1fdbce5305640aede3bcb6b9778f5f188e\": container with ID starting with e6d1a8dcc23c286893c349d14d993a1fdbce5305640aede3bcb6b9778f5f188e not found: ID does not exist" containerID="e6d1a8dcc23c286893c349d14d993a1fdbce5305640aede3bcb6b9778f5f188e" Apr 16 14:27:26.416423 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.416333 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6d1a8dcc23c286893c349d14d993a1fdbce5305640aede3bcb6b9778f5f188e"} err="failed to get container status \"e6d1a8dcc23c286893c349d14d993a1fdbce5305640aede3bcb6b9778f5f188e\": rpc error: code = NotFound desc = could not find container \"e6d1a8dcc23c286893c349d14d993a1fdbce5305640aede3bcb6b9778f5f188e\": container with ID starting with e6d1a8dcc23c286893c349d14d993a1fdbce5305640aede3bcb6b9778f5f188e not found: ID does not exist" Apr 16 14:27:26.416423 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.416363 2570 scope.go:117] "RemoveContainer" containerID="8352b473e6ccd1a4355b876e3586b348e4cc0ba83aa221fa57a01658f4f29ec1" Apr 16 14:27:26.416694 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:27:26.416670 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8352b473e6ccd1a4355b876e3586b348e4cc0ba83aa221fa57a01658f4f29ec1\": container with ID starting with 8352b473e6ccd1a4355b876e3586b348e4cc0ba83aa221fa57a01658f4f29ec1 not found: ID does not exist" 
containerID="8352b473e6ccd1a4355b876e3586b348e4cc0ba83aa221fa57a01658f4f29ec1" Apr 16 14:27:26.416771 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.416702 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8352b473e6ccd1a4355b876e3586b348e4cc0ba83aa221fa57a01658f4f29ec1"} err="failed to get container status \"8352b473e6ccd1a4355b876e3586b348e4cc0ba83aa221fa57a01658f4f29ec1\": rpc error: code = NotFound desc = could not find container \"8352b473e6ccd1a4355b876e3586b348e4cc0ba83aa221fa57a01658f4f29ec1\": container with ID starting with 8352b473e6ccd1a4355b876e3586b348e4cc0ba83aa221fa57a01658f4f29ec1 not found: ID does not exist" Apr 16 14:27:26.430021 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.429971 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "517ef72a-d8c6-4729-b4fc-037c8b19a357" (UID: "517ef72a-d8c6-4729-b4fc-037c8b19a357"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:27:26.449440 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.449398 2570 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-home\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:27:26.449440 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.449428 2570 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-model-cache\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:27:26.449440 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.449439 2570 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/517ef72a-d8c6-4729-b4fc-037c8b19a357-tls-certs\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:27:26.449732 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.449448 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-kserve-provision-location\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:27:26.449732 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.449457 2570 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/517ef72a-d8c6-4729-b4fc-037c8b19a357-dshm\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:27:26.449732 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.449466 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zpfr5\" (UniqueName: \"kubernetes.io/projected/517ef72a-d8c6-4729-b4fc-037c8b19a357-kube-api-access-zpfr5\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:27:26.646582 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.646548 2570 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c"] Apr 16 14:27:26.650265 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:26.650225 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-866fd6f965hc74c"] Apr 16 14:27:27.989595 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:27.989548 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f055b746-2dc2-4bde-94af-40023ed29112" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused" Apr 16 14:27:28.311863 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:28.311772 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" path="/var/lib/kubelet/pods/517ef72a-d8c6-4729-b4fc-037c8b19a357/volumes" Apr 16 14:27:34.308466 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:34.308036 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused" Apr 16 14:27:34.316204 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:34.316160 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused" Apr 16 14:27:37.989250 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:37.989211 2570 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 16 14:27:37.989736 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:37.989605 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f055b746-2dc2-4bde-94af-40023ed29112" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused" Apr 16 14:27:44.308552 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:44.308492 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused" Apr 16 14:27:44.315402 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:44.315357 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused" Apr 16 14:27:47.989288 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:47.989164 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f055b746-2dc2-4bde-94af-40023ed29112" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused" Apr 16 14:27:54.307914 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:54.307871 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" 
podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused"
Apr 16 14:27:54.315576 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:54.315520 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 16 14:27:57.989399 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:27:57.989349 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f055b746-2dc2-4bde-94af-40023ed29112" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 16 14:28:04.307802 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:28:04.307740 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused"
Apr 16 14:28:04.315428 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:28:04.315390 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 16 14:28:07.990022 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:28:07.989979 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f055b746-2dc2-4bde-94af-40023ed29112" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 16 14:28:14.307891 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:28:14.307839 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused"
Apr 16 14:28:14.316389 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:28:14.316357 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 16 14:28:17.989086 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:28:17.989042 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f055b746-2dc2-4bde-94af-40023ed29112" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 16 14:28:24.307683 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:28:24.307612 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused"
Apr 16 14:28:24.316293 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:28:24.316261 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 16 14:28:27.989769 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:28:27.989726 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f055b746-2dc2-4bde-94af-40023ed29112" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 16 14:28:34.307873 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:28:34.307812 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused"
Apr 16 14:28:34.315672 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:28:34.315639 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 16 14:28:37.989664 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:28:37.989623 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f055b746-2dc2-4bde-94af-40023ed29112" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 16 14:28:44.307634 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:28:44.307591 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused"
Apr 16 14:28:44.315942 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:28:44.315899 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 16 14:28:47.989207 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:28:47.989162 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f055b746-2dc2-4bde-94af-40023ed29112" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 16 14:28:54.307751 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:28:54.307707 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused"
Apr 16 14:28:54.316124 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:28:54.316098 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 16 14:28:57.989346 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:28:57.989297 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f055b746-2dc2-4bde-94af-40023ed29112" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 16 14:29:04.307992 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:29:04.307942 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused"
Apr 16 14:29:04.315547 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:29:04.315499 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 16 14:29:07.989324 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:29:07.989280 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f055b746-2dc2-4bde-94af-40023ed29112" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 16 14:29:14.308314 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:29:14.308252 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused"
Apr 16 14:29:14.315386 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:29:14.315349 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 16 14:29:17.989657 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:29:17.989570 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f055b746-2dc2-4bde-94af-40023ed29112" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 16 14:29:24.307823 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:29:24.307774 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused"
Apr 16 14:29:24.315328 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:29:24.315296 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 16 14:29:27.989815 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:29:27.989773 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f055b746-2dc2-4bde-94af-40023ed29112" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 16 14:29:34.307904 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:29:34.307850 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused"
Apr 16 14:29:34.315321 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:29:34.315292 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 16 14:29:37.989354 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:29:37.989310 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f055b746-2dc2-4bde-94af-40023ed29112" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 16 14:29:44.313776 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:29:44.313724 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.41:8001/health\": dial tcp 10.134.0.41:8001: connect: connection refused"
Apr 16 14:29:44.315889 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:29:44.315854 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 16 14:29:47.989746 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:29:47.989696 2570 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f055b746-2dc2-4bde-94af-40023ed29112" containerName="main" probeResult="failure" output="Get \"https://10.134.0.43:8000/health\": dial tcp 10.134.0.43:8000: connect: connection refused"
Apr 16 14:29:54.322285 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:29:54.322252 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5"
Apr 16 14:29:54.325935 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:29:54.325909 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l"
Apr 16 14:29:54.333728 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:29:54.333703 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l"
Apr 16 14:29:54.339065 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:29:54.339038 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5"
Apr 16 14:29:57.998308 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:29:57.998279 2570 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 14:29:58.005643 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:29:58.005617 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 14:30:08.739728 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:08.739693 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 14:30:08.740103 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:08.739984 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="f055b746-2dc2-4bde-94af-40023ed29112" containerName="main" containerID="cri-o://6a4cb93bc66602fec0b187437ec9c54dba0188044c5ca4b92804ce9ea45ec831" gracePeriod=30
Apr 16 14:30:09.957894 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:09.957854 2570 generic.go:358] "Generic (PLEG): container finished" podID="f055b746-2dc2-4bde-94af-40023ed29112" containerID="6a4cb93bc66602fec0b187437ec9c54dba0188044c5ca4b92804ce9ea45ec831" exitCode=0
Apr 16 14:30:09.958275 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:09.957925 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"f055b746-2dc2-4bde-94af-40023ed29112","Type":"ContainerDied","Data":"6a4cb93bc66602fec0b187437ec9c54dba0188044c5ca4b92804ce9ea45ec831"}
Apr 16 14:30:10.205115 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.205082 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 14:30:10.334171 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.334101 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-dshm\") pod \"f055b746-2dc2-4bde-94af-40023ed29112\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") "
Apr 16 14:30:10.334171 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.334160 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ndfx\" (UniqueName: \"kubernetes.io/projected/f055b746-2dc2-4bde-94af-40023ed29112-kube-api-access-8ndfx\") pod \"f055b746-2dc2-4bde-94af-40023ed29112\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") "
Apr 16 14:30:10.334363 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.334183 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-kserve-provision-location\") pod \"f055b746-2dc2-4bde-94af-40023ed29112\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") "
Apr 16 14:30:10.334363 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.334220 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f055b746-2dc2-4bde-94af-40023ed29112-tls-certs\") pod \"f055b746-2dc2-4bde-94af-40023ed29112\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") "
Apr 16 14:30:10.334363 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.334269 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-home\") pod \"f055b746-2dc2-4bde-94af-40023ed29112\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") "
Apr 16 14:30:10.334363 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.334311 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-model-cache\") pod \"f055b746-2dc2-4bde-94af-40023ed29112\" (UID: \"f055b746-2dc2-4bde-94af-40023ed29112\") "
Apr 16 14:30:10.334657 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.334625 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-model-cache" (OuterVolumeSpecName: "model-cache") pod "f055b746-2dc2-4bde-94af-40023ed29112" (UID: "f055b746-2dc2-4bde-94af-40023ed29112"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:30:10.334720 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.334646 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-home" (OuterVolumeSpecName: "home") pod "f055b746-2dc2-4bde-94af-40023ed29112" (UID: "f055b746-2dc2-4bde-94af-40023ed29112"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:30:10.334942 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.334925 2570 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-home\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:30:10.334983 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.334951 2570 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-model-cache\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:30:10.336371 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.336349 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f055b746-2dc2-4bde-94af-40023ed29112-kube-api-access-8ndfx" (OuterVolumeSpecName: "kube-api-access-8ndfx") pod "f055b746-2dc2-4bde-94af-40023ed29112" (UID: "f055b746-2dc2-4bde-94af-40023ed29112"). InnerVolumeSpecName "kube-api-access-8ndfx". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:30:10.336829 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.336806 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-dshm" (OuterVolumeSpecName: "dshm") pod "f055b746-2dc2-4bde-94af-40023ed29112" (UID: "f055b746-2dc2-4bde-94af-40023ed29112"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:30:10.336913 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.336886 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f055b746-2dc2-4bde-94af-40023ed29112-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f055b746-2dc2-4bde-94af-40023ed29112" (UID: "f055b746-2dc2-4bde-94af-40023ed29112"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:30:10.413897 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.413850 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f055b746-2dc2-4bde-94af-40023ed29112" (UID: "f055b746-2dc2-4bde-94af-40023ed29112"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:30:10.436037 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.436004 2570 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-dshm\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:30:10.436037 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.436033 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8ndfx\" (UniqueName: \"kubernetes.io/projected/f055b746-2dc2-4bde-94af-40023ed29112-kube-api-access-8ndfx\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:30:10.436215 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.436048 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f055b746-2dc2-4bde-94af-40023ed29112-kserve-provision-location\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:30:10.436215 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.436061 2570 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f055b746-2dc2-4bde-94af-40023ed29112-tls-certs\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:30:10.963987 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.963956 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 16 14:30:10.964379 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.963956 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"f055b746-2dc2-4bde-94af-40023ed29112","Type":"ContainerDied","Data":"1a6bfff83355fe77dd8629f8b7352a0b53276f064586b64c1a02a010a8d13948"}
Apr 16 14:30:10.964379 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.964083 2570 scope.go:117] "RemoveContainer" containerID="6a4cb93bc66602fec0b187437ec9c54dba0188044c5ca4b92804ce9ea45ec831"
Apr 16 14:30:10.986077 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.986055 2570 scope.go:117] "RemoveContainer" containerID="46958227a546f9484c6f804b9ee65df09023aef2e383f185cb5b204c6098c94e"
Apr 16 14:30:10.993592 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.993497 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 14:30:10.999737 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:10.999707 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 16 14:30:12.309236 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:12.309206 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f055b746-2dc2-4bde-94af-40023ed29112" path="/var/lib/kubelet/pods/f055b746-2dc2-4bde-94af-40023ed29112/volumes"
Apr 16 14:30:31.194128 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:31.194075 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5"]
Apr 16 14:30:31.194618 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:31.194557 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="main" containerID="cri-o://bacfaed84451c05d067bf574f595f51ff50b825d6f20bb97ec2e258c029276e6" gracePeriod=30
Apr 16 14:30:31.196781 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:31.196750 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l"]
Apr 16 14:30:31.197345 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:30:31.197296 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="main" containerID="cri-o://9c994075b22b70af73ff84f4d08333a4295bceb508ef63db003b8134d6268dd2" gracePeriod=30
Apr 16 14:31:01.194634 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.194513 2570 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="llm-d-routing-sidecar" containerID="cri-o://fe805b4e21947f6173981c2d4fa727a41d75d77bdbc298adfabe75cec43e4649" gracePeriod=2
Apr 16 14:31:01.505660 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.505637 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-586598877f-mdss5_c24e0dd2-e96f-4029-9a40-9c9ba988128a/main/0.log"
Apr 16 14:31:01.506435 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.506415 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5"
Apr 16 14:31:01.509421 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.509401 2570 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l"
Apr 16 14:31:01.552699 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.552676 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c24e0dd2-e96f-4029-9a40-9c9ba988128a-tls-certs\") pod \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") "
Apr 16 14:31:01.552827 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.552713 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-dshm\") pod \"7d1606b0-5579-4e9f-aac0-c69e69108b32\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") "
Apr 16 14:31:01.552827 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.552742 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dskjs\" (UniqueName: \"kubernetes.io/projected/c24e0dd2-e96f-4029-9a40-9c9ba988128a-kube-api-access-dskjs\") pod \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") "
Apr 16 14:31:01.552827 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.552803 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-kserve-provision-location\") pod \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") "
Apr 16 14:31:01.552994 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.552829 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-dshm\") pod \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") "
Apr 16 14:31:01.552994 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.552858 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-home\") pod \"7d1606b0-5579-4e9f-aac0-c69e69108b32\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") "
Apr 16 14:31:01.552994 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.552902 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-model-cache\") pod \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") "
Apr 16 14:31:01.552994 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.552937 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-kserve-provision-location\") pod \"7d1606b0-5579-4e9f-aac0-c69e69108b32\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") "
Apr 16 14:31:01.552994 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.552960 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-model-cache\") pod \"7d1606b0-5579-4e9f-aac0-c69e69108b32\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") "
Apr 16 14:31:01.553259 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.553009 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7d1606b0-5579-4e9f-aac0-c69e69108b32-tls-certs\") pod \"7d1606b0-5579-4e9f-aac0-c69e69108b32\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") "
Apr 16 14:31:01.553259 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.553032 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-home\") pod \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\" (UID: \"c24e0dd2-e96f-4029-9a40-9c9ba988128a\") "
Apr 16 14:31:01.553259 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.553072 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhjxm\" (UniqueName: \"kubernetes.io/projected/7d1606b0-5579-4e9f-aac0-c69e69108b32-kube-api-access-jhjxm\") pod \"7d1606b0-5579-4e9f-aac0-c69e69108b32\" (UID: \"7d1606b0-5579-4e9f-aac0-c69e69108b32\") "
Apr 16 14:31:01.553404 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.553267 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-model-cache" (OuterVolumeSpecName: "model-cache") pod "c24e0dd2-e96f-4029-9a40-9c9ba988128a" (UID: "c24e0dd2-e96f-4029-9a40-9c9ba988128a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:31:01.553459 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.553426 2570 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-model-cache\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:31:01.553459 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.553432 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-home" (OuterVolumeSpecName: "home") pod "7d1606b0-5579-4e9f-aac0-c69e69108b32" (UID: "7d1606b0-5579-4e9f-aac0-c69e69108b32"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:31:01.555645 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.554361 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-model-cache" (OuterVolumeSpecName: "model-cache") pod "7d1606b0-5579-4e9f-aac0-c69e69108b32" (UID: "7d1606b0-5579-4e9f-aac0-c69e69108b32"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:31:01.556011 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.555977 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-home" (OuterVolumeSpecName: "home") pod "c24e0dd2-e96f-4029-9a40-9c9ba988128a" (UID: "c24e0dd2-e96f-4029-9a40-9c9ba988128a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:31:01.556109 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.555985 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c24e0dd2-e96f-4029-9a40-9c9ba988128a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c24e0dd2-e96f-4029-9a40-9c9ba988128a" (UID: "c24e0dd2-e96f-4029-9a40-9c9ba988128a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:31:01.556109 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.556061 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c24e0dd2-e96f-4029-9a40-9c9ba988128a-kube-api-access-dskjs" (OuterVolumeSpecName: "kube-api-access-dskjs") pod "c24e0dd2-e96f-4029-9a40-9c9ba988128a" (UID: "c24e0dd2-e96f-4029-9a40-9c9ba988128a"). InnerVolumeSpecName "kube-api-access-dskjs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:31:01.556229 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.556157 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d1606b0-5579-4e9f-aac0-c69e69108b32-kube-api-access-jhjxm" (OuterVolumeSpecName: "kube-api-access-jhjxm") pod "7d1606b0-5579-4e9f-aac0-c69e69108b32" (UID: "7d1606b0-5579-4e9f-aac0-c69e69108b32"). InnerVolumeSpecName "kube-api-access-jhjxm". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 16 14:31:01.557578 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.557552 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-dshm" (OuterVolumeSpecName: "dshm") pod "7d1606b0-5579-4e9f-aac0-c69e69108b32" (UID: "7d1606b0-5579-4e9f-aac0-c69e69108b32"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:31:01.557578 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.557562 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-dshm" (OuterVolumeSpecName: "dshm") pod "c24e0dd2-e96f-4029-9a40-9c9ba988128a" (UID: "c24e0dd2-e96f-4029-9a40-9c9ba988128a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:31:01.558224 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.558201 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d1606b0-5579-4e9f-aac0-c69e69108b32-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "7d1606b0-5579-4e9f-aac0-c69e69108b32" (UID: "7d1606b0-5579-4e9f-aac0-c69e69108b32"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 16 14:31:01.617797 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.617763 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c24e0dd2-e96f-4029-9a40-9c9ba988128a" (UID: "c24e0dd2-e96f-4029-9a40-9c9ba988128a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:31:01.621804 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.621776 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7d1606b0-5579-4e9f-aac0-c69e69108b32" (UID: "7d1606b0-5579-4e9f-aac0-c69e69108b32"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 16 14:31:01.654799 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.654774 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dskjs\" (UniqueName: \"kubernetes.io/projected/c24e0dd2-e96f-4029-9a40-9c9ba988128a-kube-api-access-dskjs\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:31:01.654799 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.654798 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-kserve-provision-location\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16 14:31:01.654931 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.654808 2570 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-dshm\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\""
Apr 16
14:31:01.654931 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.654818 2570 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-home\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:31:01.654931 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.654827 2570 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-kserve-provision-location\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:31:01.654931 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.654836 2570 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-model-cache\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:31:01.654931 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.654844 2570 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7d1606b0-5579-4e9f-aac0-c69e69108b32-tls-certs\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:31:01.654931 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.654852 2570 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c24e0dd2-e96f-4029-9a40-9c9ba988128a-home\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:31:01.654931 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.654859 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jhjxm\" (UniqueName: \"kubernetes.io/projected/7d1606b0-5579-4e9f-aac0-c69e69108b32-kube-api-access-jhjxm\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:31:01.654931 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.654867 2570 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" 
(UniqueName: \"kubernetes.io/secret/c24e0dd2-e96f-4029-9a40-9c9ba988128a-tls-certs\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:31:01.654931 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:01.654874 2570 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7d1606b0-5579-4e9f-aac0-c69e69108b32-dshm\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:31:02.139402 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.139377 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-586598877f-mdss5_c24e0dd2-e96f-4029-9a40-9c9ba988128a/main/0.log" Apr 16 14:31:02.142700 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.142668 2570 generic.go:358] "Generic (PLEG): container finished" podID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerID="bacfaed84451c05d067bf574f595f51ff50b825d6f20bb97ec2e258c029276e6" exitCode=137 Apr 16 14:31:02.142700 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.142698 2570 generic.go:358] "Generic (PLEG): container finished" podID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerID="fe805b4e21947f6173981c2d4fa727a41d75d77bdbc298adfabe75cec43e4649" exitCode=0 Apr 16 14:31:02.142887 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.142758 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" Apr 16 14:31:02.142887 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.142756 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" event={"ID":"c24e0dd2-e96f-4029-9a40-9c9ba988128a","Type":"ContainerDied","Data":"bacfaed84451c05d067bf574f595f51ff50b825d6f20bb97ec2e258c029276e6"} Apr 16 14:31:02.142887 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.142800 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" event={"ID":"c24e0dd2-e96f-4029-9a40-9c9ba988128a","Type":"ContainerDied","Data":"fe805b4e21947f6173981c2d4fa727a41d75d77bdbc298adfabe75cec43e4649"} Apr 16 14:31:02.142887 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.142813 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5" event={"ID":"c24e0dd2-e96f-4029-9a40-9c9ba988128a","Type":"ContainerDied","Data":"161d419164ac78efc6470ad53b16ae2bfe6375f7fc5febecba8bce7105bfcfc7"} Apr 16 14:31:02.142887 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.142829 2570 scope.go:117] "RemoveContainer" containerID="bacfaed84451c05d067bf574f595f51ff50b825d6f20bb97ec2e258c029276e6" Apr 16 14:31:02.144509 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.144483 2570 generic.go:358] "Generic (PLEG): container finished" podID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerID="9c994075b22b70af73ff84f4d08333a4295bceb508ef63db003b8134d6268dd2" exitCode=137 Apr 16 14:31:02.144643 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.144583 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" Apr 16 14:31:02.144643 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.144576 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" event={"ID":"7d1606b0-5579-4e9f-aac0-c69e69108b32","Type":"ContainerDied","Data":"9c994075b22b70af73ff84f4d08333a4295bceb508ef63db003b8134d6268dd2"} Apr 16 14:31:02.144771 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.144752 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l" event={"ID":"7d1606b0-5579-4e9f-aac0-c69e69108b32","Type":"ContainerDied","Data":"a137ea943e9f2732b3dcd22a96a487b64f9f635a965004c372cf06dc1fa40568"} Apr 16 14:31:02.165879 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.165857 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5"] Apr 16 14:31:02.167389 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.167365 2570 scope.go:117] "RemoveContainer" containerID="35cfccdaf0e50fdc1749118ec5aaf2a847caf7b574706589be4484ebda3238a7" Apr 16 14:31:02.170334 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.170315 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-586598877f-mdss5"] Apr 16 14:31:02.181252 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.181233 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l"] Apr 16 14:31:02.184731 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.184710 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-74777c5d4-d768l"] Apr 16 14:31:02.227361 ip-10-0-133-133 kubenswrapper[2570]: I0416 
14:31:02.227260 2570 scope.go:117] "RemoveContainer" containerID="fe805b4e21947f6173981c2d4fa727a41d75d77bdbc298adfabe75cec43e4649" Apr 16 14:31:02.234704 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.234686 2570 scope.go:117] "RemoveContainer" containerID="bacfaed84451c05d067bf574f595f51ff50b825d6f20bb97ec2e258c029276e6" Apr 16 14:31:02.234964 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:31:02.234945 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bacfaed84451c05d067bf574f595f51ff50b825d6f20bb97ec2e258c029276e6\": container with ID starting with bacfaed84451c05d067bf574f595f51ff50b825d6f20bb97ec2e258c029276e6 not found: ID does not exist" containerID="bacfaed84451c05d067bf574f595f51ff50b825d6f20bb97ec2e258c029276e6" Apr 16 14:31:02.235025 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.234971 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bacfaed84451c05d067bf574f595f51ff50b825d6f20bb97ec2e258c029276e6"} err="failed to get container status \"bacfaed84451c05d067bf574f595f51ff50b825d6f20bb97ec2e258c029276e6\": rpc error: code = NotFound desc = could not find container \"bacfaed84451c05d067bf574f595f51ff50b825d6f20bb97ec2e258c029276e6\": container with ID starting with bacfaed84451c05d067bf574f595f51ff50b825d6f20bb97ec2e258c029276e6 not found: ID does not exist" Apr 16 14:31:02.235025 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.234989 2570 scope.go:117] "RemoveContainer" containerID="35cfccdaf0e50fdc1749118ec5aaf2a847caf7b574706589be4484ebda3238a7" Apr 16 14:31:02.235240 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:31:02.235219 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35cfccdaf0e50fdc1749118ec5aaf2a847caf7b574706589be4484ebda3238a7\": container with ID starting with 35cfccdaf0e50fdc1749118ec5aaf2a847caf7b574706589be4484ebda3238a7 
not found: ID does not exist" containerID="35cfccdaf0e50fdc1749118ec5aaf2a847caf7b574706589be4484ebda3238a7" Apr 16 14:31:02.235281 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.235246 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35cfccdaf0e50fdc1749118ec5aaf2a847caf7b574706589be4484ebda3238a7"} err="failed to get container status \"35cfccdaf0e50fdc1749118ec5aaf2a847caf7b574706589be4484ebda3238a7\": rpc error: code = NotFound desc = could not find container \"35cfccdaf0e50fdc1749118ec5aaf2a847caf7b574706589be4484ebda3238a7\": container with ID starting with 35cfccdaf0e50fdc1749118ec5aaf2a847caf7b574706589be4484ebda3238a7 not found: ID does not exist" Apr 16 14:31:02.235281 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.235263 2570 scope.go:117] "RemoveContainer" containerID="fe805b4e21947f6173981c2d4fa727a41d75d77bdbc298adfabe75cec43e4649" Apr 16 14:31:02.235482 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:31:02.235467 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe805b4e21947f6173981c2d4fa727a41d75d77bdbc298adfabe75cec43e4649\": container with ID starting with fe805b4e21947f6173981c2d4fa727a41d75d77bdbc298adfabe75cec43e4649 not found: ID does not exist" containerID="fe805b4e21947f6173981c2d4fa727a41d75d77bdbc298adfabe75cec43e4649" Apr 16 14:31:02.235523 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.235487 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe805b4e21947f6173981c2d4fa727a41d75d77bdbc298adfabe75cec43e4649"} err="failed to get container status \"fe805b4e21947f6173981c2d4fa727a41d75d77bdbc298adfabe75cec43e4649\": rpc error: code = NotFound desc = could not find container \"fe805b4e21947f6173981c2d4fa727a41d75d77bdbc298adfabe75cec43e4649\": container with ID starting with fe805b4e21947f6173981c2d4fa727a41d75d77bdbc298adfabe75cec43e4649 not found: ID 
does not exist" Apr 16 14:31:02.235523 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.235508 2570 scope.go:117] "RemoveContainer" containerID="bacfaed84451c05d067bf574f595f51ff50b825d6f20bb97ec2e258c029276e6" Apr 16 14:31:02.235727 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.235702 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bacfaed84451c05d067bf574f595f51ff50b825d6f20bb97ec2e258c029276e6"} err="failed to get container status \"bacfaed84451c05d067bf574f595f51ff50b825d6f20bb97ec2e258c029276e6\": rpc error: code = NotFound desc = could not find container \"bacfaed84451c05d067bf574f595f51ff50b825d6f20bb97ec2e258c029276e6\": container with ID starting with bacfaed84451c05d067bf574f595f51ff50b825d6f20bb97ec2e258c029276e6 not found: ID does not exist" Apr 16 14:31:02.235825 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.235727 2570 scope.go:117] "RemoveContainer" containerID="35cfccdaf0e50fdc1749118ec5aaf2a847caf7b574706589be4484ebda3238a7" Apr 16 14:31:02.235999 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.235980 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35cfccdaf0e50fdc1749118ec5aaf2a847caf7b574706589be4484ebda3238a7"} err="failed to get container status \"35cfccdaf0e50fdc1749118ec5aaf2a847caf7b574706589be4484ebda3238a7\": rpc error: code = NotFound desc = could not find container \"35cfccdaf0e50fdc1749118ec5aaf2a847caf7b574706589be4484ebda3238a7\": container with ID starting with 35cfccdaf0e50fdc1749118ec5aaf2a847caf7b574706589be4484ebda3238a7 not found: ID does not exist" Apr 16 14:31:02.236052 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.236001 2570 scope.go:117] "RemoveContainer" containerID="fe805b4e21947f6173981c2d4fa727a41d75d77bdbc298adfabe75cec43e4649" Apr 16 14:31:02.236237 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.236217 2570 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fe805b4e21947f6173981c2d4fa727a41d75d77bdbc298adfabe75cec43e4649"} err="failed to get container status \"fe805b4e21947f6173981c2d4fa727a41d75d77bdbc298adfabe75cec43e4649\": rpc error: code = NotFound desc = could not find container \"fe805b4e21947f6173981c2d4fa727a41d75d77bdbc298adfabe75cec43e4649\": container with ID starting with fe805b4e21947f6173981c2d4fa727a41d75d77bdbc298adfabe75cec43e4649 not found: ID does not exist" Apr 16 14:31:02.236283 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.236239 2570 scope.go:117] "RemoveContainer" containerID="9c994075b22b70af73ff84f4d08333a4295bceb508ef63db003b8134d6268dd2" Apr 16 14:31:02.254566 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.254544 2570 scope.go:117] "RemoveContainer" containerID="10501ae393c8fe7c03da5565911bbd0258850ee7b06833210a87d04e38d406f9" Apr 16 14:31:02.309922 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.309900 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" path="/var/lib/kubelet/pods/7d1606b0-5579-4e9f-aac0-c69e69108b32/volumes" Apr 16 14:31:02.310300 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.310287 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" path="/var/lib/kubelet/pods/c24e0dd2-e96f-4029-9a40-9c9ba988128a/volumes" Apr 16 14:31:02.319947 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.319929 2570 scope.go:117] "RemoveContainer" containerID="9c994075b22b70af73ff84f4d08333a4295bceb508ef63db003b8134d6268dd2" Apr 16 14:31:02.320198 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:31:02.320180 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c994075b22b70af73ff84f4d08333a4295bceb508ef63db003b8134d6268dd2\": container with ID starting with 9c994075b22b70af73ff84f4d08333a4295bceb508ef63db003b8134d6268dd2 not found: ID does not exist" 
containerID="9c994075b22b70af73ff84f4d08333a4295bceb508ef63db003b8134d6268dd2" Apr 16 14:31:02.320264 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.320202 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c994075b22b70af73ff84f4d08333a4295bceb508ef63db003b8134d6268dd2"} err="failed to get container status \"9c994075b22b70af73ff84f4d08333a4295bceb508ef63db003b8134d6268dd2\": rpc error: code = NotFound desc = could not find container \"9c994075b22b70af73ff84f4d08333a4295bceb508ef63db003b8134d6268dd2\": container with ID starting with 9c994075b22b70af73ff84f4d08333a4295bceb508ef63db003b8134d6268dd2 not found: ID does not exist" Apr 16 14:31:02.320264 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.320218 2570 scope.go:117] "RemoveContainer" containerID="10501ae393c8fe7c03da5565911bbd0258850ee7b06833210a87d04e38d406f9" Apr 16 14:31:02.320417 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:31:02.320401 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"10501ae393c8fe7c03da5565911bbd0258850ee7b06833210a87d04e38d406f9\": container with ID starting with 10501ae393c8fe7c03da5565911bbd0258850ee7b06833210a87d04e38d406f9 not found: ID does not exist" containerID="10501ae393c8fe7c03da5565911bbd0258850ee7b06833210a87d04e38d406f9" Apr 16 14:31:02.320454 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:31:02.320420 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"10501ae393c8fe7c03da5565911bbd0258850ee7b06833210a87d04e38d406f9"} err="failed to get container status \"10501ae393c8fe7c03da5565911bbd0258850ee7b06833210a87d04e38d406f9\": rpc error: code = NotFound desc = could not find container \"10501ae393c8fe7c03da5565911bbd0258850ee7b06833210a87d04e38d406f9\": container with ID starting with 10501ae393c8fe7c03da5565911bbd0258850ee7b06833210a87d04e38d406f9 not found: ID does not exist" Apr 16 
14:33:20.224290 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.224255 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x6dwb/must-gather-pxpq8"] Apr 16 14:33:20.224742 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.224726 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="storage-initializer" Apr 16 14:33:20.224820 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.224746 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="storage-initializer" Apr 16 14:33:20.224820 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.224763 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f055b746-2dc2-4bde-94af-40023ed29112" containerName="main" Apr 16 14:33:20.224820 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.224772 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055b746-2dc2-4bde-94af-40023ed29112" containerName="main" Apr 16 14:33:20.224820 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.224780 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f055b746-2dc2-4bde-94af-40023ed29112" containerName="storage-initializer" Apr 16 14:33:20.224820 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.224785 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="f055b746-2dc2-4bde-94af-40023ed29112" containerName="storage-initializer" Apr 16 14:33:20.224820 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.224799 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="main" Apr 16 14:33:20.225165 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.224807 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="main" Apr 16 14:33:20.225165 ip-10-0-133-133 
kubenswrapper[2570]: I0416 14:33:20.224857 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="main" Apr 16 14:33:20.225165 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.224865 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="main" Apr 16 14:33:20.225165 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.224890 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="llm-d-routing-sidecar" Apr 16 14:33:20.225165 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.224898 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="llm-d-routing-sidecar" Apr 16 14:33:20.225165 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.224909 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="storage-initializer" Apr 16 14:33:20.225165 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.224916 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="storage-initializer" Apr 16 14:33:20.225165 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.224927 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="main" Apr 16 14:33:20.225165 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.224935 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="main" Apr 16 14:33:20.225165 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.224951 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="storage-initializer" Apr 16 14:33:20.225165 
ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.224961 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="storage-initializer" Apr 16 14:33:20.225165 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.225056 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="7d1606b0-5579-4e9f-aac0-c69e69108b32" containerName="main" Apr 16 14:33:20.225165 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.225070 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="517ef72a-d8c6-4729-b4fc-037c8b19a357" containerName="main" Apr 16 14:33:20.225165 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.225082 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="f055b746-2dc2-4bde-94af-40023ed29112" containerName="main" Apr 16 14:33:20.225165 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.225093 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="main" Apr 16 14:33:20.225165 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.225106 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="c24e0dd2-e96f-4029-9a40-9c9ba988128a" containerName="llm-d-routing-sidecar" Apr 16 14:33:20.228248 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.228229 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x6dwb/must-gather-pxpq8" Apr 16 14:33:20.230372 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.230353 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x6dwb\"/\"openshift-service-ca.crt\"" Apr 16 14:33:20.230457 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.230381 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-x6dwb\"/\"default-dockercfg-97w6g\"" Apr 16 14:33:20.230939 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.230915 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-x6dwb\"/\"kube-root-ca.crt\"" Apr 16 14:33:20.235002 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.234979 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x6dwb/must-gather-pxpq8"] Apr 16 14:33:20.312095 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.312058 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0308299a-cfb8-489a-97d5-cca064016418-must-gather-output\") pod \"must-gather-pxpq8\" (UID: \"0308299a-cfb8-489a-97d5-cca064016418\") " pod="openshift-must-gather-x6dwb/must-gather-pxpq8" Apr 16 14:33:20.312242 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.312136 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl8tm\" (UniqueName: \"kubernetes.io/projected/0308299a-cfb8-489a-97d5-cca064016418-kube-api-access-fl8tm\") pod \"must-gather-pxpq8\" (UID: \"0308299a-cfb8-489a-97d5-cca064016418\") " pod="openshift-must-gather-x6dwb/must-gather-pxpq8" Apr 16 14:33:20.413259 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.413223 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/0308299a-cfb8-489a-97d5-cca064016418-must-gather-output\") pod \"must-gather-pxpq8\" (UID: \"0308299a-cfb8-489a-97d5-cca064016418\") " pod="openshift-must-gather-x6dwb/must-gather-pxpq8" Apr 16 14:33:20.413418 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.413298 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fl8tm\" (UniqueName: \"kubernetes.io/projected/0308299a-cfb8-489a-97d5-cca064016418-kube-api-access-fl8tm\") pod \"must-gather-pxpq8\" (UID: \"0308299a-cfb8-489a-97d5-cca064016418\") " pod="openshift-must-gather-x6dwb/must-gather-pxpq8" Apr 16 14:33:20.413695 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.413675 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0308299a-cfb8-489a-97d5-cca064016418-must-gather-output\") pod \"must-gather-pxpq8\" (UID: \"0308299a-cfb8-489a-97d5-cca064016418\") " pod="openshift-must-gather-x6dwb/must-gather-pxpq8" Apr 16 14:33:20.422158 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.422134 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl8tm\" (UniqueName: \"kubernetes.io/projected/0308299a-cfb8-489a-97d5-cca064016418-kube-api-access-fl8tm\") pod \"must-gather-pxpq8\" (UID: \"0308299a-cfb8-489a-97d5-cca064016418\") " pod="openshift-must-gather-x6dwb/must-gather-pxpq8" Apr 16 14:33:20.538377 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.538295 2570 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x6dwb/must-gather-pxpq8" Apr 16 14:33:20.663590 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.663557 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x6dwb/must-gather-pxpq8"] Apr 16 14:33:20.665836 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:33:20.665812 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0308299a_cfb8_489a_97d5_cca064016418.slice/crio-b57aba464dd7fc15be9bcb0e5e1cbb3523a60610370e472e1487619a8cdd86dc WatchSource:0}: Error finding container b57aba464dd7fc15be9bcb0e5e1cbb3523a60610370e472e1487619a8cdd86dc: Status 404 returned error can't find the container with id b57aba464dd7fc15be9bcb0e5e1cbb3523a60610370e472e1487619a8cdd86dc Apr 16 14:33:20.667606 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:20.667590 2570 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:33:21.617107 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:21.617075 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x6dwb/must-gather-pxpq8" event={"ID":"0308299a-cfb8-489a-97d5-cca064016418","Type":"ContainerStarted","Data":"b57aba464dd7fc15be9bcb0e5e1cbb3523a60610370e472e1487619a8cdd86dc"} Apr 16 14:33:25.634228 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:25.634184 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x6dwb/must-gather-pxpq8" event={"ID":"0308299a-cfb8-489a-97d5-cca064016418","Type":"ContainerStarted","Data":"c3e20559a445ee47efea66924085eac3318cecb9e06032e9846755b582c029a1"} Apr 16 14:33:25.634635 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:25.634236 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x6dwb/must-gather-pxpq8" 
event={"ID":"0308299a-cfb8-489a-97d5-cca064016418","Type":"ContainerStarted","Data":"155ee5db244b3be386478fb72c544e3aa70513ca77c0ee78728f0b728797ed85"} Apr 16 14:33:25.650046 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:25.649999 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x6dwb/must-gather-pxpq8" podStartSLOduration=0.984880526 podStartE2EDuration="5.649981734s" podCreationTimestamp="2026-04-16 14:33:20 +0000 UTC" firstStartedPulling="2026-04-16 14:33:20.667723063 +0000 UTC m=+2026.873366205" lastFinishedPulling="2026-04-16 14:33:25.332824265 +0000 UTC m=+2031.538467413" observedRunningTime="2026-04-16 14:33:25.64846117 +0000 UTC m=+2031.854104335" watchObservedRunningTime="2026-04-16 14:33:25.649981734 +0000 UTC m=+2031.855624897" Apr 16 14:33:50.325986 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:50.325908 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-2l2lq_8d0b7d74-4c84-4b40-9ef3-5c1e6641a116/istio-proxy/0.log" Apr 16 14:33:51.390732 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:51.390696 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-2l2lq_8d0b7d74-4c84-4b40-9ef3-5c1e6641a116/istio-proxy/0.log" Apr 16 14:33:52.842779 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:52.842735 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-pkszh_a3cf85e1-1240-4bae-af6b-e2a75f3bd779/limitador/0.log" Apr 16 14:33:52.862498 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:52.862467 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-hltvm_b04cc4a4-6e86-47ab-a3be-7800cdf133d0/manager/0.log" Apr 16 14:33:53.740180 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:53.740151 2570 generic.go:358] "Generic (PLEG): 
container finished" podID="0308299a-cfb8-489a-97d5-cca064016418" containerID="155ee5db244b3be386478fb72c544e3aa70513ca77c0ee78728f0b728797ed85" exitCode=0 Apr 16 14:33:53.740410 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:53.740237 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x6dwb/must-gather-pxpq8" event={"ID":"0308299a-cfb8-489a-97d5-cca064016418","Type":"ContainerDied","Data":"155ee5db244b3be386478fb72c544e3aa70513ca77c0ee78728f0b728797ed85"} Apr 16 14:33:53.740618 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:53.740604 2570 scope.go:117] "RemoveContainer" containerID="155ee5db244b3be386478fb72c544e3aa70513ca77c0ee78728f0b728797ed85" Apr 16 14:33:54.595514 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:54.595488 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x6dwb_must-gather-pxpq8_0308299a-cfb8-489a-97d5-cca064016418/gather/0.log" Apr 16 14:33:58.260740 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:58.260704 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hztbp_c0a1d356-9c19-4e1d-9761-5d6982c13212/global-pull-secret-syncer/0.log" Apr 16 14:33:58.319474 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:58.319442 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-wkbwm_529a37ce-5549-43e3-bcab-ff0f9a6e46d6/konnectivity-agent/0.log" Apr 16 14:33:58.405630 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:33:58.405576 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-133.ec2.internal_02f569f2777966beb695e394b803ecc2/haproxy/0.log" Apr 16 14:34:00.167553 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:00.167497 2570 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-x6dwb/must-gather-pxpq8"] Apr 16 14:34:00.167925 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:00.167804 2570 
kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-x6dwb/must-gather-pxpq8" podUID="0308299a-cfb8-489a-97d5-cca064016418" containerName="copy" containerID="cri-o://c3e20559a445ee47efea66924085eac3318cecb9e06032e9846755b582c029a1" gracePeriod=2 Apr 16 14:34:00.169164 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:00.169145 2570 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-x6dwb/must-gather-pxpq8"] Apr 16 14:34:00.173989 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:00.173960 2570 status_manager.go:895] "Failed to get status for pod" podUID="0308299a-cfb8-489a-97d5-cca064016418" pod="openshift-must-gather-x6dwb/must-gather-pxpq8" err="pods \"must-gather-pxpq8\" is forbidden: User \"system:node:ip-10-0-133-133.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-x6dwb\": no relationship found between node 'ip-10-0-133-133.ec2.internal' and this object" Apr 16 14:34:00.223019 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:34:00.222983 2570 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0308299a_cfb8_489a_97d5_cca064016418.slice/crio-conmon-c3e20559a445ee47efea66924085eac3318cecb9e06032e9846755b582c029a1.scope\": RecentStats: unable to find data in memory cache]" Apr 16 14:34:00.404618 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:00.404597 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x6dwb_must-gather-pxpq8_0308299a-cfb8-489a-97d5-cca064016418/copy/0.log" Apr 16 14:34:00.404959 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:00.404943 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x6dwb/must-gather-pxpq8" Apr 16 14:34:00.482302 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:00.482213 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl8tm\" (UniqueName: \"kubernetes.io/projected/0308299a-cfb8-489a-97d5-cca064016418-kube-api-access-fl8tm\") pod \"0308299a-cfb8-489a-97d5-cca064016418\" (UID: \"0308299a-cfb8-489a-97d5-cca064016418\") " Apr 16 14:34:00.482458 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:00.482342 2570 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0308299a-cfb8-489a-97d5-cca064016418-must-gather-output\") pod \"0308299a-cfb8-489a-97d5-cca064016418\" (UID: \"0308299a-cfb8-489a-97d5-cca064016418\") " Apr 16 14:34:00.484503 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:00.484467 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0308299a-cfb8-489a-97d5-cca064016418-kube-api-access-fl8tm" (OuterVolumeSpecName: "kube-api-access-fl8tm") pod "0308299a-cfb8-489a-97d5-cca064016418" (UID: "0308299a-cfb8-489a-97d5-cca064016418"). InnerVolumeSpecName "kube-api-access-fl8tm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:34:00.488515 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:00.488491 2570 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0308299a-cfb8-489a-97d5-cca064016418-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "0308299a-cfb8-489a-97d5-cca064016418" (UID: "0308299a-cfb8-489a-97d5-cca064016418"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:34:00.583198 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:00.583159 2570 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fl8tm\" (UniqueName: \"kubernetes.io/projected/0308299a-cfb8-489a-97d5-cca064016418-kube-api-access-fl8tm\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:34:00.583198 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:00.583192 2570 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/0308299a-cfb8-489a-97d5-cca064016418-must-gather-output\") on node \"ip-10-0-133-133.ec2.internal\" DevicePath \"\"" Apr 16 14:34:00.765002 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:00.764926 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x6dwb_must-gather-pxpq8_0308299a-cfb8-489a-97d5-cca064016418/copy/0.log" Apr 16 14:34:00.765283 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:00.765261 2570 generic.go:358] "Generic (PLEG): container finished" podID="0308299a-cfb8-489a-97d5-cca064016418" containerID="c3e20559a445ee47efea66924085eac3318cecb9e06032e9846755b582c029a1" exitCode=143 Apr 16 14:34:00.765364 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:00.765308 2570 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x6dwb/must-gather-pxpq8" Apr 16 14:34:00.765412 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:00.765362 2570 scope.go:117] "RemoveContainer" containerID="c3e20559a445ee47efea66924085eac3318cecb9e06032e9846755b582c029a1" Apr 16 14:34:00.774268 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:00.774253 2570 scope.go:117] "RemoveContainer" containerID="155ee5db244b3be386478fb72c544e3aa70513ca77c0ee78728f0b728797ed85" Apr 16 14:34:00.788620 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:00.788597 2570 scope.go:117] "RemoveContainer" containerID="c3e20559a445ee47efea66924085eac3318cecb9e06032e9846755b582c029a1" Apr 16 14:34:00.788922 ip-10-0-133-133 kubenswrapper[2570]: E0416 14:34:00.788903 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3e20559a445ee47efea66924085eac3318cecb9e06032e9846755b582c029a1\": container with ID starting with c3e20559a445ee47efea66924085eac3318cecb9e06032e9846755b582c029a1 not found: ID does not exist" containerID="c3e20559a445ee47efea66924085eac3318cecb9e06032e9846755b582c029a1" Apr 16 14:34:00.788972 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:00.788931 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3e20559a445ee47efea66924085eac3318cecb9e06032e9846755b582c029a1"} err="failed to get container status \"c3e20559a445ee47efea66924085eac3318cecb9e06032e9846755b582c029a1\": rpc error: code = NotFound desc = could not find container \"c3e20559a445ee47efea66924085eac3318cecb9e06032e9846755b582c029a1\": container with ID starting with c3e20559a445ee47efea66924085eac3318cecb9e06032e9846755b582c029a1 not found: ID does not exist" Apr 16 14:34:00.788972 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:00.788950 2570 scope.go:117] "RemoveContainer" containerID="155ee5db244b3be386478fb72c544e3aa70513ca77c0ee78728f0b728797ed85" Apr 16 14:34:00.789177 
ip-10-0-133-133 kubenswrapper[2570]: E0416 14:34:00.789155 2570 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"155ee5db244b3be386478fb72c544e3aa70513ca77c0ee78728f0b728797ed85\": container with ID starting with 155ee5db244b3be386478fb72c544e3aa70513ca77c0ee78728f0b728797ed85 not found: ID does not exist" containerID="155ee5db244b3be386478fb72c544e3aa70513ca77c0ee78728f0b728797ed85" Apr 16 14:34:00.789264 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:00.789194 2570 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"155ee5db244b3be386478fb72c544e3aa70513ca77c0ee78728f0b728797ed85"} err="failed to get container status \"155ee5db244b3be386478fb72c544e3aa70513ca77c0ee78728f0b728797ed85\": rpc error: code = NotFound desc = could not find container \"155ee5db244b3be386478fb72c544e3aa70513ca77c0ee78728f0b728797ed85\": container with ID starting with 155ee5db244b3be386478fb72c544e3aa70513ca77c0ee78728f0b728797ed85 not found: ID does not exist" Apr 16 14:34:02.308888 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:02.308860 2570 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0308299a-cfb8-489a-97d5-cca064016418" path="/var/lib/kubelet/pods/0308299a-cfb8-489a-97d5-cca064016418/volumes" Apr 16 14:34:02.625909 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:02.625822 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-pkszh_a3cf85e1-1240-4bae-af6b-e2a75f3bd779/limitador/0.log" Apr 16 14:34:02.656207 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:02.656181 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-hltvm_b04cc4a4-6e86-47ab-a3be-7800cdf133d0/manager/0.log" Apr 16 14:34:03.706399 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:03.706367 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-6667474d89-7dn56_fcd418a5-48d2-4f13-a35b-28504fb6ca61/cluster-monitoring-operator/0.log" Apr 16 14:34:03.732196 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:03.732170 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-v86qq_244a82a3-51aa-45f2-a8fe-823723a2410e/kube-state-metrics/0.log" Apr 16 14:34:03.762074 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:03.762054 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-v86qq_244a82a3-51aa-45f2-a8fe-823723a2410e/kube-rbac-proxy-main/0.log" Apr 16 14:34:03.784310 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:03.784284 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-7479c89684-v86qq_244a82a3-51aa-45f2-a8fe-823723a2410e/kube-rbac-proxy-self/0.log" Apr 16 14:34:03.815494 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:03.815471 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_metrics-server-7695b9cb7-4mmkk_e5d1ca4c-458d-4c29-81dd-f62f42d1d4dd/metrics-server/0.log" Apr 16 14:34:03.969367 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:03.969294 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hmhjw_3ac9f0e6-4ee7-4de1-85f7-67127085b819/node-exporter/0.log" Apr 16 14:34:03.997231 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:03.997207 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hmhjw_3ac9f0e6-4ee7-4de1-85f7-67127085b819/kube-rbac-proxy/0.log" Apr 16 14:34:04.020035 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:04.020015 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-hmhjw_3ac9f0e6-4ee7-4de1-85f7-67127085b819/init-textfile/0.log" Apr 16 14:34:04.115929 ip-10-0-133-133 
kubenswrapper[2570]: I0416 14:34:04.115899 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-g4rrg_46fa769b-4f39-41d5-8ac2-232f23bbcbdb/kube-rbac-proxy-main/0.log" Apr 16 14:34:04.139296 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:04.139276 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-g4rrg_46fa769b-4f39-41d5-8ac2-232f23bbcbdb/kube-rbac-proxy-self/0.log" Apr 16 14:34:04.166800 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:04.166777 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-5669946b84-g4rrg_46fa769b-4f39-41d5-8ac2-232f23bbcbdb/openshift-state-metrics/0.log" Apr 16 14:34:04.209152 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:04.209131 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4/prometheus/0.log" Apr 16 14:34:04.244385 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:04.244313 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4/config-reloader/0.log" Apr 16 14:34:04.290692 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:04.290659 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4/thanos-sidecar/0.log" Apr 16 14:34:04.314168 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:04.314147 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4/kube-rbac-proxy-web/0.log" Apr 16 14:34:04.356874 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:04.356851 2570 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4/kube-rbac-proxy/0.log" Apr 16 14:34:04.387073 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:04.387055 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4/kube-rbac-proxy-thanos/0.log" Apr 16 14:34:04.414520 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:04.414503 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_c83f9d8e-3333-4559-9ebc-6c7d1fbfecf4/init-config-reloader/0.log" Apr 16 14:34:04.450145 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:04.450121 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-fhx25_df061447-182f-4f24-a0d0-95178339d48f/prometheus-operator/0.log" Apr 16 14:34:04.472502 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:04.472484 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-78f957474d-fhx25_df061447-182f-4f24-a0d0-95178339d48f/kube-rbac-proxy/0.log" Apr 16 14:34:04.530576 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:04.530491 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75fc48886f-8mhqt_42fb5c45-515a-4ea9-a937-7418314ae5f2/telemeter-client/0.log" Apr 16 14:34:04.553388 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:04.553368 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75fc48886f-8mhqt_42fb5c45-515a-4ea9-a937-7418314ae5f2/reload/0.log" Apr 16 14:34:04.574328 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:04.574309 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-75fc48886f-8mhqt_42fb5c45-515a-4ea9-a937-7418314ae5f2/kube-rbac-proxy/0.log" Apr 16 14:34:05.868790 ip-10-0-133-133 kubenswrapper[2570]: I0416 
14:34:05.868757 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-5cb6cf4cb4-474qr_64d4de55-cfd8-4588-af6f-f5d27ec26b16/networking-console-plugin/0.log" Apr 16 14:34:06.790444 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:06.790415 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-586b57c7b4-szfcb_135c9f9a-4369-45dc-8f77-8d080f471674/download-server/0.log" Apr 16 14:34:07.243165 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.243129 2570 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp"] Apr 16 14:34:07.243522 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.243466 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0308299a-cfb8-489a-97d5-cca064016418" containerName="gather" Apr 16 14:34:07.243522 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.243477 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0308299a-cfb8-489a-97d5-cca064016418" containerName="gather" Apr 16 14:34:07.243522 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.243491 2570 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0308299a-cfb8-489a-97d5-cca064016418" containerName="copy" Apr 16 14:34:07.243522 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.243497 2570 state_mem.go:107] "Deleted CPUSet assignment" podUID="0308299a-cfb8-489a-97d5-cca064016418" containerName="copy" Apr 16 14:34:07.243755 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.243575 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="0308299a-cfb8-489a-97d5-cca064016418" containerName="copy" Apr 16 14:34:07.243755 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.243587 2570 memory_manager.go:356] "RemoveStaleState removing state" podUID="0308299a-cfb8-489a-97d5-cca064016418" containerName="gather" Apr 16 14:34:07.247963 ip-10-0-133-133 
kubenswrapper[2570]: I0416 14:34:07.247932 2570 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp" Apr 16 14:34:07.252683 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.252660 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kz48r\"/\"openshift-service-ca.crt\"" Apr 16 14:34:07.253005 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.252988 2570 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kz48r\"/\"kube-root-ca.crt\"" Apr 16 14:34:07.254122 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.253609 2570 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kz48r\"/\"default-dockercfg-spcjg\"" Apr 16 14:34:07.254122 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.253962 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7d955d5dd4-zm999_2bc0eeb3-5222-4373-ab41-9da4b7efca15/volume-data-source-validator/0.log" Apr 16 14:34:07.254715 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.254680 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp"] Apr 16 14:34:07.339825 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.339796 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6710a990-49b6-4e81-847b-11c68038fac8-lib-modules\") pod \"perf-node-gather-daemonset-9fzsp\" (UID: \"6710a990-49b6-4e81-847b-11c68038fac8\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp" Apr 16 14:34:07.339825 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.339830 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-mpqx4\" (UniqueName: \"kubernetes.io/projected/6710a990-49b6-4e81-847b-11c68038fac8-kube-api-access-mpqx4\") pod \"perf-node-gather-daemonset-9fzsp\" (UID: \"6710a990-49b6-4e81-847b-11c68038fac8\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp" Apr 16 14:34:07.340036 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.339864 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6710a990-49b6-4e81-847b-11c68038fac8-proc\") pod \"perf-node-gather-daemonset-9fzsp\" (UID: \"6710a990-49b6-4e81-847b-11c68038fac8\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp" Apr 16 14:34:07.340036 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.339887 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6710a990-49b6-4e81-847b-11c68038fac8-podres\") pod \"perf-node-gather-daemonset-9fzsp\" (UID: \"6710a990-49b6-4e81-847b-11c68038fac8\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp" Apr 16 14:34:07.340036 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.339941 2570 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6710a990-49b6-4e81-847b-11c68038fac8-sys\") pod \"perf-node-gather-daemonset-9fzsp\" (UID: \"6710a990-49b6-4e81-847b-11c68038fac8\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp" Apr 16 14:34:07.440646 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.440618 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6710a990-49b6-4e81-847b-11c68038fac8-lib-modules\") pod \"perf-node-gather-daemonset-9fzsp\" (UID: \"6710a990-49b6-4e81-847b-11c68038fac8\") " 
pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp" Apr 16 14:34:07.440808 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.440657 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mpqx4\" (UniqueName: \"kubernetes.io/projected/6710a990-49b6-4e81-847b-11c68038fac8-kube-api-access-mpqx4\") pod \"perf-node-gather-daemonset-9fzsp\" (UID: \"6710a990-49b6-4e81-847b-11c68038fac8\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp" Apr 16 14:34:07.440808 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.440697 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6710a990-49b6-4e81-847b-11c68038fac8-proc\") pod \"perf-node-gather-daemonset-9fzsp\" (UID: \"6710a990-49b6-4e81-847b-11c68038fac8\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp" Apr 16 14:34:07.440808 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.440722 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6710a990-49b6-4e81-847b-11c68038fac8-podres\") pod \"perf-node-gather-daemonset-9fzsp\" (UID: \"6710a990-49b6-4e81-847b-11c68038fac8\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp" Apr 16 14:34:07.440808 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.440781 2570 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6710a990-49b6-4e81-847b-11c68038fac8-sys\") pod \"perf-node-gather-daemonset-9fzsp\" (UID: \"6710a990-49b6-4e81-847b-11c68038fac8\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp" Apr 16 14:34:07.440808 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.440797 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/6710a990-49b6-4e81-847b-11c68038fac8-lib-modules\") pod \"perf-node-gather-daemonset-9fzsp\" (UID: \"6710a990-49b6-4e81-847b-11c68038fac8\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp" Apr 16 14:34:07.441027 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.440813 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/6710a990-49b6-4e81-847b-11c68038fac8-proc\") pod \"perf-node-gather-daemonset-9fzsp\" (UID: \"6710a990-49b6-4e81-847b-11c68038fac8\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp" Apr 16 14:34:07.441027 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.440870 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6710a990-49b6-4e81-847b-11c68038fac8-sys\") pod \"perf-node-gather-daemonset-9fzsp\" (UID: \"6710a990-49b6-4e81-847b-11c68038fac8\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp" Apr 16 14:34:07.441027 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.440881 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/6710a990-49b6-4e81-847b-11c68038fac8-podres\") pod \"perf-node-gather-daemonset-9fzsp\" (UID: \"6710a990-49b6-4e81-847b-11c68038fac8\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp" Apr 16 14:34:07.448226 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.448205 2570 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mpqx4\" (UniqueName: \"kubernetes.io/projected/6710a990-49b6-4e81-847b-11c68038fac8-kube-api-access-mpqx4\") pod \"perf-node-gather-daemonset-9fzsp\" (UID: \"6710a990-49b6-4e81-847b-11c68038fac8\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp" Apr 16 14:34:07.559068 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.558990 2570 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp" Apr 16 14:34:07.687997 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.684182 2570 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp"] Apr 16 14:34:07.691543 ip-10-0-133-133 kubenswrapper[2570]: W0416 14:34:07.691500 2570 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6710a990_49b6_4e81_847b_11c68038fac8.slice/crio-bc2c29c73f2ac3052c730805788898f009e4a932abc371642f9c356a57eacf16 WatchSource:0}: Error finding container bc2c29c73f2ac3052c730805788898f009e4a932abc371642f9c356a57eacf16: Status 404 returned error can't find the container with id bc2c29c73f2ac3052c730805788898f009e4a932abc371642f9c356a57eacf16 Apr 16 14:34:07.798765 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.798740 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp" event={"ID":"6710a990-49b6-4e81-847b-11c68038fac8","Type":"ContainerStarted","Data":"334a36cb18b0cce0b6c31d31848b6012f734591737c4f230838f2c5b2b10b17b"} Apr 16 14:34:07.798883 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.798775 2570 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp" event={"ID":"6710a990-49b6-4e81-847b-11c68038fac8","Type":"ContainerStarted","Data":"bc2c29c73f2ac3052c730805788898f009e4a932abc371642f9c356a57eacf16"} Apr 16 14:34:07.798883 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.798855 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp" Apr 16 14:34:07.814595 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.814502 2570 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp" podStartSLOduration=0.81449149 podStartE2EDuration="814.49149ms" podCreationTimestamp="2026-04-16 14:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:34:07.812937276 +0000 UTC m=+2074.018580441" watchObservedRunningTime="2026-04-16 14:34:07.81449149 +0000 UTC m=+2074.020134654"
Apr 16 14:34:07.985495 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:07.985467 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-smqtz_6bb67241-6874-4040-a810-80b829751cf9/dns/0.log"
Apr 16 14:34:08.005509 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:08.005484 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-smqtz_6bb67241-6874-4040-a810-80b829751cf9/kube-rbac-proxy/0.log"
Apr 16 14:34:08.090195 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:08.090127 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-kqv5v_258d5bb3-0083-4fff-96dd-2c9e007b3c05/dns-node-resolver/0.log"
Apr 16 14:34:08.581436 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:08.581409 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-p9j4h_29d8d95e-1f57-49fb-9896-340b389f0eea/node-ca/0.log"
Apr 16 14:34:09.475980 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:09.475953 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-2l2lq_8d0b7d74-4c84-4b40-9ef3-5c1e6641a116/istio-proxy/0.log"
Apr 16 14:34:09.974190 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:09.974154 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-wc4m2_2d037ded-fc00-41e0-b31f-c9fb98bdc629/serve-healthcheck-canary/0.log"
Apr 16 14:34:10.371811 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:10.371734 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-qgcsx_e39233e7-6f83-4e72-8e15-0f19ce865b49/insights-operator/0.log"
Apr 16 14:34:10.372508 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:10.372490 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-5785d4fcdd-qgcsx_e39233e7-6f83-4e72-8e15-0f19ce865b49/insights-operator/1.log"
Apr 16 14:34:10.396973 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:10.396949 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6fksf_75776d76-58e8-483c-ae65-df6ee9cfa222/kube-rbac-proxy/0.log"
Apr 16 14:34:10.420633 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:10.420613 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6fksf_75776d76-58e8-483c-ae65-df6ee9cfa222/exporter/0.log"
Apr 16 14:34:10.452713 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:10.452695 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6fksf_75776d76-58e8-483c-ae65-df6ee9cfa222/extractor/0.log"
Apr 16 14:34:13.139180 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:13.139150 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-dc77c844c-hn9vm_56973a88-a650-4c60-9f0d-df76e2dc41ae/manager/0.log"
Apr 16 14:34:13.812784 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:13.812756 2570 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-9fzsp"
Apr 16 14:34:18.780479 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:18.780455 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-4kqf9_e4c1dc9f-ffdc-4615-820f-9560dd37ae0b/migrator/0.log"
Apr 16 14:34:18.803646 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:18.803615 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-64d4d94569-4kqf9_e4c1dc9f-ffdc-4615-820f-9560dd37ae0b/graceful-termination/0.log"
Apr 16 14:34:19.141830 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:19.141799 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-wskzx_234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb/kube-storage-version-migrator-operator/1.log"
Apr 16 14:34:19.143237 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:19.143219 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-756bb7d76f-wskzx_234da4eb-18fa-4c53-ab1a-fbbe46b9e3cb/kube-storage-version-migrator-operator/0.log"
Apr 16 14:34:20.227043 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:20.227015 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bg5bb_d0acc482-de34-4188-ba44-20d609da46d0/kube-multus-additional-cni-plugins/0.log"
Apr 16 14:34:20.251991 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:20.251960 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bg5bb_d0acc482-de34-4188-ba44-20d609da46d0/egress-router-binary-copy/0.log"
Apr 16 14:34:20.276197 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:20.276173 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bg5bb_d0acc482-de34-4188-ba44-20d609da46d0/cni-plugins/0.log"
Apr 16 14:34:20.301701 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:20.301683 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bg5bb_d0acc482-de34-4188-ba44-20d609da46d0/bond-cni-plugin/0.log"
Apr 16 14:34:20.321975 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:20.321959 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bg5bb_d0acc482-de34-4188-ba44-20d609da46d0/routeoverride-cni/0.log"
Apr 16 14:34:20.347003 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:20.346984 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bg5bb_d0acc482-de34-4188-ba44-20d609da46d0/whereabouts-cni-bincopy/0.log"
Apr 16 14:34:20.367429 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:20.367405 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-bg5bb_d0acc482-de34-4188-ba44-20d609da46d0/whereabouts-cni/0.log"
Apr 16 14:34:20.596106 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:20.596032 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vp4gn_1edb190f-96e8-4548-8b55-97073b01a7ed/kube-multus/0.log"
Apr 16 14:34:20.617257 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:20.617231 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-29pd4_e6284f77-08e3-4846-904d-6a21f10707ae/network-metrics-daemon/0.log"
Apr 16 14:34:20.635393 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:20.635369 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-29pd4_e6284f77-08e3-4846-904d-6a21f10707ae/kube-rbac-proxy/0.log"
Apr 16 14:34:21.852811 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:21.852785 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2d74_346c3280-2b45-4be3-8629-46903ecfe4b8/ovn-controller/0.log"
Apr 16 14:34:21.879122 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:21.879096 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2d74_346c3280-2b45-4be3-8629-46903ecfe4b8/ovn-acl-logging/0.log"
Apr 16 14:34:21.897392 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:21.897366 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2d74_346c3280-2b45-4be3-8629-46903ecfe4b8/kube-rbac-proxy-node/0.log"
Apr 16 14:34:21.917353 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:21.917326 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2d74_346c3280-2b45-4be3-8629-46903ecfe4b8/kube-rbac-proxy-ovn-metrics/0.log"
Apr 16 14:34:21.936791 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:21.936768 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2d74_346c3280-2b45-4be3-8629-46903ecfe4b8/northd/0.log"
Apr 16 14:34:21.956942 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:21.956925 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2d74_346c3280-2b45-4be3-8629-46903ecfe4b8/nbdb/0.log"
Apr 16 14:34:21.976919 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:21.976903 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2d74_346c3280-2b45-4be3-8629-46903ecfe4b8/sbdb/0.log"
Apr 16 14:34:22.076759 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:22.076723 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-p2d74_346c3280-2b45-4be3-8629-46903ecfe4b8/ovnkube-controller/0.log"
Apr 16 14:34:23.558482 ip-10-0-133-133 kubenswrapper[2570]: I0416 14:34:23.558455 2570 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-88c5t_5a916223-1676-42c3-a13e-815b7355eb26/network-check-target-container/0.log"