fix: Langfuse v3 filter pipeline #557

Open
nurtext wants to merge 1 commit into open-webui:main from nurtext:fix/langfuse-v3-filter-pipeline

Conversation


nurtext commented Aug 27, 2025

Fixed an infinite-trace situation visible in the Langfuse dashboard, which led to missing trace metadata, a missing token/cost preview, and wrong latency measurements.
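The symptom can be illustrated with a tiny stand-in for a trace object (the `Trace` class below is a hypothetical sketch, not the Langfuse SDK): a trace that is opened but never ended has no end timestamp, so latency can never be finalized and end-of-trace data never attaches.

```python
import time

class Trace:
    """Hypothetical stand-in for a Langfuse trace (not the real SDK)."""
    def __init__(self):
        self.start = time.time()
        self.end_time = None  # stays None if the pipeline never ends the trace

    def end(self):
        self.end_time = time.time()

    def latency(self):
        # An unended ("infinite") trace has no meaningful latency,
        # which surfaces as wrong measurements in the dashboard.
        if self.end_time is None:
            return None
        return self.end_time - self.start

open_trace = Trace()   # bug: trace is created but never ended
fixed_trace = Trace()
fixed_trace.end()      # fix: explicitly end the trace

assert open_trace.latency() is None
assert fixed_trace.latency() is not None
```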


wpitts commented Oct 1, 2025

Thank you so much!! I downloaded your code and it fixed my Langfuse issue!

@YetheSamartaka

Tried it on my end as well and it is working.

@denisxab

Guys, isn't this a critical fix? It has been sitting unmerged for 3 months.

@camucamulemon7

Hi @nurtext.
Thank you for the fix.
On OpenWebUI v0.6.36, the token count still cannot be retrieved.
What could be causing this issue?


nurtext commented Dec 8, 2025

Hi @camucamulemon7,

I'm currently unaware of this issue and would need to investigate it. Unfortunately, I'm neither the author of Langfuse nor of the filter pipeline itself; this was just meant as a quick bugfix for my own infrastructure. Maybe the original contributors could have a look at it?

Cheers.


nurtext commented Dec 8, 2025

Update: There seems to be a new PR available with another fix for the pipeline; maybe it solves your issue:
#586


@jkassie left a comment


Verified that this (in conjunction with pr-586) fixed the issues I was seeing with Langfuse v3.


@jkassie left a comment


Actually, after looking at pr-586: applying it in conjunction with this change causes issues. I'd suggest placing the call to trace.end() immediately before the # Flush data to Langfuse comment (398/400). As it stands, if creating the LLM generation fails you end up with an open trace, and if you apply both pr-557 and pr-586 you end up with multiple trace.end() calls, which causes issues.
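The placement suggested above can be sketched with stubs (the `Trace` class and `outlet` function here are hypothetical stand-ins for the pipeline and the Langfuse SDK, not their real APIs): end the trace in a `finally` block immediately before the flush step, so a failed generation never leaves an open trace, and make `end()` idempotent so applying both PRs cannot close the trace twice.

```python
import time

class Trace:
    """Hypothetical stand-in for a Langfuse trace (not the real SDK)."""
    def __init__(self):
        self.end_time = None

    def end(self):
        # Idempotent: a second call (e.g. from applying both PRs) is a no-op.
        if self.end_time is None:
            self.end_time = time.time()

def outlet(trace, fail=False):
    """Sketch of the pipeline's outlet step (hypothetical)."""
    try:
        if fail:
            raise RuntimeError("creating the LLM generation failed")
        # ... create the LLM generation, record usage/cost ...
    finally:
        trace.end()  # immediately before the flush, so no open trace remains
        # Flush data to Langfuse (stubbed out in this sketch)

trace = Trace()
try:
    outlet(trace, fail=True)
except RuntimeError:
    pass

assert trace.end_time is not None  # trace closed despite the failure
first_end = trace.end_time
trace.end()                        # duplicate call from the other PR: harmless
assert trace.end_time == first_end
```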



6 participants