feat(node): Add LangChain v1 support #18306
Conversation
size-limit report 📦

node-overhead report 🧳
Note: This is a synthetic benchmark with a minimal express app and does not necessarily reflect the real-world performance impact in an application.
Updated comments to reflect changes in `finish_reason` usage between versions.
nicohrubec
left a comment
Makes sense to me, just a few questions/comments. Generally it feels like the attribute extraction gets quite difficult to read; not sure if we can really improve this, though (maybe by splitting some types into pre/post v1, but I'm not sure that actually helps).
We usually try to keep things as general as possible, so splitting would create a lot of unnecessary work. Type changes typically only happen with major releases. I've moved everything to /types to make it easier to read, and hopefully this is as simple as it can be.
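For illustration only (the SDK's actual /types definitions may differ), keeping one widened metadata type rather than separate pre-v1 and v1 types could look like this; the field names below are illustrative, not the real SDK types:

```ts
// Hypothetical widened shape covering both pre-v1 and v1 response metadata;
// field names are illustrative and not taken from the actual SDK.
interface LangChainResponseMetadata {
  finish_reason?: string; // key used by some versions/providers
  stop_reason?: string; // alternative key seen in others
  tokenUsage?: {
    promptTokens?: number;
    completionTokens?: number;
    totalTokens?: number;
  };
}

// A single extraction path handles both shapes instead of branching per version.
function extractFinishReason(meta: LangChainResponseMetadata): string | undefined {
  return meta.finish_reason ?? meta.stop_reason;
}
```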
nicohrubec
left a comment
Thanks for addressing the comments, LGTM
Force-pushed from 9e0171a to 86ac8f3
This PR adds support for LangChain v1.
Changes
LangChain v1 Response Metadata Extraction
Instrument `ConfigurableModel` from `chat_models/universal.cjs`

Quick context:

The `initChatModel` unified API (introduced in LangChain v1) was not being instrumented in CJS applications. This happened because:
- `initChatModel` uses dynamic `import()` to load providers (even in CJS mode)
- the CJS instrumentation hooks `require()`, not dynamic `import()`
- provider packages such as `@langchain/openai` were therefore never patched when loaded via `initChatModel`
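For reference, this is the user-facing pattern in question; a minimal sketch, assuming LangChain v1's `initChatModel` entry point (model name and options are illustrative):

```ts
// Illustrative only: initChatModel lazily loads the provider package
// (e.g. @langchain/openai) via dynamic import(), even in a CJS app.
import { initChatModel } from 'langchain/chat_models/universal';

async function main(): Promise<void> {
  // Model name and provider are example values.
  const model = await initChatModel('gpt-4o-mini', { modelProvider: 'openai' });
  const result = await model.invoke('Say hello');
  console.log(result.content);
}

void main();
```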
Solution: Added instrumentation for the main `langchain` package. When `langchain/dist/chat_models/universal.cjs` is loaded, we patch the `ConfigurableModel` prototype. Since `initChatModel` returns a `ConfigurableModel` instance, it inherits the patched methods regardless of how the provider is loaded.
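As a rough sketch of the approach (not the actual SDK code), an OpenTelemetry-style instrumentation could hook the `langchain` package and patch the prototype when that file is required; the class and helper names here are hypothetical:

```ts
// Hedged sketch, assuming @opentelemetry/instrumentation hooks as used by the SDK.
// Span creation is elided; the wrapped method only shows where it would go.
import {
  InstrumentationBase,
  InstrumentationNodeModuleDefinition,
  InstrumentationNodeModuleFile,
} from '@opentelemetry/instrumentation';

class LangChainV1Instrumentation extends InstrumentationBase {
  constructor() {
    super('sketch-langchain-instrumentation', '0.0.1', {});
  }

  init() {
    // Hook the main `langchain` package and patch the universal chat model file.
    return new InstrumentationNodeModuleDefinition('langchain', ['>=1.0.0'], undefined, undefined, [
      new InstrumentationNodeModuleFile(
        'langchain/dist/chat_models/universal.cjs',
        ['>=1.0.0'],
        (moduleExports: any) => {
          const proto = moduleExports.ConfigurableModel?.prototype;
          if (proto) {
            // Wrap invoke() on the prototype so every ConfigurableModel instance
            // returned by initChatModel inherits the patched method, regardless
            // of how the underlying provider package was loaded.
            this._wrap(proto, 'invoke', (original: any) => {
              return function patchedInvoke(this: unknown, ...args: unknown[]) {
                // A real implementation would start a span here and record
                // request/response attributes around the original call.
                return original.apply(this, args);
              };
            });
          }
          return moduleExports;
        }
      ),
    ]);
  }
}
```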
This fix ensures `initChatModel` works in CJS applications without requiring users to manually import provider packages.

Tested the `initChatModel` API with OpenAI (now working in both ESM and CJS).

Closes https://linear.app/getsentry/issue/JS-1071/support-langchain-v1