Switch LLM streaming to OpenAI client #11138
base: main
Conversation
💡 Codex Review
Here are some automated review suggestions for this pull request.
ℹ️ About Codex in GitHub
Your team has set up Codex to review pull requests in this repo. Reviews are triggered when you:
- Open a pull request for review
- Mark a draft as ready
- Comment "@codex review".
If Codex has suggestions, it will comment; otherwise it will react with 👍.
Codex can also answer questions or update the PR. Try commenting "@codex address that feedback".
const DEBUG_LEVELS: DebugLevel[] = ['trace', 'debug', 'info', 'warn', 'error'];
const isProduction = process.env.NODE_ENV === 'production';
const envDebugLevel =
  (process.env.NEXT_PUBLIC_LOG_LEVEL as DebugLevel | undefined) ??
  (process.env.LOG_LEVEL as DebugLevel | undefined);
Guard logger against missing process.env in browsers
This module now reads process.env directly to determine the log level. Unlike the previous import.meta.env constants, process is not available in the browser or in the default Cloudflare Worker runtime, so importing this file will throw ReferenceError: process is not defined before any logging can occur. Because the logger is imported by many client-side stores and hooks, this would crash the UI in non‑Node environments. Consider reverting to compile‑time env injection or guarding the access behind a feature check.
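One possible shape for that guard, as a minimal sketch only: the `DebugLevel` type alias, the `safeEnv` helper name, and the `typeof process` feature check are illustrative assumptions, not code from this PR.

```ts
// Sketch: read env vars through a feature-checked lookup so importing the
// logger in a browser or Cloudflare Worker does not throw on `process`.
type DebugLevel = 'trace' | 'debug' | 'info' | 'warn' | 'error';

const DEBUG_LEVELS: DebugLevel[] = ['trace', 'debug', 'info', 'warn', 'error'];

// Fall back to an empty object when no Node-like `process` global exists.
const safeEnv: Record<string, string | undefined> =
  typeof process !== 'undefined' && process.env ? process.env : {};

const isProduction = safeEnv.NODE_ENV === 'production';
const envDebugLevel =
  (safeEnv.NEXT_PUBLIC_LOG_LEVEL as DebugLevel | undefined) ??
  (safeEnv.LOG_LEVEL as DebugLevel | undefined);
```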
const apiKey =
  env.NVIDIA_API_KEY ||
  cloudflareEnv.NVIDIA_API_KEY ||
  env.ANTHROPIC_API_KEY ||
  cloudflareEnv.ANTHROPIC_API_KEY;

if (!apiKey) {
  throw new Error('Missing NVIDIA_API_KEY environment variable');
}
Require NVIDIA API key instead of falling back to Anthropic
The new streaming implementation targets NVIDIA’s OpenAI-compatible endpoint, yet getAPIKey still returns ANTHROPIC_API_KEY when no NVIDIA key is set. In environments that only configured the old Anthropic key (the previous default), the call will proceed with that key and fail at runtime with an opaque 401 instead of failing fast with a clear configuration error. Now that Env declares NVIDIA_API_KEY as required, this function should throw when that key is missing rather than silently accepting an incompatible one.
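A fail-fast version might look like the sketch below; the `getAPIKey` signature and the `EnvLike` shape are assumptions inferred from the snippet above, not the PR's actual types.

```ts
// Sketch: require NVIDIA_API_KEY explicitly instead of silently falling back
// to an Anthropic key that NVIDIA's OpenAI-compatible endpoint would reject.
interface EnvLike {
  NVIDIA_API_KEY?: string;
}

export function getAPIKey(env: EnvLike, cloudflareEnv: EnvLike): string {
  const apiKey = env.NVIDIA_API_KEY || cloudflareEnv.NVIDIA_API_KEY;

  if (!apiKey) {
    // Surface a clear configuration error up front rather than an opaque 401 at request time.
    throw new Error('Missing NVIDIA_API_KEY environment variable');
  }

  return apiKey;
}
```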
No description provided.