Open AI Responses provider with Background support #381
marceloandrader
started this conversation in
Ideas
-
Do you have a link to the OpenAI documentation about this feature?
-
Hey there,
As part of a feature I'm working on, I needed to use the OpenAI Responses endpoint with a deep-research model in background mode. This returns the response object immediately while the output is still being generated, and a webhook is called when the response status is complete.
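For context, this is roughly what the background request looks like. This is a hedged sketch of the JSON body only (no HTTP call); the model name and input are placeholders, and the key detail is the `background` flag that makes the API return a queued response object immediately:

```python
import json

# Sketch of a background Responses request body.
# "o3-deep-research" and the input text are illustrative placeholders;
# "background": True is what makes the call return immediately with a
# queued response whose id can be stored for later retrieval.
payload = {
    "model": "o3-deep-research",
    "input": "Research the topic and produce a report",
    "background": True,
}

body = json.dumps(payload)
print(body)
```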
The NeuronAI chat method in my Agent, using OpenAI Responses, raises an error in this situation because the response doesn't contain a user message. It fails in:
To avoid this error I applied the following patch to src/Providers/OpenAI/Responses/OpenAIResponses.php using https://docs.cweagans.net/composer-patches/ so that I can grab the response ID in my app code. I'll live with this for now, but I was wondering whether there are any plans to better handle this background-job scenario for OpenAI.
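In case it helps frame the request: once the response ID is captured, the app has to wait for the background job, either via the webhook or by polling the response status. A minimal polling sketch (the `fetch_response` callable and its return shape are assumptions standing in for a GET on the response by ID; `queued`/`in_progress`/`completed` follow the statuses described above):

```python
import time

def wait_for_completion(fetch_response, response_id, poll_seconds=5, max_polls=120):
    """Poll a background response until it leaves the queued/in_progress states.

    fetch_response is any callable that returns the response object as a
    dict, e.g. a thin wrapper around retrieving the response by ID. This is
    a sketch of the polling alternative to webhooks; names are illustrative.
    """
    for _ in range(max_polls):
        resp = fetch_response(response_id)
        if resp["status"] not in ("queued", "in_progress"):
            return resp  # completed, failed, or cancelled
        time.sleep(poll_seconds)
    raise TimeoutError(f"response {response_id} did not finish in time")
```

A webhook handler would do the same final fetch, just triggered by the callback instead of a loop.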
Thanks