Replies: 1 comment
@xyzhu68 1. System prompts are for the LLM; the Agent framework is only responsible for forwarding them. 2. Prompts are needed to tell the Agent how to use the tool.
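For reference, this is roughly what happens at the raw chat-completions level that the Agent framework forwards to: `tools` is a field of the request, and `tool_calls` comes back on the assistant message. Below is a minimal sketch against DeepSeek's OpenAI-compatible endpoint using the `openai` Python SDK; the `get_weather` tool and its JSON schema are hypothetical placeholders, and the exact abstraction your Agent framework exposes may differ.

```python
# Minimal sketch of function calling against an OpenAI-compatible endpoint.
# The "get_weather" tool and its schema are illustrative placeholders; an
# agent framework would normally assemble this request for you.
from openai import OpenAI

client = OpenAI(api_key="YOUR_API_KEY", base_url="https://api.deepseek.com")

tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool name
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

response = client.chat.completions.create(
    model="deepseek-chat",
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is the weather in Berlin?"},
    ],
    tools=tools,  # "tools" is sent as part of the request
)

# If the model decides to call a tool, the assistant message carries "tool_calls".
message = response.choices[0].message
if message.tool_calls:
    for call in message.tool_calls:
        print(call.function.name, call.function.arguments)
```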
Hi,
in an Agent you can define an LLM, e.g. deepseek-chat. You can also define the system prompt (I think this is for the LLM, or is it for the Agent, or both?).
My question is: is it possible to do function calling with the LLM? How can I define "tools" in the request, and how can I get "tool_calls" from the response of the LLM?