LLM based inference of Workflow node parameters when they are not explicitly defined. #4193
Open
Labels: 💪 enhancement (New feature or request)
Self Checks
1. Is this request related to a challenge you're experiencing?
This request stems from an issue with Dify's ChatFlow feature. I built a chatflow with several nodes that required certain parameters, and ChatFlow let me initiate a run from a message in the chatbox. The message contained all the parameters the intermediate nodes needed to execute successfully, but those parameter values were not inferred from it, so the chatflow run failed with an error.
2. Describe the feature you'd like to see
I'd like a mechanism similar to Agent behavior: when a tool parameter is declared with `form: llm` in its schema, the system should automatically fill in the parameter value based on the user's message. This would eliminate the need for the user to prefill the value on the chatflow tool node. Without this feature, the capabilities offered by ChatFlow are very similar to those of Workflow, and there's no significant differentiation. For improved inference, it would also help to provide the entire chat history at parameter-inference time.
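For reference, a tool parameter carrying the `form: llm` flag might look roughly like this in a tool's YAML schema (field names here are illustrative, not a verbatim copy of any shipped `tool.yml`):

```yaml
parameters:
  - name: city
    type: string
    required: true
    form: llm          # value should be inferred by the LLM from the conversation
    llm_description: The city the user is asking about.
  - name: units
    type: string
    required: false
    form: form         # value is configured on the node, not inferred
```

The request is that a ChatFlow tool node treat `form: llm` parameters the way Agents already do, rather than requiring them to be prefilled.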
3. How will this feature improve your workflow or experience?
I can interact with my workflow in the same way I interact with agents.
I won't have to fill out forms.
I'll be able to publish more reusable ChatFlows: since tool parameters can be specified at execution time instead of at chatflow creation time, each published ChatFlow can serve more generic use cases.
4. Additional context or comments
This inference capability should be added to every built-in tool, likely within the `ToolNode` class after the tool configuration describing the schema has been fetched. If a parameter value is empty, the user input (and preferably the chat history) should be passed to an LLM call along with the input schema fetched from the `tool.yml` file. The LLM should then fill in the parameters passed to the tool, just as Agents currently do.
5. Can you help us with this feature?