To address the issue where the stop sequence \n is being escaped to \\n in your Dify 0.6.6 workflow, ensure that the stop sequence is correctly specified in the model configuration parameters. The stop sequence should be included as a list of strings in the completion_params of your model configuration, like so:
model_config = {
    "name": "TheBloke/codecept-coder-6.7B-instruct-AI21",
    "completion_params": {
        "stop": ["\n"],  # directly specifying the newline character
        # other parameters...
    },
    # other configuration...
}
Ensure that the stop parameter is not modified or incorrectly escaped anywhere between configuration and invocation. If the issue persists, review the code path for transformations or re-escaping, especially in utility functions or methods that manipulate string parameters.
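If you need to pin down where the escaping happens, a check at the point of invocation can distinguish a real newline from the two-character sequence backslash + n. Below is a minimal sketch; normalize_stop_sequences is a hypothetical helper, not part of the Dify API:

def normalize_stop_sequences(stop):
    # Hypothetical workaround: convert literal escape sequences such as
    # the two characters '\' + 'n' back into their control characters.
    # Note: unicode_escape assumes the strings are effectively ASCII.
    return [s.encode().decode("unicode_escape") for s in stop]

configured = ["\\n"]  # what an over-escaped pipeline produces (backslash + 'n')
print(normalize_stop_sequences(configured) == ["\n"])  # True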
The Dify codebase handles the escaping of characters in the stop sequence configuration for the LLM node in methods like _transform_chat_json_prompts and _transform_completion_json_prompts within the OpenAILargeLanguageModel class. These methods append specific escape sequences to the stop sequences for JSON or XML response formats, ensuring correct interpretation and handling.
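The general pattern those methods follow looks roughly like the sketch below. This is illustrative only, not the actual Dify source; the appended terminator strings are invented for the example:

def _append_response_format_stops(stop, response_format):
    # Illustrative pattern only: structured response formats add their own
    # terminator sequences to the caller-supplied stop list.
    stop = list(stop or [])
    if response_format == "JSON":
        stop.append("\n```")  # hypothetical terminator for a fenced JSON block
    elif response_format == "XML":
        stop.append("\n</")   # hypothetical terminator closing an XML body
    return stop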
If you're still facing issues, providing more details about your setup, configuration, and the specific part of your workflow where this issue occurs could help in identifying a more precise solution.
About Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.
Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant.
Self Checks
Dify version
0.6.6
Cloud or Self Hosted
Self Hosted (Docker)
Steps to reproduce
In a workflow, setting the stop sequence to '\n' in the LLM node gets escaped: the LLM API receives it as '\\n'.
OpenAI-API-compatible Model Provider:
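To confirm exactly what the provider receives, you can temporarily point the OpenAI-API-compatible provider at a local echo server and inspect the stop field of the request body. A diagnostic sketch, assuming Python 3 on the host and a placeholder address and port:

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class EchoHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        body = self.rfile.read(int(self.headers["Content-Length"]))
        payload = json.loads(body)
        # If escaping happened upstream, this prints ['\\n'] instead of ['\n'].
        print("stop received:", payload.get("stop"))
        self.send_response(200)
        self.end_headers()
        self.wfile.write(b"{}")  # stub response; enough to observe the payload

HTTPServer(("127.0.0.1", 8000), EchoHandler).serve_forever()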
✔️ Expected Behavior
The stop sequence configured on the frontend matches the stop configuration in the final API call.
❌ Actual Behavior
The configured stop sequence is escaped: '\n' arrives at the API as '\\n'.