
Workflow LLM node stop sequence '\n' is escaped to '\\n' #4197

Open · 4 tasks done
sagarey opened this issue May 8, 2024 · 1 comment
Labels: 🐞 bug Something isn't working

Comments

sagarey (Contributor) commented May 8, 2024

Self Checks

  • This is only for bug reports; if you would like to ask a question, please head to Discussions.
  • I have searched for existing issues, including closed ones.
  • I confirm that I am using English to submit this report (I have read and agree to the Language Policy).
  • Please do not modify this template :) and fill in all the required fields.

Dify version

0.6.6

Cloud or Self Hosted

Self Hosted (Docker)

Steps to reproduce

In the workflow, setting the stop sequence to '\n' in the LLM node gets escaped, so the LLM API receives it as '\\n'.
(Screenshot: 2024-05-09 010741)

OpenAI-API-compatible Model Provider:
(screenshot)
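
For reference, a minimal Python sketch (not Dify code) of the failure mode: if the literal two-character string backslash + n coming from the UI text field is never decoded into a real newline, the JSON payload sent to the model API carries '\\n' instead of '\n':

model_config = None  # placeholder; see sketch below

import json

# What the UI text field yields when a user types \n: the two
# characters backslash + n, not an actual newline.
stop_from_ui = "\\n"

# Serialized without unescaping, the payload carries a literal
# backslash-n, which JSON renders as "\\n".
print(json.dumps({"stop": [stop_from_ui]}))  # {"stop": ["\\n"]}

# What the API should receive: a real newline, rendered as "\n".
print(json.dumps({"stop": ["\n"]}))          # {"stop": ["\n"]}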

✔️ Expected Behavior

The stop sequence configured on the frontend should be identical to the stop config in the final API call.
(screenshot)

❌ Actual Behavior

Configuration is escaped.
(Screenshot: 2024-05-09 011833)

@dosubot dosubot bot added the 🐞 bug Something isn't working label May 8, 2024

dosubot bot commented May 8, 2024

To address the issue where the stop sequence \n is being escaped to \\n in your Dify 0.6.6 workflow, ensure that the stop sequence is correctly specified in the model configuration parameters. The stop sequence should be included as a list of strings in the completion_params of your model configuration, like so:

model_config = {
    "name": "TheBloke/codecept-coder-6.7B-instruct-AI21",
    "completion_params": {
        "stop": ["\n"],  # directly specifying the newline character
        # other parameters...
    },
    # other configuration...
}

Ensure that the stop parameter is not modified or incorrectly escaped anywhere between configuration and invocation. If the issue persists, review the code path for any transformations or re-escaping that might occur, especially in utility functions or methods that manipulate string parameters.
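
As an illustration only (this is not Dify's actual code, and the helper name is hypothetical), one way a literal escape sequence from a text field could be decoded before invocation:

def unescape_stops(stops: list[str]) -> list[str]:
    """Hypothetical helper: decode literal escape sequences such as
    "\\n" or "\\t" coming from a UI text field into real control
    characters. Caveat: unicode_escape assumes ASCII input and would
    mangle non-ASCII stop sequences."""
    return [s.encode("utf-8").decode("unicode_escape") for s in stops]

print(unescape_stops(["\\n"]))  # ['\n'] — a real newline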

The Dify codebase handles the escaping of characters in the stop sequence configuration for the LLM node in methods like _transform_chat_json_prompts and _transform_completion_json_prompts within the OpenAILargeLanguageModel class. These methods append specific escape sequences to the stop sequences for JSON or XML response formats, ensuring correct interpretation and handling.
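
As a rough sketch of that kind of transformation (the function name and the exact appended marker are assumptions, not Dify's actual implementation): an extra stop sequence is appended for JSON-format responses while the user-configured stops pass through unchanged.

def transform_json_prompt_stops(stops: list[str]) -> list[str]:
    # Assumed marker: a closing code fence commonly ends a fenced
    # JSON block, so it is appended as an additional stop sequence.
    return [*stops, "\n```"]

print(transform_json_prompt_stops(["\n"]))  # ['\n', '\n```']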

If you're still facing issues, providing more details about your setup, configuration, and the specific part of your workflow where this issue occurs could help in identifying a more precise solution.

About Dosu
This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research.

