Qwen format not taking system messages #1697

@mahobley

Description

Models that use the 'qwen' chat format don't use custom system messages. This also seems to be the case for snoozy. A few others (vicuna, openbuddy, phind, intel) seem to have a hard-coded system message, but as they don't have system templates, that might be the desired setup.

I think the only change required is in `format_qwen` in
https://github.com/abetlen/llama-cpp-python/tree/main/llama_cpp/llama_chat_format.py
at line 1012:

```diff
- system_message = "You are a helpful assistant."
+ system_message = _get_system_message(messages)
```
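To illustrate, here is a minimal, self-contained sketch of what `format_qwen` would look like with the proposed fix. The prompt layout follows Qwen's ChatML-style template, and `_get_system_message` is re-implemented here for the sake of the example; the exact code in `llama_chat_format.py` may differ.

```python
from typing import Dict, List


def _get_system_message(messages: List[Dict[str, str]]) -> str:
    """Return the first system message's content, or "" if there is none
    (mirrors the helper of the same name in llama_chat_format.py)."""
    for message in messages:
        if message["role"] == "system":
            return message["content"]
    return ""


def format_qwen(messages: List[Dict[str, str]]) -> str:
    """Sketch of format_qwen with the proposed change: honor a caller-supplied
    system message, falling back to the hard-coded default only when none is given."""
    system_message = _get_system_message(messages) or "You are a helpful assistant."
    prompt = f"<|im_start|>system\n{system_message}<|im_end|>"
    for message in messages:
        if message["role"] in ("user", "assistant"):
            prompt += f"\n<|im_start|>{message['role']}\n{message['content']}<|im_end|>"
    prompt += "\n<|im_start|>assistant\n"
    return prompt
```

With this change, a custom system message is used when present, and the original default behavior is preserved for conversations that omit one.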

Labels: bug (Something isn't working)