[WIP] .Net: Bug: When using ChatCompletionAgent and the locally deployed llama3.2:3b model, the user's Chinese question became garbled in the function call parameters. #12366


Closed
wants to merge 1 commit

Conversation


@Copilot Copilot AI commented Jun 4, 2025

Thanks for assigning this issue to me. I'm starting to work on it and will keep this PR's description up to date as I form a plan and make progress.

Original issue description:

Describe the bug
Framework: Microsoft Semantic Kernel 1.49.0

I am testing ChatCompletionAgent with a locally deployed llama3.2:3b model to query a Chinese-language knowledge base via a text search plugin. When the user asks a question in Chinese, the agent invokes the text search plugin, but the Chinese text in the function call parameters arrives garbled, causing the search to fail. Please see the screenshot below:

Screenshots

[Screenshot: the function call arguments contain garbled Chinese text]
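The garbled text has the classic shape of a UTF-8/Latin-1 mojibake: somewhere between the model's output and the plugin invocation, UTF-8 bytes appear to be re-decoded with a single-byte codec. A minimal Python sketch of the mechanism and of the round-trip repair follows; the Chinese sample string is hypothetical, not taken from the issue:

```python
# Sketch of the suspected mojibake mechanism: the UTF-8 bytes of a
# Chinese question are mistakenly decoded as Latin-1, producing gibberish.
question = "如何退货"  # hypothetical user question ("how do I return an item")

utf8_bytes = question.encode("utf-8")
garbled = utf8_bytes.decode("latin-1")  # what the plugin appears to receive

# Round-trip repair: re-encode with the wrong codec, then decode as UTF-8.
repaired = garbled.encode("latin-1").decode("utf-8")
assert repaired == question
```

If the strings in the screenshot survive this round trip, that confirms a byte-level decoding step (rather than the model itself) is corrupting the parameters.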

I attach part of my code below:

```csharp
kernelBuilder.Services.AddOllamaChatCompletion(
    modelId: LLMConfig.Instance.ConfigModel.ModelId,
    endpoint: new Uri(LLMConfig.Instance.ConfigModel.ApiEndpoint));

// The kernel must be built before plugins are added to it.
var kernel = kernelBuilder.Build();

var textEmbeddingGeneration = vectorStoreFixture.TextEmbeddingGenerationService;
var vectorSearch = vectorStoreFixture.VectorStoreRecordCollection;
var customVectorSearch = new CustomVectorSearch(vectorSearch, threshold);

// Create a text search instance using the InMemory vector store.
var textSearch = new VectorStoreTextSearch<VectorRecordModel>(
    customVectorSearch,
    textEmbeddingGeneration);

var searchPlugin = KernelPluginFactory.CreateFromFunctions(
    pluginName, description,
    [textSearch.CreateGetTextSearchResults(searchOptions: searchOptions)]);

kernel.Plugins.Add(searchPlugin);

ChatCompletionAgent faqAgent = new()
{
    Name = "SearchFAQAgent",
    Instructions = LLMConfig.Instance.ConfigModel.Instructions,
    Kernel = kernel,
    Arguments = new KernelArguments(new OllamaPromptExecutionSettings()
    {
        FunctionChoiceBehavior = FunctionChoiceBehavior.Auto()
    })
};
```
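For diagnosis, it helps to rule out the model's JSON output format as the cause. Function-call arguments typically arrive as JSON, and `\uXXXX` escape sequences in the payload are harmless: any standards-compliant parser restores the original characters. A short Python sketch (the payload is hypothetical):

```python
import json

# Unicode escapes in the function-call JSON are not the problem: a
# compliant parser decodes them back to the original Chinese characters.
payload = '{"query": "\\u5982\\u4f55\\u9000\\u8d27"}'  # hypothetical arguments
args = json.loads(payload)
assert args["query"] == "如何退货"
```

Since escaped JSON decodes cleanly, garbled parameters point instead at a byte-level step, e.g. the HTTP response body being read with the wrong charset before JSON parsing.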

Platform

  • Language: [C#]
  • AI model: [llama3.2:3b]
  • IDE: [Visual Studio]
  • OS: [Windows]

Fixes #12103.


