NET: How to disable think mode when calling the ollama model #13733
Description
Microsoft.SemanticKernel:

```csharp
public static async Task Main()
{
    Console.OutputEncoding = System.Text.Encoding.UTF8;

    var builder = Kernel.CreateBuilder();
    var modelId = "gemma4:e4b-it-q4_K_M";
    var endpoint = new Uri("http://localhost:11434");
    builder.Services.AddOllamaChatCompletion(modelId, endpoint);
    var kernel = builder.Build();

    var chatCompletionService = kernel.GetRequiredService<IChatCompletionService>();
    string prompt = "Write a C# method that calculates the number of days in the current month";
    var executionSettings = new OllamaPromptExecutionSettings
    {
        FunctionChoiceBehavior = FunctionChoiceBehavior.Auto(autoInvoke: false),
    };
    var response = await chatCompletionService.GetChatMessageContentAsync(prompt, kernel: kernel, executionSettings: executionSettings);
    Console.Write(response.Content);
}
```
How can think mode be disabled when calling the Ollama model? It would be useful if the connector exposed a method such as `chatOptions.AddOllamaOption(OllamaOption.Think, false);`.
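Until the Semantic Kernel connector exposes such an option, one possible interim workaround is to call Ollama's REST `/api/chat` endpoint directly, since the API itself accepts a top-level `think` field for thinking-capable models. A minimal sketch, reusing the model name and endpoint from the snippet above (the field name `think` is Ollama's API, not Semantic Kernel's):

```csharp
// Sketch: bypass the connector and POST to Ollama's /api/chat with "think": false.
using System.Net.Http;
using System.Text;
using System.Text.Json;

var http = new HttpClient { BaseAddress = new Uri("http://localhost:11434") };
var payload = JsonSerializer.Serialize(new
{
    model = "gemma4:e4b-it-q4_K_M",
    messages = new[]
    {
        new { role = "user", content = "Write a C# method that calculates the number of days in the current month" }
    },
    stream = false,
    think = false // ask Ollama not to emit a thinking phase
});
var response = await http.PostAsync("/api/chat",
    new StringContent(payload, Encoding.UTF8, "application/json"));
Console.WriteLine(await response.Content.ReadAsStringAsync());
```

This loses Semantic Kernel's function-calling pipeline, so it only fits cases where a plain completion is enough.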
Microsoft.Extensions.AI:

```csharp
public static async Task Main()
{
    var ollamaClient = new OllamaApiClient(new Uri("http://localhost:11434/"), "gemma4:e4b-it-q4_K_M");

    // 3. Call the Ollama API
    var client = new Microsoft.Extensions.AI.ChatClientBuilder(ollamaClient)
        .UseFunctionInvocation()
        .Build();

    ChatOptions chatOptions = new()
    {
        Tools = [AIFunctionFactory.Create(QueryDateTime), AIFunctionFactory.Create(SaveFile), AIFunctionFactory.Create(ReadFile)],
    };
    chatOptions.AddOllamaOption(OllamaOption.Think, false); // disable think

    await foreach (var res in client.GetStreamingResponseAsync("Write a C# method that calculates the number of days in the current month", chatOptions))
    {
        Console.Write(res.Text);
    }
}
```
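If disabling thinking at the request level is not possible, another fallback is to strip the reasoning block from the model's output on the client side: thinking models commonly wrap their reasoning in `<think>...</think>` before the answer. A hedged sketch (assumes that tag format; the helper name `StripThink` is hypothetical):

```csharp
// Sketch: remove a leading <think>...</think> block from a completed response.
// Note: this only works on the full text, not on individual streaming chunks,
// since the closing tag may arrive in a later chunk.
using System.Text.RegularExpressions;

static string StripThink(string text) =>
    Regex.Replace(text, @"<think>.*?</think>\s*", string.Empty, RegexOptions.Singleline);
```

For the streaming loop above, the chunks would need to be buffered into one string first and then passed through `StripThink` before display.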