
Go source code analysis: langchaingo (2)

Author: golangLeetcode
Published: 2026-03-18 17:53:29

Next, let's look at how the prompt-completion call is implemented, in github.com/tmc/langchaingo@v0.1.13/llms/llms.go:

func GenerateFromSinglePrompt(ctx context.Context, llm Model, prompt string, options ...CallOption) (string, error) {
    msg := MessageContent{
        Role:  ChatMessageTypeHuman,
        Parts: []ContentPart{TextContent{Text: prompt}},
    }
    resp, err := llm.GenerateContent(ctx, []MessageContent{msg}, options...)
    if err != nil {
        return "", err
    }
    choices := resp.Choices
    if len(choices) < 1 {
        return "", errors.New("empty response from model")
    }
    c1 := choices[0]
    return c1.Content, nil
}

The role it assigns to the prompt is:

ChatMessageTypeHuman ChatMessageType = "human"

It calls GenerateContent and takes the first entry of the returned choices list. github.com/tmc/langchaingo@v0.1.13/llms/llms.go also defines the Model interface, with its two methods GenerateContent and Call, which every LLM backend implements:

// Model is an interface multi-modal models implement.
type Model interface {
    // GenerateContent asks the model to generate content from a sequence of
    // messages. It's the most general interface for multi-modal LLMs that support
    // chat-like interactions.
    GenerateContent(ctx context.Context, messages []MessageContent, options ...CallOption) (*ContentResponse, error)
    // Call is a simplified interface for a text-only Model, generating a single
    // string response from a single string prompt.
    //
    // Deprecated: this method is retained for backwards compatibility. Use the
    // more general [GenerateContent] instead. You can also use
    // the [GenerateFromSinglePrompt] function which provides a similar capability
    // to Call and is built on top of the new interface.
    Call(ctx context.Context, prompt string, options ...CallOption) (string, error)
}

The concrete implementation for the OpenAI LLM lives in github.com/tmc/langchaingo@v0.1.13/llms/openai/openaillm.go:

// GenerateContent implements the Model interface.
func (o *LLM) GenerateContent(ctx context.Context, messages []llms.MessageContent, options ...llms.CallOption) (*llms.ContentResponse, error) { //nolint: lll, cyclop, goerr113, funlen

If a pre-call callback is registered, it is executed first:

    if o.CallbacksHandler != nil {
        o.CallbacksHandler.HandleLLMGenerateContentStart(ctx, messages)
    }

Then the roles are mapped. Since different LLMs support different role sets, langchaingo defines a superset of roles and maps each one down to the concrete model's internal roles:

for _, mc := range messages {
        msg := &ChatMessage{MultiContent: mc.Parts}
        switch mc.Role {
        case llms.ChatMessageTypeSystem:
            msg.Role = RoleSystem
        case llms.ChatMessageTypeAI:
            msg.Role = RoleAssistant
        case llms.ChatMessageTypeHuman:
            msg.Role = RoleUser

Next it extracts the tool calls embedded in the message parts and converts their format:

        newParts, toolCalls := ExtractToolParts(msg)
        msg.MultiContent = newParts
        msg.ToolCalls = toolCallsFromToolCalls(toolCalls)

Then everything is assembled into the OpenAI chat request, carrying parameters such as temperature:

req := &openaiclient.ChatRequest{
        Model:                opts.Model,
        StopWords:            opts.StopWords,
        Messages:             chatMsgs,
        StreamingFunc:        opts.StreamingFunc,
        Temperature:          opts.Temperature,
        N:                    opts.N,
        FrequencyPenalty:     opts.FrequencyPenalty,
        PresencePenalty:      opts.PresencePenalty,
        MaxCompletionTokens:  opts.MaxTokens,
        ToolChoice:           opts.ToolChoice,
        FunctionCallBehavior: openaiclient.FunctionCallBehavior(opts.FunctionCallBehavior),
        Seed:                 opts.Seed,
        Metadata:             opts.Metadata,
    }
    if opts.JSONMode {
        req.ResponseFormat = ResponseFormatJSON
    }

Next, the supported tool calls are assembled:

for _, fn := range opts.Functions {
        req.Tools = append(req.Tools, openaiclient.Tool{
            Type: "function",
            Function: openaiclient.FunctionDefinition{
                Name:        fn.Name,
                Description: fn.Description,
                Parameters:  fn.Parameters,
                Strict:      fn.Strict,
            },
        })
    }
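The conversion is a straightforward one-to-one wrap. Here is a sketch with minimal local stand-ins; `FunctionDefinition` and `Tool` below are trimmed-down illustrative versions of the client types, not the real ones:

```go
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// Trimmed local stand-ins for the client's tool types (illustration only).
type FunctionDefinition struct {
	Name        string `json:"name"`
	Description string `json:"description,omitempty"`
	Parameters  any    `json:"parameters,omitempty"`
}

type Tool struct {
	Type     string             `json:"type"`
	Function FunctionDefinition `json:"function"`
}

// toTools mirrors the loop above: every declared function becomes a
// tool of type "function" in the outgoing request.
func toTools(fns []FunctionDefinition) []Tool {
	tools := make([]Tool, 0, len(fns))
	for _, fn := range fns {
		tools = append(tools, Tool{Type: "function", Function: fn})
	}
	return tools
}

func main() {
	tools := toTools([]FunctionDefinition{
		{Name: "get_weather", Description: "look up the weather"},
	})
	enc := json.NewEncoder(os.Stdout)
	enc.SetIndent("", "  ")
	_ = enc.Encode(tools)
	fmt.Println(len(tools)) // 1
}
```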

Then the response format is set:

if o.client.ResponseFormat != nil {
        req.ResponseFormat = o.client.ResponseFormat
    }

Next, CreateChat is called to obtain the result:

result, err := o.client.CreateChat(ctx, req)

Let's trace further into how the call is made, in github.com/tmc/langchaingo@v0.1.13/llms/openai/internal/openaiclient/openaiclient.go:

func (c *Client) CreateChat(ctx context.Context, r *ChatRequest) (*ChatCompletionResponse, error) {
    if r.Model == "" {
        if c.Model == "" {
            r.Model = defaultChatModel
        } else {
            r.Model = c.Model
        }
    }
    resp, err := c.createChat(ctx, r)
    if err != nil {
        return nil, err
    }
    if len(resp.Choices) == 0 {
        return nil, ErrEmptyResponse
    }
    return resp, nil
}
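The model-selection fallback at the top of CreateChat can be distilled into a small helper. `resolveModel` and the default model name below are assumptions for illustration, not the library's actual default:

```go
package main

import "fmt"

// Assumed default for illustration; the library defines its own.
const defaultChatModel = "gpt-3.5-turbo"

// resolveModel sketches CreateChat's fallback chain: the per-request
// model wins, then the client-level model, then the package default.
func resolveModel(requestModel, clientModel string) string {
	if requestModel != "" {
		return requestModel
	}
	if clientModel != "" {
		return clientModel
	}
	return defaultChatModel
}

func main() {
	fmt.Println(resolveModel("gpt-4", "gpt-4o")) // gpt-4
	fmt.Println(resolveModel("", "gpt-4o"))      // gpt-4o
	fmt.Println(resolveModel("", ""))            // gpt-3.5-turbo
}
```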

The underlying createChat lives in github.com/tmc/langchaingo@v0.1.13/llms/openai/internal/openaiclient/chat.go:

func (c *Client) createChat(ctx context.Context, payload *ChatRequest) (*ChatCompletionResponse, error) {
    if payload.StreamingFunc != nil {
        payload.Stream = true
        if payload.StreamOptions == nil {
            payload.StreamOptions = &StreamOptions{IncludeUsage: true}
        }
    }
    // Build request payload
    payloadBytes, err := json.Marshal(payload)
    if err != nil {
        return nil, err
    }
    // Build request
    body := bytes.NewReader(payloadBytes)
    req, err := http.NewRequestWithContext(ctx, http.MethodPost, c.buildURL("/chat/completions", payload.Model), body)
    if err != nil {
        return nil, err
    }
    c.setHeaders(req)
    // Send request
    r, err := c.httpClient.Do(req)
    if err != nil {
        return nil, err
    }
    defer r.Body.Close()
    if r.StatusCode != http.StatusOK {
        msg := fmt.Sprintf("API returned unexpected status code: %d", r.StatusCode)
        // No need to check the error here: if it fails, we'll just return the
        // status code.
        var errResp errorMessage
        if err := json.NewDecoder(r.Body).Decode(&errResp); err != nil {
            return nil, errors.New(msg) // nolint:goerr113
        }
        return nil, fmt.Errorf("%s: %s", msg, errResp.Error.Message) // nolint:goerr113
    }
    if payload.StreamingFunc != nil {
        return parseStreamingChatResponse(ctx, r, payload)
    }
    // Parse response
    var response ChatCompletionResponse
    return &response, json.NewDecoder(r.Body).Decode(&response)
}
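This marshal-POST-decode round trip can be exercised locally against an httptest server standing in for the OpenAI API. The `chatRequest`/`chatResponse` types below are trimmed subsets of the real wire format, and `newFakeServer`/`postChat` are illustrative helpers, not library code:

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
	"net/http/httptest"
)

type message struct {
	Role    string `json:"role"`
	Content string `json:"content"`
}

type chatRequest struct {
	Model    string    `json:"model"`
	Messages []message `json:"messages"`
}

type chatResponse struct {
	Choices []struct {
		Message message `json:"message"`
	} `json:"choices"`
}

// newFakeServer mimics POST /chat/completions, always answering "hello".
func newFakeServer() *httptest.Server {
	return httptest.NewServer(http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
		if r.Method != http.MethodPost || r.URL.Path != "/chat/completions" {
			http.Error(w, "not found", http.StatusNotFound)
			return
		}
		_ = json.NewEncoder(w).Encode(map[string]any{
			"choices": []map[string]any{
				{"message": map[string]string{"role": "assistant", "content": "hello"}},
			},
		})
	}))
}

// postChat marshals the payload and POSTs it, mirroring createChat's flow:
// build body, send request, check status, decode JSON.
func postChat(baseURL string, reqBody chatRequest) (*chatResponse, error) {
	payloadBytes, err := json.Marshal(reqBody)
	if err != nil {
		return nil, err
	}
	resp, err := http.Post(baseURL+"/chat/completions", "application/json", bytes.NewReader(payloadBytes))
	if err != nil {
		return nil, err
	}
	defer resp.Body.Close()
	if resp.StatusCode != http.StatusOK {
		return nil, fmt.Errorf("unexpected status: %d", resp.StatusCode)
	}
	var out chatResponse
	return &out, json.NewDecoder(resp.Body).Decode(&out)
}

func main() {
	srv := newFakeServer()
	defer srv.Close()
	resp, err := postChat(srv.URL, chatRequest{Model: "test", Messages: []message{{Role: "user", Content: "hi"}}})
	if err != nil {
		panic(err)
	}
	fmt.Println(resp.Choices[0].Message.Content) // hello
}
```

The real createChat adds headers, streaming, and richer error decoding on top of this same skeleton.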

Ultimately, the /chat/completions endpoint is invoked via an HTTP POST.

Processing the response:

for i, c := range result.Choices {
        choices[i] = &llms.ContentChoice{
            Content:    c.Message.Content,
            StopReason: fmt.Sprint(c.FinishReason),
            GenerationInfo: map[string]any{
                "CompletionTokens": result.Usage.CompletionTokens,
                "PromptTokens":     result.Usage.PromptTokens,
                "TotalTokens":      result.Usage.TotalTokens,
                "ReasoningTokens":  result.Usage.CompletionTokensDetails.ReasoningTokens,
            },
        }

Handling function calls:

if c.FinishReason == "function_call" {
            choices[i].FuncCall = &llms.FunctionCall{
                Name:      c.Message.FunctionCall.Name,
                Arguments: c.Message.FunctionCall.Arguments,
            }
        }
        for _, tool := range c.Message.ToolCalls {
            choices[i].ToolCalls = append(choices[i].ToolCalls, llms.ToolCall{
                ID:   tool.ID,
                Type: string(tool.Type),
                FunctionCall: &llms.FunctionCall{
                    Name:      tool.Function.Name,
                    Arguments: tool.Function.Arguments,
                },
            })
        }

Invoking the completion callback:

    if o.CallbacksHandler != nil {
        o.CallbacksHandler.HandleLLMGenerateContentEnd(ctx, response)
    }

That completes the walkthrough of the full flow.

Originally published 2025-06-06 on the WeChat public account "golang算法架构leetcode技术php"; shared via the Tencent Cloud self-media sync program.