LangChain4j Prompt Chatbot

Introduction

This post uses Spring Boot 3.5.4 together with the langchain4j-bom. As of this writing, the latest langchain4j-bom on the official site is 1.8.0, and all of these releases require JDK 17+.

<parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>3.5.4</version>
</parent>

<properties>
    <maven.compiler.source>21</maven.compiler.source>
    <maven.compiler.target>21</maven.compiler.target>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
</properties>

<dependencyManagement>
    <dependencies>
        <dependency>
            <groupId>dev.langchain4j</groupId>
            <artifactId>langchain4j-bom</artifactId>
            <version>1.8.0</version>
            <type>pom</type>
            <scope>import</scope>
        </dependency>
    </dependencies>
</dependencyManagement>

Note: the snapshot repository below is only needed if you want to try a -SNAPSHOT build of langchain4j; the 1.8.0 release resolves from Maven Central without it.

<repositories>
    <repository>
        <name>Central Portal Snapshots</name>
        <id>central-portal-snapshots</id>
        <url>https://central.sonatype.com/repository/maven-snapshots/</url>
        <releases>
            <enabled>false</enabled>
        </releases>
        <snapshots>
            <enabled>true</enabled>
        </snapshots>
    </repository>
</repositories>

1. Prompt

Taking OpenAI as an example, add the langchain4j-open-ai-spring-boot-starter dependency, plus spring-boot-starter-webflux for the streaming endpoint:

<dependencies>

    <dependency>
        <groupId>dev.langchain4j</groupId>
        <artifactId>langchain4j-open-ai-spring-boot-starter</artifactId>
    </dependency>

    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-webflux</artifactId>
    </dependency>

    <dependency>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-test</artifactId>
        <scope>test</scope>
    </dependency>

</dependencies>
Then configure both a blocking chat model and a streaming chat model in application.yml:

langchain4j:
  open-ai:
    streaming-chat-model:
      base-url: https://api.gptsapi.net/v1
      api-key: ${OPEN_AI_KEY}
      model-name: gpt-5-chat
      log-requests: true
      log-responses: true
      return-thinking: true
    chat-model:
      base-url: https://api.gptsapi.net/v1
      api-key: ${OPEN_AI_KEY}
      model-name: gpt-5-chat
      log-requests: true
      log-responses: true
      return-thinking: true


server:
  port: 8080

Blocking calls vs. a Spring AI-style Flux "stream"

The controller below exposes a blocking endpoint backed by ChatModel and a streaming endpoint that adapts the StreamingChatModel callbacks into a Flux.

package org.example.controller;

import dev.langchain4j.data.message.ChatMessage;
import dev.langchain4j.data.message.SystemMessage;
import dev.langchain4j.data.message.UserMessage;
import dev.langchain4j.model.chat.ChatModel;
import dev.langchain4j.model.chat.StreamingChatModel;
import dev.langchain4j.model.chat.response.*;
import jakarta.annotation.Resource;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Flux;

import java.util.Arrays;
import java.util.List;

@RestController
@RequestMapping("chat")
public class ChatController {

    @Resource
    private StreamingChatModel streamingChatModel;

    @Resource
    private ChatModel chatModel;

    @GetMapping("chat")
    public String chat(String msg) {
        List<ChatMessage> messages = Arrays.asList(
                new SystemMessage("You are a math teacher. Explain mathematical concepts in a simple, easy-to-understand way."),
                new UserMessage(msg)
        );

        ChatResponse chatResponse = chatModel.chat(messages);
        return chatResponse.aiMessage().text();

    }

    @GetMapping("streaming")
    public Flux<String> streaming(String msg) {

        List<ChatMessage> messages = Arrays.asList(
                new SystemMessage("You are a math teacher. Explain mathematical concepts in a simple, easy-to-understand way."),
                new UserMessage(msg)
        );

        return Flux.create(sink -> {
            streamingChatModel.chat(messages, new StreamingChatResponseHandler() {
                @Override
                public void onPartialResponse(PartialResponse partialResponse, PartialResponseContext context) {
                    sink.next(partialResponse.text());
                }

                @Override
                public void onPartialThinking(PartialThinking partialThinking) {
                    sink.next("<thinking>" + partialThinking.text() + "</thinking>");
                }

                @Override
                public void onCompleteResponse(ChatResponse completeResponse) {
                    sink.complete();
                }

                @Override
                public void onError(Throwable error) {
                    // Propagate the failure to the Flux subscriber instead of
                    // swallowing it and completing normally.
                    sink.error(error);
                }
            });
        });

    }



}
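The Flux.create call above follows a general pattern: adapting a callback-based streaming API to a reactive publisher. The JDK's own SubmissionPublisher can illustrate the same bridge without Reactor or LangChain4j; in this sketch, TokenHandler is a hypothetical stand-in for StreamingChatResponseHandler:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;

// Stdlib illustration of the pattern behind Flux.create in the controller:
// each callback invocation becomes a published element on a reactive stream.
public class StreamingBridgeSketch {

    // Hypothetical stand-in for StreamingChatResponseHandler.
    interface TokenHandler {
        void onToken(String token); // like onPartialResponse
        void onComplete();          // like onCompleteResponse
    }

    // Fake "model" that pushes a fixed token stream through the callback.
    static void fakeModelChat(TokenHandler handler) {
        for (String token : List.of("2", " + ", "2", " = ", "4")) {
            handler.onToken(token);
        }
        handler.onComplete();
    }

    // Subscribe first, then drive the callbacks into the publisher,
    // collecting everything the subscriber receives.
    static String runAndCollect() throws InterruptedException {
        SubmissionPublisher<String> publisher = new SubmissionPublisher<>();
        List<String> received = new ArrayList<>();
        CountDownLatch done = new CountDownLatch(1);

        publisher.subscribe(new Flow.Subscriber<String>() {
            public void onSubscribe(Flow.Subscription s) { s.request(Long.MAX_VALUE); }
            public void onNext(String item) { received.add(item); }
            public void onError(Throwable t) { done.countDown(); }
            public void onComplete() { done.countDown(); }
        });

        // Each callback becomes a published element; close() signals completion,
        // just as sink.next()/sink.complete() do in the controller.
        fakeModelChat(new TokenHandler() {
            public void onToken(String token) { publisher.submit(token); }
            public void onComplete() { publisher.close(); }
        });

        done.await();
        return String.join("", received);
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runAndCollect()); // 2 + 2 = 4
    }
}
```

The ordering matters in both versions: the subscriber must be attached before tokens start flowing, which is why the controller only invokes streamingChatModel.chat inside the Flux.create lambda rather than before it.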



Source: https://blog.liuzijian.com/post/langchain4j/2025/11/04/langchain4j-prompt/
Author: Liu Zijian
Published: November 4, 2025