Introduction
As a developer with 9 years of experience in Java and Spring Boot, I’ve always been curious about how emerging technologies can blend with enterprise frameworks. Recently, I explored integrating Spring Boot with Ollama via Spring AI to build a lightweight POST API. The goal: send a question to the API and get an AI‑powered response back — all running locally.
This post walks through the setup, code, and demo, and reflects on why this integration excites me.
Project Setup
I generated the project using Spring Initializr with the following configuration:
- Project: Maven
- Language: Java
- Spring Boot Version: 4.0.6
- Group:
com.github.aleem-raja.ai - Artifact:
ollama - Dependencies: Spring Web, Ollama (Spring AI), Lombok
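If you add the Ollama dependency by hand rather than through Spring Initializr, it looks roughly like this in pom.xml. Note the artifact id is an assumption on my part: it follows the Spring AI 1.0 naming and may differ in older milestone releases.

```xml
<!-- Spring AI Ollama starter; artifact id follows Spring AI 1.0 naming
     and may differ in earlier milestone versions -->
<dependency>
    <groupId>org.springframework.ai</groupId>
    <artifactId>spring-ai-starter-model-ollama</artifactId>
</dependency>
```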
Configuration
In application.yml, I configured Ollama to connect to the local model:
spring:
  application:
    name: ollama
  ai:
    ollama:
      chat:
        options:
          model: llama3.1:8b
(Tip: Run ollama list to see which models are installed. If you see errors like model 'mistral' not found, just pull the model with ollama pull mistral or switch to one you already have.)
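If Ollama isn't running on the default port, Spring AI also lets you point at a different host. A minimal sketch, assuming the spring.ai.ollama.base-url property (the value shown is Ollama's default):

```yaml
spring:
  ai:
    ollama:
      # Default Ollama endpoint; override if your Ollama instance
      # runs on another host or port
      base-url: http://localhost:11434
```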
Code Walkthrough
DTOs
Using Lombok to reduce boilerplate:
@Data
@NoArgsConstructor
@AllArgsConstructor
public class AskRequest {
    private String question;
}

@Data
@NoArgsConstructor
@AllArgsConstructor
public class AskResponse {
    private String answer;
}
Controller
Expose a POST endpoint at /api/ask (the /api prefix comes from the class-level @RequestMapping):
import org.springframework.ai.ollama.OllamaChatModel;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api")
public class AskController {

    private final OllamaChatModel chatModel;

    // Spring injects the auto-configured Ollama chat model
    public AskController(OllamaChatModel chatModel) {
        this.chatModel = chatModel;
    }

    @PostMapping("/ask")
    public AskResponse ask(@RequestBody AskRequest request) {
        String response = chatModel.call(request.getQuestion());
        return new AskResponse(response);
    }
}
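Since I mention streaming in the conclusion, here's a rough sketch of what a streaming variant could look like. This is an assumption-heavy example, not part of the project: it relies on Spring AI's StreamingChatModel contract (which OllamaChatModel implements) and on spring-boot-starter-webflux being on the classpath, which the original dependency list doesn't include.

```java
// Hypothetical streaming endpoint (assumes spring-boot-starter-webflux
// is added to the project; the original setup only pulls in Spring Web)
@PostMapping(value = "/ask/stream", produces = MediaType.TEXT_EVENT_STREAM_VALUE)
public Flux<String> askStream(@RequestBody AskRequest request) {
    // stream(String) comes from Spring AI's StreamingChatModel contract;
    // it emits the model's answer token by token as a reactive Flux
    return chatModel.stream(request.getQuestion());
}
```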
Demo
Run the app:
mvn spring-boot:run
Send a POST request:
curl -X POST http://localhost:8080/api/ask \
-H "Content-Type: application/json" \
-d '{"question":"Explain Spring Boot in simple terms"}'
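The response mirrors the AskResponse DTO, so you should get back JSON of this shape (the answer text will of course vary by model and run):

```json
{"answer": "Spring Boot is a framework that ..."}
```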
Reflection
What excites me about this experiment is how enterprise frameworks like Spring Boot can integrate directly with local AI models. This opens up possibilities for:
- Smart backend APIs
- AI‑powered developer tools
- Lightweight prototypes without cloud dependency
It’s a reminder that even after years of working with Java, there’s always room to explore new intersections between established tech and emerging trends.
Conclusion
This project is available on GitHub:
👉 springboot-ollama-ai-demo
Try it yourself, experiment with different models, and let me know how you’d use AI in your backend workflows. I’ll continue exploring streaming responses and advanced integrations — stay tuned for updates.