Christian Tzolov

Lead developer for the Spring AI project; Staff Software Engineer at Broadcom; Apache Software Foundation Committer. Work focuses on integration and interoperability architectures, distributed and data-intensive systems, data engineering, and AI.

Blog posts by Christian Tzolov

Spring AI MCP 0.5.0 (Milestone) Released

Releases | January 13, 2025 | ...

We're pleased to announce the Spring AI MCP 0.5.0 milestone release.

Major Features & Improvements

Transport Layer Enhancements

  • New Servlet-based SSE Transport

    • Added HttpServletSseServerTransport with Servlet 6.0 support
    • Enables integration with any Java HTTP server supporting Servlets (see the sketch after this list)
    • Compatible with Jakarta Servlet API 6.1.0
    • Includes comprehensive integration tests with Tomcat
  • Enhanced WebMVC Transport

    • Replaced Spring's SseEmitter with custom BlockingQueue-based implementation
    • Improved event delivery control and connection management
    • Added dedicated session management with SSEEvent record
    • Enhanced error handling and timeout management
    • Includes comprehensive integration tests
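
To make the new transport concrete, here is a minimal sketch of registering it in a Spring Boot application. It assumes HttpServletSseServerTransport can be registered as a plain Servlet 6.0 servlet and takes an ObjectMapper plus a message endpoint path; the package name, constructor arguments, and the "/mcp/message" and "/sse" paths are illustrative assumptions, not the exact 0.5.0 API.

import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.ai.mcp.server.transport.HttpServletSseServerTransport; // package name assumed
import org.springframework.boot.web.servlet.ServletRegistrationBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class McpServletTransportConfig {

    // Hypothetical sketch: the transport is assumed to be a regular jakarta.servlet servlet,
    // so it can run on Tomcat, Jetty, or any other Servlet 6.0 container.
    @Bean
    public HttpServletSseServerTransport mcpTransport() {
        return new HttpServletSseServerTransport(new ObjectMapper(), "/mcp/message");
    }

    @Bean
    public ServletRegistrationBean<HttpServletSseServerTransport> mcpServletRegistration(
            HttpServletSseServerTransport transport) {
        // Expose the SSE connection endpoint and the message endpoint through the embedded container.
        return new ServletRegistrationBean<>(transport, "/sse", "/mcp/message");
    }
}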

Spring AI MCP 0.4.0 (Milestone) Released

Releases | January 04, 2025 | ...

We're pleased to announce the Spring AI MCP 0.4.0 milestone release.

Repository Configuration

Add this Spring milestone repository to your POM:

<repositories>
  <repository>
    <id>spring-milestones</id>
    <name>Spring Milestones</name>
    <url>https://repo.spring.io/libs-milestone-local</url>
    <snapshots>
      <enabled>false</enabled>
    </snapshots>
  </repository>
</repositories>

Major Features

Reference documentation: https://docs.spring.io/spring-ai-mcp/reference/overview.html

Enhanced Roots Management

  • Implemented proper ListRootsResult wrapping in async client
  • Added listRoots methods in server components
  • Implemented roots change notification handling in async server
  • Added roots integration tests including async notifications
  • Added support for roots change notification with single and multiple consumers
  • Improved robustness of root addition/removal scenarios

Spring AI MCP 0.3.0 (Milestone) Released

Releases | December 29, 2024 | ...

We're pleased to announce the Spring AI MCP 0.3.0 milestone release.

Repository Configuration

Add this Spring milestone repository to your POM:

<repositories>
  <repository>
    <id>spring-milestones</id>
    <name>Spring Milestones</name>
    <url>https://repo.spring.io/libs-milestone-local</url>
    <snapshots>
      <enabled>false</enabled>
    </snapshots>
  </repository>
</repositories>

Major Features

MCP Server Enhancements

  • Introduced new McpServer factory with builder pattern for flexible configuration (see the sketch after this list)
  • Added McpAsyncServer with non-blocking operations and reactive support
  • Implemented McpSyncServer as synchronous wrapper around async implementation
  • Added runtime tool management capabilities
  • Introduced server capabilities and implementation information support

Spring AI MCP 0.2.0 (Milestone) Released

Releases | December 21, 2024 | ...

We're pleased to announce the Spring AI MCP 0.2.0 milestone release.

Repository Configuration

Add this Spring milestone repository to your POM:

<repositories>
  <repository>
    <id>spring-milestones</id>
    <name>Spring Milestones</name>
    <url>https://repo.spring.io/libs-milestone-local</url>
    <snapshots>
      <enabled>false</enabled>
    </snapshots>
  </repository>
</repositories>

Breaking Changes

  • Module restructuring (see the Module Names Updated section below)
  • Renamed StdioServerTransport to StdioClientTransport

Key Features

API Updates

  • Simplified McpClient listing operations (no cursor parameter needed)
  • Added McpClient.Builder support (see the sketch below)
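
A rough sketch of the simplified client API, with package and method names assumed for illustration rather than taken verbatim from the 0.2.0 release:

import java.time.Duration;

import org.springframework.ai.mcp.client.McpClient;      // package name assumed
import org.springframework.ai.mcp.client.McpSyncClient;  // package name assumed
import org.springframework.ai.mcp.client.transport.ServerParameters;     // package name assumed
import org.springframework.ai.mcp.client.transport.StdioClientTransport; // transport renamed in this release

public class McpClientSketch {

    public static void main(String[] args) {
        // Launch an example MCP server process over stdio using the renamed StdioClientTransport.
        var transport = new StdioClientTransport(
                ServerParameters.builder("npx")
                        .args("-y", "@modelcontextprotocol/server-everything")
                        .build());

        // Hypothetical builder usage illustrating the new McpClient.Builder support.
        McpSyncClient client = McpClient.using(transport)
                .requestTimeout(Duration.ofSeconds(10))
                .sync();

        client.initialize();

        // Listing operations no longer take a cursor parameter.
        client.listTools().tools().forEach(tool -> System.out.println(tool.name()));

        client.closeGracefully();
    }
}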

Announcing Spring AI MCP: A Java SDK for the Model Context Protocol

Engineering | December 11, 2024 | ...

We're excited to introduce Spring AI MCP, a robust Java SDK implementation of the Model Context Protocol (MCP). This new addition to the Spring AI ecosystem brings standardized AI model integration capabilities to the Java platform.

What is MCP?

The Model Context Protocol (MCP) is an open protocol that standardizes how applications provide context to Large Language Models (LLMs). MCP provides a standardized way to connect AI models to different data sources and tools, making integration seamless and consistent. It helps you build agents and complex workflows on top of LLMs. LLMs frequently…

Introducing Spring AI Amazon Bedrock Nova Integration via Converse API

Engineering | December 10, 2024 | ...

The Amazon Bedrock Nova models represent a new generation of foundation models supporting a broad range of use cases, from text and image understanding to video-to-text analysis.

With the Spring AI Bedrock Converse API integration, developers can seamlessly connect to these advanced Nova models and build sophisticated conversational applications with minimal effort.

This blog post introduces the key features of Amazon Nova models, demonstrates their integration with Spring AI's Bedrock Converse API, and provides practical examples for text, image, video, document processing, and function…
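
To give a flavor of the programming model, here is a minimal sketch of a multimodal call against a Nova model; the property keys, the amazon.nova-pro-v1:0 model id, and the sample image path are illustrative assumptions rather than excerpts from the post.

// Assumed auto-configuration (application.properties); exact keys may differ by release:
//   spring.ai.bedrock.aws.region=us-east-1
//   spring.ai.bedrock.converse.chat.options.model=amazon.nova-pro-v1:0

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;
import org.springframework.core.io.ClassPathResource;
import org.springframework.util.MimeTypeUtils;

@SpringBootApplication
public class NovaConverseDemo {

    public static void main(String[] args) {
        SpringApplication.run(NovaConverseDemo.class, args);
    }

    @Bean
    CommandLineRunner demo(ChatClient.Builder chatClientBuilder) {
        return args -> {
            ChatClient chatClient = chatClientBuilder.build();

            // Combine text and image understanding in a single multimodal prompt.
            String description = chatClient.prompt()
                    .user(u -> u.text("Describe what you see in this picture.")
                            .media(MimeTypeUtils.IMAGE_PNG, new ClassPathResource("/images/sample.png")))
                    .call()
                    .content();

            System.out.println(description);
        };
    }
}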

Audio Multimodality: Expanding AI Interaction with Spring AI and OpenAI

Engineering | December 05, 2024 | ...

This blog post is co-authored by our great contributor Thomas Vitale.

OpenAI provides specialized models for speech-to-text and text-to-speech conversion, recognized for their performance and cost-efficiency. Spring AI integrates these capabilities via Voice-to-Text and Text-to-Speech (TTS).

The new Audio Generation feature (gpt-4o-audio-preview) goes further, enabling mixed input and output modalities. Audio inputs can contain richer data than text alone. Audio can convey nuanced information like tone and inflection, and together with the audio outputs it enables asynchronous speech-to-speech interactions. Additionally, this new multimodality opens up possibilities for innovative applications, such as structured data extraction. Developers can extract structured information not just from simple text, but also from images and audio, building complex, structured objects…
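
As a rough sketch of the audio-input side (assuming the gpt-4o-audio-preview model; the sample file path and the option builder method names are illustrative and may differ between Spring AI releases):

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.openai.OpenAiChatOptions;
import org.springframework.core.io.ClassPathResource;
import org.springframework.util.MimeTypeUtils;

public class AudioInputSketch {

    String summarizeRecording(ChatClient.Builder chatClientBuilder) {
        ChatClient chatClient = chatClientBuilder.build();

        // Pass an audio file as prompt media and ask the model to reason over its content.
        return chatClient.prompt()
                .options(OpenAiChatOptions.builder()
                        .withModel("gpt-4o-audio-preview") // the audio-capable model named in the post
                        .build())
                .user(u -> u.text("Summarize what is said in this recording.")
                        .media(MimeTypeUtils.parseMimeType("audio/mp3"),
                                new ClassPathResource("/audio/speech.mp3")))
                .call()
                .content();
    }
}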

Leverage the Power of 45k, free, Hugging Face Models with Spring AI and Ollama

Engineering | October 22, 2024 | ...

This blog post is co-authored by our great contributor Thomas Vitale.

Ollama now supports running any GGUF model from Hugging Face locally, giving Spring AI's Ollama integration access to over 45,000 community-created models.

We'll explore using this new feature with Spring AI. The Spring AI Ollama integration can automatically pull models that aren't available locally, for both chat completion and embeddings. This is useful when switching models or deploying to new environments.

Setting Up Spring AI with Ollama

Install Ollama on your system: https://ollama.com/download.

Tip: Spring AI also supports running Ollama via Testcontainers or integrating with an external Ollama service via Kubernetes Service Bindings.
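
What using one of these GGUF models could look like with Spring AI is sketched below; the model id follows Ollama's hf.co/{username}/{repository} naming scheme, and the property names in the comments are assumptions that may differ between releases.

// Assumed configuration (application.properties):
//   spring.ai.ollama.chat.options.model=hf.co/bartowski/Llama-3.2-1B-Instruct-GGUF
//   spring.ai.ollama.init.pull-model-strategy=when_missing

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
public class OllamaHuggingFaceDemo {

    public static void main(String[] args) {
        SpringApplication.run(OllamaHuggingFaceDemo.class, args);
    }

    @Bean
    CommandLineRunner demo(ChatClient.Builder chatClientBuilder) {
        return args -> {
            // The configured GGUF model is pulled automatically if it is not already present locally.
            ChatClient chatClient = chatClientBuilder.build();
            System.out.println(chatClient.prompt("Why is the sky blue?").call().content());
        };
    }
}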

Supercharging Your AI Applications with Spring AI Advisors

Engineering | October 02, 2024 | ...

In the rapidly evolving world of artificial intelligence, developers are constantly seeking ways to enhance their AI applications. Spring AI, a Java framework for building AI-powered applications, has introduced a powerful feature: the Spring AI Advisors.

The advisors can supercharge your AI applications, making them more modular, portable and easier to maintain.

If reading the post isn't convenient, you can listen to an experimental podcast episode, AI-generated from the blog's content.

What are Spring AI Advisors?

At their core, Spring AI Advisors are components that intercept and potentially modify the flow of chat-completion requests and responses in your AI applications. The key player in this system is the AroundAdvisor…
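
As a quick illustration, here is a minimal sketch of registering advisors on a ChatClient, assuming the advisor implementations that shipped with Spring AI at the time (MessageChatMemoryAdvisor for conversation memory and SimpleLoggerAdvisor for request/response logging):

import org.springframework.ai.chat.client.ChatClient;
import org.springframework.ai.chat.client.advisor.MessageChatMemoryAdvisor;
import org.springframework.ai.chat.client.advisor.SimpleLoggerAdvisor;
import org.springframework.ai.chat.memory.InMemoryChatMemory;
import org.springframework.ai.chat.model.ChatModel;

public class AdvisorDemo {

    String ask(ChatModel chatModel, String question) {
        ChatClient chatClient = ChatClient.builder(chatModel)
                .defaultAdvisors(
                        new MessageChatMemoryAdvisor(new InMemoryChatMemory()), // adds conversation memory
                        new SimpleLoggerAdvisor())                              // logs requests and responses
                .build();

        return chatClient.prompt()
                .user(question)
                .call()
                .content();
    }
}

Because advisors are plain components attached to the ChatClient, cross-cutting concerns such as memory, retrieval augmentation, or logging stay decoupled from your application code and can be swapped independently.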

Spring AI with NVIDIA LLM API

Engineering | August 20, 2024 | ...

Spring AI now supports NVIDIA's Large Language Model API, offering integration with a wide range of models. By leveraging NVIDIA's OpenAI-compatible API, Spring AI allows developers to use NVIDIA's LLMs through the familiar Spring AI API.

We'll explore how to configure and use the Spring AI OpenAI chat client to connect to the NVIDIA LLM API.

  • The demo application code is available in the nvidia-llm GitHub repository.
  • See also the Spring AI / NVIDIA integration documentation.

Prerequisites

  • Create an NVIDIA account with sufficient credits.
  • Select your preferred LLM model from NVIDIA's offerings, such as meta/llama-3.1-70b-instruct.
  • From the model's page, obtain the API key for your chosen model.
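
Putting it together, a minimal sketch of pointing the Spring AI OpenAI client at the NVIDIA endpoint is shown below; the builder method names (withModel, withMaxTokens) reflect the Spring AI milestones of that period and have since been renamed, and the max-tokens value is illustrative (this API expects it to be set explicitly).

import org.springframework.ai.openai.OpenAiChatModel;
import org.springframework.ai.openai.OpenAiChatOptions;
import org.springframework.ai.openai.api.OpenAiApi;

public class NvidiaLlmDemo {

    public static void main(String[] args) {
        // Point the OpenAI-compatible client at the NVIDIA LLM API endpoint.
        var openAiApi = new OpenAiApi("https://integrate.api.nvidia.com",
                System.getenv("NVIDIA_API_KEY"));

        var options = OpenAiChatOptions.builder()
                .withModel("meta/llama-3.1-70b-instruct")
                .withMaxTokens(500) // illustrative value; set the token limit explicitly
                .build();

        var chatModel = new OpenAiChatModel(openAiApi, options);
        System.out.println(chatModel.call("Tell me a joke about Java."));
    }
}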
