bonita-connector-ai

The project follows a modular architecture with a shared core and provider-specific modules:

bonita-connector-ai (parent)
|
+-- bonita-connector-ai-core        Core abstractions, shared logic
|
+-- bonita-connector-ai-openai      OpenAI provider (GPT-4o, GPT-4o-mini)
+-- bonita-connector-ai-anthropic   Anthropic provider (Claude Sonnet, Opus, Haiku)
+-- bonita-connector-ai-gemini      Google Gemini provider (Gemini 2.0 Flash, 1.5 Pro)
+-- bonita-connector-ai-mistral     Mistral AI provider (Pixtral, Mistral Large)
+-- bonita-connector-ai-azure       Azure AI Foundry provider (Azure-hosted OpenAI models)
+-- bonita-connector-ai-ollama      Ollama provider (local LLMs: Llama, Mistral, etc.)

Each provider module contains three connector implementations:

  • Ask — Send a user prompt and get a response (with optional documents and JSON schema)

  • Extract — Extract structured data from documents

  • Classify — Classify documents into predefined categories

The core module defines the template method pattern with abstract base classes (AskAiConnector, ExtractAiConnector, ClassifyAiConnector) and the AiChat interface that each provider implements using LangChain4j.
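As a rough sketch of that structure (the `AiChat` signature and the method names below are assumptions for illustration, not the actual core API):

```java
// Hypothetical sketch of the template method pattern described above.
// The real AiChat interface and abstract connectors live in
// bonita-connector-ai-core; names and signatures here are assumptions.
import java.util.Map;

interface AiChat {
    // One provider-specific call; the real signature may differ.
    String chat(String systemPrompt, String userPrompt);
}

abstract class AskAiConnector {
    // Template method: shared validation and orchestration in core...
    public String execute(Map<String, String> params) {
        String system = params.getOrDefault("systemPrompt", "You are a polite Assistant");
        String user = params.get("userPrompt");
        if (user == null || user.isBlank()) {
            throw new IllegalArgumentException("userPrompt is required");
        }
        return createChat(params).chat(system, user);
    }

    // ...while each provider module supplies only its AiChat implementation.
    protected abstract AiChat createChat(Map<String, String> params);
}

// A provider module then only implements the factory hook:
class FakeAskConnector extends AskAiConnector {
    @Override
    protected AiChat createChat(Map<String, String> params) {
        return (system, user) -> "[" + system + "] echo: " + user;
    }
}
```

This keeps prompt validation and output handling in one place while provider modules differ only in how they build the LangChain4j-backed chat.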

| Module | Provider | Default Model | API Docs |
| --- | --- | --- | --- |
| bonita-connector-ai-core | (shared abstractions) | N/A | N/A |
| bonita-connector-ai-openai | OpenAI | gpt-4o | Models |
| bonita-connector-ai-anthropic | Anthropic (Claude) | claude-sonnet-4-6 | Models |
| bonita-connector-ai-gemini | Google Gemini | gemini-2.0-flash | Models |
| bonita-connector-ai-mistral | Mistral AI | pixtral-12b-2409 | Models |
| bonita-connector-ai-azure | Azure AI Foundry | (depends on deployment) | Models |
| bonita-connector-ai-ollama | Ollama (local) | llama3.1 | Model Library |

To use a connector, add it as a dependency to your Bonita process. Choose the module for your AI provider.

OpenAI

<dependency>
    <groupId>org.bonitasoft.connectors</groupId>
    <artifactId>bonita-connector-ai-openai</artifactId>
    <version>x.y.z</version>
</dependency>

API key: OpenAI API Keys

Anthropic

<dependency>
    <groupId>org.bonitasoft.connectors</groupId>
    <artifactId>bonita-connector-ai-anthropic</artifactId>
    <version>x.y.z</version>
</dependency>

Google Gemini

<dependency>
    <groupId>org.bonitasoft.connectors</groupId>
    <artifactId>bonita-connector-ai-gemini</artifactId>
    <version>x.y.z</version>
</dependency>

API key: Google AI Studio

Mistral AI

<dependency>
    <groupId>org.bonitasoft.connectors</groupId>
    <artifactId>bonita-connector-ai-mistral</artifactId>
    <version>x.y.z</version>
</dependency>

API key: Mistral Console

Warning
Image documents are not supported yet for the Mistral connector due to a limitation of the underlying library.

Azure AI Foundry

<dependency>
    <groupId>org.bonitasoft.connectors</groupId>
    <artifactId>bonita-connector-ai-azure</artifactId>
    <version>x.y.z</version>
</dependency>

API key: Azure Portal > Azure AI Foundry > Keys and Endpoint

Note
Azure AI Foundry requires setting the url parameter to your Azure endpoint and the chatModelName to your deployment name.

Ollama

<dependency>
    <groupId>org.bonitasoft.connectors</groupId>
    <artifactId>bonita-connector-ai-ollama</artifactId>
    <version>x.y.z</version>
</dependency>

Note
Ollama allows you to run large language models locally on your infrastructure. No API key is required. Ideal for on-premises deployments, data privacy requirements, or cost optimization.
Common connector parameters

apiKey (optional, default: changeMe)
The AI provider API key. The connector resolves it in this order: the AI_API_KEY system environment variable, the AI_API_KEY JVM property (-DAI_API_KEY=xxx), this connector parameter, and finally the dummy value changeMe.

url (optional)
The AI provider endpoint URL. This parameter allows using an alternate endpoint for tests or custom deployments. Defaults per provider:

  • OpenAI: Official OpenAI endpoint

  • Anthropic: Official Anthropic endpoint

  • Gemini: Google AI Studio (automatic)

  • MistralAI: Official Mistral AI endpoint

  • Azure: Your Azure AI Foundry endpoint (required)

  • Ollama: http://localhost:11434 (default local installation)

requestTimeout (optional, default: null)
The request timeout in milliseconds for AI provider calls.

chatModelName (optional)
The model to use for chat. Defaults per provider (see the modules table above):

  • OpenAI: gpt-4o

  • Anthropic: claude-sonnet-4-6

  • Gemini: gemini-2.0-flash

  • MistralAI: pixtral-12b-2409

  • Azure: (your deployment name)

  • Ollama: llama3.1

modelTemperature (optional, default: null)
The temperature to use for the model. Higher values produce more creative responses. Must be between 0 and 1. Leave blank if the selected model does not support this parameter.
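The documented apiKey lookup order can be sketched as follows (a simplified illustration; the actual resolution logic lives in the core module and may differ):

```java
// Sketch of the documented apiKey resolution order:
// environment variable -> JVM property -> connector parameter -> "changeMe".
import java.util.stream.Stream;

class ApiKeyResolver {
    static String resolve(String connectorParam) {
        return Stream.of(System.getenv("AI_API_KEY"),
                         System.getProperty("AI_API_KEY"),
                         connectorParam)
                .filter(v -> v != null && !v.isBlank())
                .findFirst()
                .orElse("changeMe"); // dummy fallback value
    }
}
```

In practice this means an environment variable always wins over a value typed into the connector configuration.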

Example configurations per provider:

OpenAI

{
  "apiKey": "${AI_API_KEY}",
  "chatModelName": "gpt-4o",
  "systemPrompt": "You are a customer service analyst.",
  "userPrompt": "Summarize this complaint: ${complaintText}"
}

Anthropic

{
  "apiKey": "${AI_API_KEY}",
  "chatModelName": "claude-sonnet-4-6",
  "systemPrompt": "You are a legal compliance analyst.",
  "userPrompt": "Analyze this contract for GDPR compliance issues."
}

Google Gemini

{
  "apiKey": "${AI_API_KEY}",
  "chatModelName": "gemini-2.0-flash",
  "categories": "INVOICE,CONTRACT,ID_CARD,PROOF_OF_ADDRESS,OTHER"
}

Mistral AI

{
  "apiKey": "${AI_API_KEY}",
  "chatModelName": "mistral-large-latest",
  "categories": "BILLING,TECHNICAL_SUPPORT,ACCOUNT_MANAGEMENT,OTHER"
}

Azure AI Foundry

{
  "apiKey": "${AZURE_OPENAI_API_KEY}",
  "url": "https://my-company.openai.azure.com",
  "chatModelName": "gpt-4o",
  "systemPrompt": "You are a helpful assistant for HR processes.",
  "userPrompt": "Evaluate this CV: ${cvText}"
}

Ollama

{
  "apiKey": "not-needed",
  "url": "http://localhost:11434",
  "chatModelName": "llama3.1",
  "systemPrompt": "You are a document analysis assistant.",
  "userPrompt": "Extract the key dates and amounts from this invoice."
}

AI connectors can return structured data in JSON format. You can pass a JSON schema to tell the LLM how to format the response data.

When using a JSON schema, you must list all the fields you want in the JSON response in the required property.

JSON schema sample
{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "title": "ProofOfAddress",
  "type": "object",
  "required": [
    "firstName",
    "lastName",
    "fullName",
    "fullAddress",
    "emissionDate",
    "issuerName",
    "identificationNumber"
  ],
  "properties": {
    "firstName": { "type": "string" },
    "lastName": { "type": "string" },
    "fullName": { "type": "string" },
    "fullAddress": { "type": "string" },
    "emissionDate": { "type": "string" },
    "issuerName": { "type": "string" },
    "identificationNumber": { "type": "string" }
  }
}
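For the schema above, a conforming response might look like this (illustrative values only):

```json
{
  "firstName": "Jane",
  "lastName": "Doe",
  "fullName": "Jane Doe",
  "fullAddress": "12 Example Street, 75001 Paris",
  "emissionDate": "2024-03-15",
  "issuerName": "Example Utility Co.",
  "identificationNumber": "A-123456"
}
```

Because all seven fields appear in the required list, the model is instructed to produce each of them even when it must leave a value empty.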

Ask connector

Takes a user prompt, sends it to the AI provider, and returns the response. The prompt text can ask questions about a provided process document.

systemPrompt (optional, default: "You are a polite Assistant")
The system prompt to influence the behavior of the assistant and specify a default context.

userPrompt (required)
The user prompt content to send to the AI provider.

sourceDocumentRef (optional, default: null)
The reference to the process document to load and add to the user prompt. Supported formats: "doc", "docx", "pdf", … (see Apache Tika formats).

outputJsonSchema (optional, default: null)
The JSON schema that describes how to structure the JSON connector output.

The result is placed as a map entry of type java.lang.String under the key named output.

Extract connector

This connector extracts information from a Bonita document.

sourceDocumentRef (required)
The reference to the process document to load. Supported formats: "doc", "docx", "pdf", … (see Apache Tika formats).

fieldsToExtract (optional, default: null)
The list of fields to extract from the given document (e.g. List.of("firstName", "lastName", "address")).

outputJsonSchema (optional, default: null)
The JSON schema that describes how to structure the JSON connector output. If specified, the fieldsToExtract parameter is ignored.

Important
You must provide at least one of fieldsToExtract or outputJsonSchema parameters.

Classify connector

This connector classifies a Bonita process document according to a list of categories provided by the user.

sourceDocumentRef (required)
The reference to the process document to load. Supported formats: "doc", "docx", "pdf", … (see Apache Tika formats).

categories (required)
The list of categories used to classify the given document (e.g. List.of("RIB", "ID", …)). It is recommended to add a default category such as Unknown.

Sample classification result
{
  "category": "xxx",
  "confidence": 0.9
}

The confidence score is defined as:

  • [0.0..0.3]: Very uncertain or guessing

  • [0.3..0.6]: Some uncertainty, potential ambiguity exists

  • [0.6..0.8]: Reasonably certain, minor doubt

  • [0.8..1.0]: Very certain, no doubt
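The bands above can be expressed as a small helper (a hypothetical utility, not part of the connector API; assigning boundary values such as exactly 0.3 to the higher band is an assumption of this sketch):

```java
// Maps a classification confidence score to the documented band labels.
// Boundary handling (e.g. exactly 0.3 or 0.6) is an assumption here.
class ConfidenceBand {
    static String of(double confidence) {
        if (confidence < 0.0 || confidence > 1.0) {
            throw new IllegalArgumentException("confidence must be in [0, 1]");
        }
        if (confidence < 0.3) return "Very uncertain or guessing";
        if (confidence < 0.6) return "Some uncertainty, potential ambiguity exists";
        if (confidence < 0.8) return "Reasonably certain, minor doubt";
        return "Very certain, no doubt";
    }
}
```

For example, the sample result above (confidence 0.9) falls in the "Very certain, no doubt" band, which a process could use to decide whether to route the document for human review.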

Prerequisites:

  • Java (JDK 17 or higher)

  • Maven (optional if you use the Maven wrapper script)

  • A Git client (optional but highly recommended)

  • Docker and Docker Compose for integration tests

The project is a standard Maven project. For more details about Apache Maven, please refer to its documentation.

git clone https://github.com/bonitasoft/bonita-connector-ai.git
cd bonita-connector-ai/
./mvnw package
# Build only the OpenAI module (and core dependency)
./mvnw clean package -pl bonita-connector-ai-openai -am

# Build only the Anthropic module
./mvnw clean package -pl bonita-connector-ai-anthropic -am

# Build only the Gemini module
./mvnw clean package -pl bonita-connector-ai-gemini -am

# Build only the Mistral module
./mvnw clean package -pl bonita-connector-ai-mistral -am

# Build only the Azure module
./mvnw clean package -pl bonita-connector-ai-azure -am

# Build only the Ollama module
./mvnw clean package -pl bonita-connector-ai-ollama -am

The build should produce connector packages as jar and zip archives under the modules target/ folders.

Integration tests require actual AI provider endpoints. Here are the options for each provider:

# OpenAI
export OPENAI_API_KEY=your-api-key-here
./mvnw verify -PITs -pl bonita-connector-ai-openai

# Anthropic
export ANTHROPIC_API_KEY=your-api-key-here
./mvnw verify -PITs -pl bonita-connector-ai-anthropic

# Gemini
export GEMINI_API_KEY=your-api-key-here
./mvnw verify -PITs -pl bonita-connector-ai-gemini

# Mistral
export MISTRAL_API_KEY=your-api-key-here
./mvnw verify -PITs -pl bonita-connector-ai-mistral

# Azure
export AZURE_OPENAI_API_KEY=your-api-key-here
export AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
export AZURE_OPENAI_MODEL=gpt-4o
./mvnw verify -PITs -pl bonita-connector-ai-azure

Step 1: Start Ollama with Docker Compose

docker compose -f docker-compose-ollama.yml up -d

Step 2: Pull a model (first time only)

# For default model (llama3.1 - ~4.7GB)
docker exec -it ollama-test ollama pull llama3.1

# Or for faster testing with smaller model (llama3.2:1b - ~1.3GB)
docker exec -it ollama-test ollama pull llama3.2:1b

Step 3: Run integration tests

# Using default model (llama3.1)
./mvnw verify -PITs -pl bonita-connector-ai-ollama

# Using smaller model (llama3.2:1b)
export OLLAMA_MODEL_NAME="llama3.2:1b"
./mvnw verify -PITs -pl bonita-connector-ai-ollama

Step 4: Stop Ollama when done

docker compose -f docker-compose-ollama.yml down
To run all integration tests at once:

# Make sure Ollama is running and API keys are set
export OPENAI_API_KEY=your-openai-key
export ANTHROPIC_API_KEY=your-anthropic-key
export GEMINI_API_KEY=your-gemini-key
export MISTRAL_API_KEY=your-mistral-key
export AZURE_OPENAI_API_KEY=your-azure-key
export AZURE_OPENAI_ENDPOINT=https://your-resource.openai.azure.com
./mvnw verify -PITs

To add support for a new AI provider:

  1. Create a new module bonita-connector-ai-{provider} following the existing structure

  2. Add provider-specific LangChain4j dependency to the module pom.xml

  3. Create {Provider}Chat implementing the AiChat interface from the core module

  4. Create three connector classes extending the abstract connectors from core:

    • {Provider}AskConnector extends AskAiConnector

    • {Provider}ExtractDataConnector extends ExtractAiConnector

    • {Provider}ClassifyConnector extends ClassifyAiConnector

  5. Create connector definition files in src/main/resources-filtered/:

    • {provider}-ask.def / {provider}-ask.impl / {provider}-ask.properties

    • {provider}-extract.def / {provider}-extract.impl / {provider}-extract.properties

    • {provider}-classify.def / {provider}-classify.impl / {provider}-classify.properties

  6. Add the new module to the parent pom.xml <modules> section

  7. Configure maven properties for connector IDs and versions

To release a new version, maintainers may use the Release and Publication GitHub Actions workflows.

  • Running the Release workflow invokes the gitflow-maven-plugin to perform all required merges, version updates, and tag creation.

  • Running the Publication workflow builds and deploys a given tag to Maven Central.

  • A GitHub release should be created manually and associated with the tag.

Once this is done, update the Bonita marketplace repository with the new version of the connector.
