diff --git a/content/en/docs/marketplace/genai/how-to/byo_connector.md b/content/en/docs/marketplace/genai/how-to/byo_connector.md
index 1e2dd639c33..ad8538f9119 100644
--- a/content/en/docs/marketplace/genai/how-to/byo_connector.md
+++ b/content/en/docs/marketplace/genai/how-to/byo_connector.md
@@ -68,11 +68,7 @@ The Echo connector is a module in the [GenAI Showcase App](https://marketplace.m
This section allows you to focus on implementing chat completions, a fundamental capability supported by most LLMs. To make the process more practical, develop an example connector—the Echo Connector. This simple connector returns the same text as output that it receives as input, while remaining fully compatible with the chat capabilities of GenAICommons and ConversationalUI.
During development, you will learn the key considerations to keep in mind when creating your own connector. You can either start from scratch and build your own connector, or use the finished Echo Connector from the GenAI Showcase App and modify it to fit your use case.
-To enable chat completion, the key microflow to consider is `ChatCompletions_WithHistory`, located in the GenAICommons module.
-
-{{< figure src="/attachments/appstore/platform-supported-content/modules/genai/genai-howto-byo/ChatCompletions_WithHistory.png" >}}
-
-This microflow plays a crucial role as it derives and calls the appropriate microflow from the provided DeployedModel, ensuring that the module remains independent of individual connectors. This is especially important for modules like ConversationalUI, which should work seamlessly with any connector following the same principles.
+To enable chat completion, the key microflow to consider is `ChatCompletions_WithHistory`, located in the GenAICommons module. This microflow plays a crucial role as it derives and calls the appropriate microflow from the provided DeployedModel, ensuring that the module remains independent of individual connectors. This is especially important for modules like ConversationalUI, which should work seamlessly with any connector following the same principles.
To integrate properly, the microflow must supply two essential input objects:
diff --git a/content/en/docs/marketplace/genai/how-to/create-single-agent.md b/content/en/docs/marketplace/genai/how-to/create-single-agent.md
index 0430abd80dc..1875363ab88 100644
--- a/content/en/docs/marketplace/genai/how-to/create-single-agent.md
+++ b/content/en/docs/marketplace/genai/how-to/create-single-agent.md
@@ -150,7 +150,7 @@ We will add two microflows that the agent can leverage to use live app data:
* One microflow will cover the count of tickets in the database that have a specific status.
* The other microflow will cover the details of a specific ticket, given that the identifier is known.
-The final result for the function microflows used in this document can be found in the **ExampleMicroflows** folder of the [GenAI Showcase App](https://marketplace.mendix.com/link/component/220475) for reference. This example focuses only on retrieval functions, but you can also expose functions that perform actions on behalf of the user—for example, creating a new ticket, as demonstrated in the [Agent Builder Starter App](https://marketplace.mendix.com/link/component/240369).
+The final result for the function microflows used in this document can be found in the **ExampleMicroflows** module of the [GenAI Showcase App](https://marketplace.mendix.com/link/component/220475) for reference. This example focuses only on retrieval functions, but you can also expose functions that perform actions on behalf of the user—for example, creating a new ticket, as demonstrated in the [Agent Builder Starter App](https://marketplace.mendix.com/link/component/240369).
#### Function Microflow: Get Number of Tickets by Status
@@ -269,7 +269,7 @@ Create an agent that can be called to interact with the LLM. The [Agent Commons]
4. Choose the version you want to set as `In Use`.
5. Select the *Initial agent with prompt* version and click **Select**.
-### Empower the Agent
+### Empower the Agent {#empower-agent}
In order to let the agent generate responses based on specific data and information, you will connect it to two function microflows and a knowledge base. Even though the implementation is not complex—you only need to link it in the front end—it is highly recommended to be familiar with the [Integrate Function Calling into Your Mendix App](/appstore/modules/genai/how-to/howto-functioncalling/) and [Grounding Your Large Language Model in Data – Mendix Cloud GenAI](/appstore/modules/genai/how-to/howto-groundllm/#chatsetup) documents. These guides cover the foundational concepts for function calling and knowledge base retrieval.
@@ -305,9 +305,9 @@ Before adding tools via MCP, ensure you have at least one `MCPClient.MCPServerCo
1. Navigate to the agent view page for the IT-Ticket Helper agent and go to the Tools section. Add a new tool of type MCP tools.
2. Select the appropriate MCP server configuration from the available options.
- 3. Choose your import type:
- * `server`: imports all tools exposed by the server
- * `tools`: allows you to select specific tools from the server
+ 3. Choose how to add MCP tools:
+    * **Use all available tools**: imports the entire server, including all the tools it provides. This gives you less control over individual tools; if tools are added to the server in the future, they are automatically included on agent execution.
+    * **Select Tools**: allows you to import specific tools from the server and to change specific fields for individual tools.
- 4. If you selected import type `tools`, you can choose to enable all available tools or select only the specific ones you need.
+ 4. If you selected **Select Tools**, you can choose to enable all available tools or select only the specific ones you need.
5. Click **Save**. The connected server or your selected tools will now appear in the agent's tool section.
@@ -317,7 +317,7 @@ You will also connect the agent to our knowledge base, so that it can use histor
1. From the agent view page for the `IT-Ticket Helper` agent, under **Knowledge bases**, add a new knowledge base:
- * Knowledge base: select the knowledge base created in a previous step. For Mendix Cloud GenAI in particular, look for the collection `HistoricalTickets`. If nothing appears in the list, refer to the documentation of the connector on how to set it up correctly.
+    * Consumed Knowledge Base: select the knowledge base resource created in a previous step. Next, look for the collection `HistoricalTickets`. If nothing appears in the list, refer to the documentation of the connector on how to set it up correctly.
* Name: `RetrieveSimilarTickets` (expression)
* Description: `Similar tickets from the database` (expression)
* MaxNumberOfResults: empty (expression; optional)
@@ -367,7 +367,26 @@ The button does not perform any actions yet, so you need to create a microflow t
{{< figure src="/attachments/appstore/platform-supported-content/modules/genai/genai-howto-singleagent/Microflow_AgentCommons.png" >}}
-Run the app to see the agent integrated in the use case. From the **TicketHelper_Agent** page, the user can ask the model questions and receive responses. When it deems it relevant, it uses the functions or knowledge base. If you ask the agent "How many tickets are open?", a log should appear in your Studio Pro console indicating that the function microflow was executed. Furthermore, when a user submits a request like, "My VPN crashes all the time and I need it to work on important documents", the agent will search the knowledge base for similar tickets and provide a relevant solution.
+Run the app to see the agent integrated in the use case. From the **TicketHelper_Agent** page, the user can ask the model questions and receive responses. When it deems it relevant, it uses the functions or the knowledge base. If you ask the agent "How many tickets are open?", a log should appear in your Studio Pro console indicating that the function microflow was executed. Furthermore, when a user submits a request like "My VPN crashes all the time and I need it to work on important documents", the agent will search the knowledge base for similar tickets and provide a relevant solution.
+
+#### Enable User Confirmation for Tools {#user-confirmation}
+
+This is an optional step that uses the human-in-the-loop pattern to give users control over tool executions. When [adding tools to the agent](#empower-agent), you can configure a `User Access and Approval` setting to either make the tools visible to the user or require the user to confirm or reject a tool call. This way, the user stays in control of actions that the LLM requests to perform.
+
+To make this work, you need to implement a few steps. For more information, see [Human in the Loop](/appstore/modules/genai/genai-for-mx/conversational-ui/#human-in-the-loop):
+
+1. Change the `User Access and Approval` setting for one of the tools to `User Confirmation Required` in the agent editor. You may want to add a display title and description to make it more human-readable. Make sure to save the version and mark it as `In Use`.
+2. In Studio Pro, modify your microflow that calls the agent. After the agent retrieval step, add the `Create Request` action from the toolbox. All parameters can be empty except the `ID`, which you can get from the `TicketHelper` object.
+3. Afterward, add the microflow `Request_AddMessage_ToolMessages` from the ConversationalUI module. Pass the message that is associated with your **TicketHelper**.
+4. Duplicate the `Request_CallAgent_ToolUserConfirmation_Example` microflow from ConversationalUI into your own module and include it in the project. This microflow needs to be called instead of `Call Agent Without History`. You need to make some modifications to it (the annotations show the positions):
+ * Add your context object **TicketHelper** as an input parameter and pass it in the first **Call Agent Without History** action.
+ * Change the message retrieval to retrieve a **Message** from your **TicketHelper** via association.
+ * After calling the microflow `Response_CreateOrUpdateMessage`, add a `Change object` action to set the association **TicketHelper_Message** to the **Message_Conversational** object.
+ * After the decision, add an action to call the `ACT_TicketHelper_CallAgent_Commons` again to ensure that updated tool messages are sent back to the LLM.
+ * Inside the loop in the `false` path, you can open a page for the user to decide if the tool should be executed or not. For this, you may want to add the `ToolMessage_UserConfirmation_Example` page to your module.
+5. Create microflows for the **confirm** and **reject** buttons that update the status of the tool message, for example by calling the `ToolMessage_UpdateStatus` microflow. If no more pending tool messages are available, you can call **ACT_TicketHelper_CallAgent_Commons** again. Make sure to always close the popup page after a decision.
+
+Examples for both Agent Commons and GenAI Commons can be found in the [GenAI Showcase App](https://marketplace.mendix.com/link/component/220475) in the `ExampleMicroflows` module.
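+
+The confirm/reject flow configured in the steps above can be summarized in the following pseudocode sketch. The microflow and page names match the steps above; Mendix microflows are visual, so this is a conceptual outline only, not runnable code:
+
+```text
+call agent (TicketHelper):
+    Response = Call Agent Without History(agent, TicketHelper)
+    Message  = Response_CreateOrUpdateMessage(Response)
+    set the association TicketHelper_Message to the Message
+    if the Message has pending tool messages:
+        open the ToolMessage_UserConfirmation page for each pending tool message
+        # the confirm/reject buttons update the status via ToolMessage_UpdateStatus;
+        # once no tool message is pending, ACT_TicketHelper_CallAgent_Commons is
+        # called again so the tool results are sent back to the LLM
+    else:
+        show the final assistant message in the chat
+```
+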
## Define the Agent Using Microflows {#define-genai-commons}
@@ -506,13 +525,12 @@ For both approaches, you need an `MCPClient.MCPServerConfiguration` object conta
Finally, you can add a tool for knowledge base retrieval. This allows the agent to query the knowledge base for similar tickets and thus tailor a response to the user based on private knowledge. Note that the knowledge base retrieval is only supported for [Mendix Cloud GenAI Resource Packs](/appstore/modules/genai/mx-cloud-genai/resource-packs/).
-1. In the microflow `ACT_TicketHelper_CallAgent`, add a `Retrieve` action, before the request is created, to retrieve a **Deployed Knowledge Base** object:
+1. In the microflow `_ACT_TicketHelper_Agent_GenAICommons`, add a `Retrieve` action, before the request is created, to retrieve a **Consumed Knowledge Base** object:
* Source: `From database`
- * Entity: `GenAICommons.DeployedKnowledgeBase` (search for *DeployedKnowledgeBase*)
- * Xpath: `[Name = 'HistoricalTickets']` (name that was used in the [Ingest Data into Knowledge Base](#ingest-knowledge-base))
+ * Entity: `GenAICommons.ConsumedKnowledgeBase` (search for *ConsumedKnowledgeBase*)
* Range: `First`
- * Object name: `DeployedKnowledgeBase` (default)
+ * Object name: `ConsumedKnowledgeBase` (default)
2. Add the `Tools: Add Knowledge Base` action after the **Request** creation microflow:
@@ -522,13 +540,16 @@ Finally, you can add a tool for knowledge base retrieval. This allows the agent
* MetadataCollection: empty (expression; optional)
* Name: `RetrieveSimilarTickets` (expression)
* Description: `Similar tickets from the database` (expression)
- * DeployedKnowledgeBase: `DeployedKnowledgeBase` (as retrieved in step 1)
+ * ConsumedKnowledgeBase: `ConsumedKnowledgeBase` (as retrieved in step 1)
+ * CollectionIdentifier: `'HistoricalTickets'` (name that was used in the [Ingest Data into Knowledge Base](#ingest-knowledge-base))
* Use return value: `no`
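+
+In pseudocode, the retrieval and tool registration configured above look roughly as follows (a conceptual sketch of the microflow, not runnable code):
+
+```text
+# inside _ACT_TicketHelper_Agent_GenAICommons
+ConsumedKnowledgeBase = retrieve first GenAICommons.ConsumedKnowledgeBase from database
+Request = create the request (system prompt, user message, ...)
+Tools: Add Knowledge Base(
+    Request,
+    Name                  = 'RetrieveSimilarTickets',
+    Description           = 'Similar tickets from the database',
+    ConsumedKnowledgeBase = ConsumedKnowledgeBase,
+    CollectionIdentifier  = 'HistoricalTickets')
+Response = chat completions on the deployed model with the Request
+```
+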
You have successfully integrated a knowledge base into your agent interaction. Run the app to see the agent integrated in the use case. Using the **TicketHelper_Agent** page, the user can ask the model questions and receive responses. When it deems it relevant, it will use the functions or the knowledge base. If you ask the agent "How many tickets are open?", a log should appear in your Studio Pro console indicating that the function microflow was executed. Now, when a user submits a request like "My VPN crashes all the time and I need it to work on important documents", the agent will search the knowledge base for similar tickets and provide a relevant solution.
{{< figure src="/attachments/appstore/platform-supported-content/modules/genai/genai-howto-singleagent/Microflow_GenAICommons.png" >}}
+If you would like to enable [user confirmation for tools](#user-confirmation), similar to what is described for the agent above, you can find examples in the `ExampleMicroflows` module of the [GenAI Showcase App](https://marketplace.mendix.com/link/component/220475).
+
## Testing and Troubleshooting
{{% alert color="info" %}}
diff --git a/content/en/docs/marketplace/genai/how-to/ground_your_llm_in_data.md b/content/en/docs/marketplace/genai/how-to/ground_your_llm_in_data.md
index 57dc8049a39..30522ba82fa 100644
--- a/content/en/docs/marketplace/genai/how-to/ground_your_llm_in_data.md
+++ b/content/en/docs/marketplace/genai/how-to/ground_your_llm_in_data.md
@@ -177,10 +177,9 @@ To use the knowledge in a chat interface, create and adjust certain microflows a
5. After the `Request found` decision, add a `Retrieve` action. In this example, we retrieve the same as in the insertion microflow.
* **Source**: `From database`
- * **Entity**: `GenAICommons.DeployedKnowledgeBase`
- * **XPath constraint**: `[Name = 'HistoricalTickets']`
+ * **Entity**: `GenAICommons.ConsumedKnowledgeBase`
* **Range**: `First`
- * **Object name**: `DeployedKnowledgeBase_SimilarTickets`
+ * **Object name**: `ConsumedKnowledgeBase_SimilarTickets`
6. Add the `Tools: Add Knowledge Base` action with the settings shown in the image below:
diff --git a/content/en/docs/marketplace/genai/how-to/integrate_function_calling.md b/content/en/docs/marketplace/genai/how-to/integrate_function_calling.md
index e7d1614d38b..1e7ac3f6e5c 100644
--- a/content/en/docs/marketplace/genai/how-to/integrate_function_calling.md
+++ b/content/en/docs/marketplace/genai/how-to/integrate_function_calling.md
@@ -153,6 +153,10 @@ Optionally, you can change the system prompt to provide the model additional ins
4. Update the `System prompt` value to reflect your desired behavior. For example, *`Answer like a Gen Z person. Always keep your answers short.`*
5. Save the changes.
+### Optional: Setting User Access and Approval
+
+When adding tools to a request, you can optionally set a [User Access Approval](/appstore/modules/genai/genai-for-mx/commons/#enum-useraccessapproval) value to control whether the user first needs to confirm the tool before execution, or whether the tool is visible to the user at all. To show a different title and description for the tool, you can modify the `DisplayTitle` and `DisplayDescription` attributes, which are only used for display and can therefore be less technical and detailed than the `Name` and `Description` of the tool.
+
## Testing and Troubleshooting {#testing-troubleshooting}
Before testing, ensure that you have completed the Mendix Cloud GenAI, OpenAI, or Bedrock configuration as described in the [Build a Chatbot from Scratch Using the Blank GenAI App](/appstore/modules/genai/how-to/blank-app/), particularly the [Infrastructure Configuration](/appstore/modules/genai/how-to/blank-app/#config) section.
diff --git a/content/en/docs/marketplace/genai/reference-guide/agent-commons.md b/content/en/docs/marketplace/genai/reference-guide/agent-commons.md
index 426b6f51cb2..ae07ce15a69 100644
--- a/content/en/docs/marketplace/genai/reference-guide/agent-commons.md
+++ b/content/en/docs/marketplace/genai/reference-guide/agent-commons.md
@@ -136,12 +136,10 @@ For more technical details, see the [Function Calling](/appstore/modules/genai/f
##### Adding tools from MCP servers
-Besides microflow tools, tools exposed by MCP servers are also supported. To add MCP tools to an agent version, select an MCP server configuration from the [MCP client module](/appstore/modules/genai/mcp-modules/mcp-client/). You can then choose one of two import types:
+Besides microflow tools, tools exposed by MCP servers are also supported. To add MCP tools to an agent version, select an MCP server configuration from the [MCP client module](/appstore/modules/genai/mcp-modules/mcp-client/). You can then choose one of two ways to add MCP tools:
-* Server: imports the entire server, including all tools it provides.
-* Tools: allows you to import specific tools from the server.
-
-Once the agent is called, all tools currently available from the server are added to the request and are available to the model.
+* Use all available tools: imports the entire server, including all the tools it provides. This gives you less control over individual tools; if tools are added to the server in the future, they are automatically included on agent execution.
+* Select Tools: allows you to import specific tools from the server and to change specific fields for individual tools.
#### Adding Knowledge Bases
@@ -154,6 +152,8 @@ For supported knowledge bases registered in your app, you can connect them to ag
To allow an agent to perform semantic searches, add the knowledge base to the agent definition and configure the retrieval parameters, such as the number of chunks to retrieve, and the threshold similarity. Multiple knowledge bases can be added to the agent to pick from. Give each knowledge base a name and description (in human language) so that the model can decide which retrieves are necessary based on the input it gets.
+Note that [user access approval](#enum-useraccessapproval) can only be set to `HiddenForUser` or `VisibleForUser` for knowledge base retrievals.
+
#### Testing and Refining the Agent
While writing the system prompt (for both conversational and single-call types) or the user prompt (only for the single-call type), the prompt engineer can include variables by enclosing them in double braces, for example, `{{variable}}`. The actual values of these placeholders are typically known at runtime based on the user's page context.
@@ -178,7 +178,7 @@ For most use cases, a `Call Agent` microflow activity can be used. You can find
##### Call Agent with History {#call-agent-with-history}
-This action uses all defined settings, including the selected model, system prompt, tools, knowledge base, and model parameters to call the Agent using the specified `Request` and execute a `Chat Completions` operation. If a `Request` object is passed that already contains a system prompt, or a value for the parameters temperature, top P or max tokens, those values have priority and will not be overwritten by the agent configurations. If a context entity is configured, the corresponding context object must be passed so that variables in the system prompt can be replaced. The operation returns a `Response` object containing the assistant’s final message, consistent with the chat completions operations from GenAI Commons.
+This action uses all defined settings, including the selected model, system prompt, tools, knowledge base, and model parameters to call the Agent using the specified `Request` and execute a `Chat Completions` operation. If a `Request` object is passed that already contains a system prompt, or a value for the parameters temperature, top P or max tokens, those values have priority and will not be overwritten by the agent configurations. If a context entity is configured, the corresponding context object must be passed so that variables in the system prompt can be replaced. The operation returns a `Response` object containing the assistant’s final message, consistent with the chat completions operations from GenAI Commons. If the model requests tool calls that are set to be visible to the user, the response will contain those instead; see [Human in the Loop](/appstore/modules/genai/genai-for-mx/conversational-ui/#human-in-the-loop) for more information.
To use it:
@@ -196,7 +196,7 @@ Download the [Agent Builder Starter App](https://marketplace.mendix.com/link/com
##### Call Agent without History {#call-agent-without-history}
-This action is only supported by Single-call agents which have a user prompt defined as part of the agent version. It uses all defined settings, including the selected model, system prompt, user prompt, tools, knowledge base, and model parameters to call the agent by executing a `Chat Completions` operation. If any of the parameters (system prompt, temperature, top P or max tokens) should be overwritten or you want to pass an additional knowledge base or tool that is not already defined with the agent, you can do this by creating a request and adding these properties before passing it as `OptionalRequest` to the operation. If a context entity was configured, the corresponding context object must be passed so that variables in the system prompt can be replaced. The operation returns a `Response` object containing the assistant’s final message, similar to the chat completions operations from GenAI Commons.
+This action is only supported by single-call agents that have a user prompt defined as part of the agent version. It uses all defined settings, including the selected model, system prompt, user prompt, tools, knowledge base, and model parameters to call the agent by executing a `Chat Completions` operation. If any of the parameters (system prompt, temperature, top P or max tokens) should be overwritten, or you want to pass an additional knowledge base or tool that is not already defined with the agent, you can do this by creating a request and adding these properties before passing it as `OptionalRequest` to the operation. If a context entity was configured, the corresponding context object must be passed so that variables in the system prompt can be replaced. The operation returns a `Response` object containing the assistant’s final message, similar to the chat completions operations from GenAI Commons. If the model requests tool calls that are set to be visible to the user, the response will contain those instead; see [Human in the Loop](/appstore/modules/genai/genai-for-mx/conversational-ui/#human-in-the-loop) for more information.
To use it:
diff --git a/content/en/docs/marketplace/genai/reference-guide/conversational-ui.md b/content/en/docs/marketplace/genai/reference-guide/conversational-ui.md
index 7c5ccac37f9..cfcab9dc122 100644
--- a/content/en/docs/marketplace/genai/reference-guide/conversational-ui.md
+++ b/content/en/docs/marketplace/genai/reference-guide/conversational-ui.md
@@ -213,6 +213,28 @@ The following operations are used in a (custom) action microflow:
* `Get Current User Prompt` gets the current user prompt. It can be used in the [action microflow](#action-microflow) because the `CurrentUserPrompt` from the chat context is no longer available.
* `Update Assistant Response` processes the response of the model and adds the new message and any sources to the UI. This is typically one of the last steps of the logic in an [action microflow](#action-microflow). It only needs to be included at the end of the happy flow of an action microflow. Make sure to pass the response object.
+##### Using Tool or Knowledge Base calling {#action-microflow-tool-calling}
+
+Since version 6.0.0, the module stores messages from tool calling persistently in the database; these are sent along with subsequent chat messages. This makes the model aware of previously called tools (and their results). Additionally, if a tool is visible to the user or needs user confirmation before execution, the `ToolMessage` entity is used to display it. Note that this may increase token consumption, as all information sent to an LLM usually counts as input tokens.
+
+This changes how action microflows are used, because they are called each time a tool is called and the UI changes for the user, for example to display a tool call or to wait for a user decision on whether a tool can be executed. Logic that only needs to happen right after the user sends their message (preprocessing) or after the final assistant message was returned (postprocessing) should therefore be guarded so that it only runs in those cases.
+
+If no [user visibility](#enum-useraccessapproval) is configured for tools and you do not want to store tool messages (thereby retaining the behavior from versions before 6.0.0), you can set the boolean `SaveToolCallHistory` to false on the [Request](/appstore/modules/genai/genai-for-mx/commons/#request). Note that [knowledge base retrievals](/appstore/modules/genai/genai-for-mx/commons/#add-knowledge-base-to-request) are `VisibleForUser` by default.
+
+### Human in the Loop {#human-in-the-loop}
+
+When using the [Function Calling](/appstore/modules/genai/function-calling/) pattern by adding tools to the request, you can control when those tools get executed and whether they are visible to the user by setting [user access approval](#enum-useraccessapproval) per tool. Human in the loop describes a pattern where the AI can perform powerful tasks but still requires humans to take certain decisions and oversee the agent's behavior. When using the ConversationalUI module with its basic action microflow pattern to execute requests with history and its UI snippets to display the chat, human in the loop works out of the box. Note that action microflows are called until there is a final assistant response, as described [above](#action-microflow-tool-calling), even if all tools are executed without user interaction.
+
+If you're not using the ConversationalUI module for [chat with history executions](/appstore/modules/genai/genai-for-mx/commons/#chat-completions-with-history) or your use case does not contain a chat history, but is [task-focused (without history)](/appstore/modules/genai/genai-for-mx/commons/#chat-completions-without-history), you need to implement the following actions:
+
+1. Store the tool calls from the returned [Response](/appstore/modules/genai/genai-for-mx/commons/#response) in your database. You can either use your own entities or reuse `ToolMessage` from ConversationalUI. The microflow `Response_CreateOrUpdateMessage` updates or creates a `Message` object with its corresponding tool messages, based on the response from the LLM.
+2. If `UserConfirmationRequired` was enabled for a tool in the [user access approval](/appstore/modules/genai/genai-for-mx/commons/#enum-useraccessapproval) setting, you can use the tool messages to display the information and wait for the user to decide. The `pending` status of the tool message indicates that a user needs to take action. The `ToolMessage_UserConfirmation_Example` page shows an example as a popup. You can duplicate the page and modify it to fit your needs. The buttons for confirmation and rejection should call the whole action again.
+3. The content of the tool messages needs to be added to the request. [Add a message](/appstore/modules/genai/genai-for-mx/commons/#chat-add-message-to-request) with role `assistant` that contains the tool call information, and messages with role `tool` for the tool results. You can use the `Request_AddMessage_ToolMessages` microflow, passing the same message from step 1, which takes care of this.
+4. Call the chat completions action again. Be aware that the response might contain new tool calls rather than the final message, so you need to follow the steps above again. A recursive loop can be helpful, for example as shown in the `Request_CallWithoutHistory_ToolUserConfirmation_Example` microflow.
+
+For an example of a task-based (without history) use case, you can review the [GenAI Showcase App's](https://marketplace.mendix.com/link/component/220475) function calling example, especially the microflows `Task_ProcessWithFunctionCalling` and `Task_CallWithoutHistory`. Alternatively, the [How to create your first agent](/appstore/modules/genai/how-to/howto-single-agent/) documentation covers a similar example with a step-by-step guide.
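+
+The four steps above form a loop that can be sketched in pseudocode (a conceptual outline only; the microflow names follow the steps above):
+
+```text
+call with confirmation (Request):
+    Response = Chat Completions (without history)(Request)
+    Message  = Response_CreateOrUpdateMessage(Response)
+    if the Message has tool messages with status pending:
+        # wait for the user to confirm or reject each pending tool call,
+        # for example via the ToolMessage_UserConfirmation_Example popup;
+        # the confirm/reject buttons recall this whole action
+        return
+    if the Response contains tool calls:
+        Request_AddMessage_ToolMessages(Request, Message)
+        call with confirmation (Request)   # recurse until the final message
+    else:
+        return Response   # final assistant message
+```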
+
### Customizing Styling {#customize-styling}
The ConversationalUI module comes with stylesheets that are intended to work on top of Atlas Core. You can use variables and custom classes to modify the default rendering and think of colors, sizes, and positions. To learn more about customizing styling in a Mendix app in general and targeting elements using SCSS selectors, refer to the [how-to](/howto/front-end/customize-styling-new/#add-custom-styling) page.
diff --git a/content/en/docs/marketplace/genai/reference-guide/external-platforms/gemini.md b/content/en/docs/marketplace/genai/reference-guide/external-platforms/gemini.md
index cfd1a53fa33..2e53c685638 100644
--- a/content/en/docs/marketplace/genai/reference-guide/external-platforms/gemini.md
+++ b/content/en/docs/marketplace/genai/reference-guide/external-platforms/gemini.md
@@ -121,7 +121,7 @@ Gemini does not directly connect to the knowledge resources. The model returns a
This functionality is part of the implementation executed by the GenAI Commons Chat Completions operations mentioned earlier. As a developer, make the system aware of your indexes and their purpose by registering them with the request. This is done using the GenAI Commons operation [Tools: Add Knowledge Base](/appstore/modules/genai/genai-for-mx/commons/#add-knowledge-base-to-request), which must be called once per knowledge resource before passing the request to the Chat Completions operation.
-Note that the retrieval process is independent of the model provider and can be used with any model that supports function calling, as it relies on the generalized `GenAICommons.DeployedKnowledgeBase` input parameter.
+Note that the retrieval process is independent of the model provider and can be used with any model that supports function calling, as it relies on the generalized `GenAICommons.ConsumedKnowledgeBase` input parameter.
#### Vision {#chatcompletions-vision}
diff --git a/content/en/docs/marketplace/genai/reference-guide/external-platforms/mistral.md b/content/en/docs/marketplace/genai/reference-guide/external-platforms/mistral.md
index 6d00f4a0f57..11df0ae6c4f 100644
--- a/content/en/docs/marketplace/genai/reference-guide/external-platforms/mistral.md
+++ b/content/en/docs/marketplace/genai/reference-guide/external-platforms/mistral.md
@@ -129,7 +129,7 @@ Mistral does not directly connect to the knowledge resources. The model returns
This functionality is part of the implementation executed by the GenAI Commons Chat Completions operations mentioned earlier. As a developer, you need to make the system aware of your indexes and their purpose by registering them with the request. This is done using the GenAI Commons operation [Tools: Add Knowledge Base](/appstore/modules/genai/genai-for-mx/commons/#add-knowledge-base-to-request), which must be called once per knowledge resource before passing the request to the Chat Completions operation.
-Note that the retrieval process is independent of the model provider and can be used with any model that supports function calling, as it relies on the generalized `GenAICommons.DeployedKnowledgeBase` input parameter.
+Note that the retrieval process is independent of the model provider and can be used with any model that supports function calling, as it relies on the generalized `GenAICommons.ConsumedKnowledgeBase` input parameter.
#### Vision {#chatcompletions-vision}
diff --git a/content/en/docs/marketplace/genai/reference-guide/external-platforms/openai.md b/content/en/docs/marketplace/genai/reference-guide/external-platforms/openai.md
index 6e1460138ee..12d5338e5c4 100644
--- a/content/en/docs/marketplace/genai/reference-guide/external-platforms/openai.md
+++ b/content/en/docs/marketplace/genai/reference-guide/external-platforms/openai.md
@@ -179,7 +179,7 @@ OpenAI does not directly connect to the Azure AI Search resource. The model retu
This functionality is part of the implementation executed by the GenAI Commons Chat Completions operations mentioned earlier. As a developer, you need to make the system aware of your indexes and their purpose by registering them with the request. This is done using the GenAI Commons operation [Tools: Add Knowledge Base](/appstore/modules/genai/genai-for-mx/commons/#add-knowledge-base-to-request), which must be called once per index before passing the request to the Chat Completions operation.
-Note that the retrieval process is independent of the model provider and can be used with any model that supports function calling, as it relies on the generalized `GenAICommons.DeyploedKnowledgeBase`entity.
+Note that the retrieval process is independent of the model provider and can be used with any model that supports function calling, as it relies on the generalized `GenAICommons.ConsumedKnowledgeBase` entity. For Azure indexes specifically, when a collection identifier needs to be passed to an operation in this module, use the `Name` of the `Index`.
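+
+As an illustration, a knowledge base registered this way is typically exposed to the model as a standard function-calling tool. The sketch below shows the general shape of such a tool definition in the OpenAI-style wire format; the tool name, description, and parameter are illustrative examples, not the exact payload generated by the connector:
+
+```json
+{
+  "type": "function",
+  "function": {
+    "name": "retrieve_product_docs",
+    "description": "Searches the product documentation index for passages relevant to the user question.",
+    "parameters": {
+      "type": "object",
+      "properties": {
+        "query": {
+          "type": "string",
+          "description": "The search query derived from the user question."
+        }
+      },
+      "required": ["query"]
+    }
+  }
+}
+```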
#### Vision {#chatcompletions-vision}
diff --git a/content/en/docs/marketplace/genai/reference-guide/external-platforms/pg-vector-knowledge-base/_index.md b/content/en/docs/marketplace/genai/reference-guide/external-platforms/pg-vector-knowledge-base/_index.md
index 368d47f94d0..43a340bdc0b 100644
--- a/content/en/docs/marketplace/genai/reference-guide/external-platforms/pg-vector-knowledge-base/_index.md
+++ b/content/en/docs/marketplace/genai/reference-guide/external-platforms/pg-vector-knowledge-base/_index.md
@@ -77,7 +77,7 @@ Additionally, there is one activity to prepare the connection input, which is a
#### `DeployedKnowledgeBase: Create` {#create-pgvectordeployedknowledgebase}
-All operations that include knowledge base interaction need the connection details to the knowledge base. Adhering to the GenAI Commons standard, this information is conveyed in a specialization of the GenAI Commons [DeployedKnowledgeBase](/appstore/modules/genai/genai-for-mx/commons/#deployed-knowledge-base) entity (see the [Technical Reference](#technical-reference) section). After instantiating the `PgVectorKnowledgeBase` based on custom logic and/or front-end logic, this object can be used for the actual knowledge base operations.
+All operations that include knowledge base interaction need the connection details of the knowledge base. Adhering to the GenAI Commons standard, this information is conveyed in specializations of the GenAI Commons [DeployedKnowledgeBase](/appstore/modules/genai/genai-for-mx/commons/#deployed-knowledge-base) and [ConsumedKnowledgeBase](/appstore/modules/genai/genai-for-mx/commons/#consumed-knowledge-base) entities (see the [Technical Reference](#technical-reference) section). After instantiating the `PgVectorKnowledgeBase` based on custom logic and/or front-end logic, this object can be used for the actual knowledge base operations. For operations where a collection identifier is needed in combination with a `ConsumedKnowledgeBase` object, the `Name` of the knowledge base (see the `PgVectorKnowledgeBase` entity) must be passed as a string.
### (Re)populate Operations {#repopulate-operations-configuration}
diff --git a/content/en/docs/marketplace/genai/reference-guide/genai-commons.md b/content/en/docs/marketplace/genai/reference-guide/genai-commons.md
index 3be37cabd99..2e36d03af37 100644
--- a/content/en/docs/marketplace/genai/reference-guide/genai-commons.md
+++ b/content/en/docs/marketplace/genai/reference-guide/genai-commons.md
@@ -85,9 +85,27 @@ The `DeployedModel` entity replaces the capabilities that were covered by the `C
| `SupportsFunctionCalling` | An enum to specify if the model supports function calling. |
| `IsActive` | A boolean to specify if the model is active/usable with the current authentication settings and user preference. |
+#### `ConsumedKnowledgeBase` {#consumed-knowledge-base}
+
+The `ConsumedKnowledgeBase` entity represents a GenAI knowledge base resource. Each connector module that integrates with knowledge base resources implements its own specialization. If the knowledge base resource supports multiple collections of data, these collections are represented by specializations of `DeployedKnowledgeBase`. The consumed knowledge base can be added to the request when calling an LLM, along with the identifier of such a collection, so that the logic can use the chosen data in the knowledge base for text generation. The consumed knowledge base entity contains a display name, an architecture, and a label field specifying how the concept of a collection should be called in the user front end (for example, **Index** for Azure Search resources). Furthermore, it contains the name of the microflow to be executed to do a retrieval for the specified deployed knowledge base specialization, a microflow that returns the selectable options in the front end for the data collections (identifiers) inside the resource, and lastly, a microflow that, based on the chosen collection identifier, retrieves an instance of the connector-specific specialization of `DeployedKnowledgeBase`.
+
+As these objects are created as specializations by the logic in the connectors themselves, such a specialization typically contains additional provider-specific data required for the connection to the resource, such as endpoints and credentials. Admins need to configure this at runtime.
+
+This entity was introduced in module version 6.0.0. To migrate data from earlier versions, refer to the [GenAI migration guide](/appstore/modules/genai/genai-for-mx/migration-guide/#march-2026).
+
+| Attribute | Description |
+| --- | --- |
+| `DisplayName` | The display name of the consumed knowledge base. |
+| `Architecture` | The architecture of the consumed knowledge base, for example, Mendix Cloud or Amazon Bedrock. |
+| `CollectionIdentifierLabel` | The label for the concept of a collection (deployed knowledge base) in the terminology of the provider (for example, **Index** for Azure Search resources). This is used in the front end when building agents. |
+| `RetrievalMicroflow` | The microflow to execute to retrieve information for the specified knowledge base resource. |
+| `GetCollectionsMicroflow` | The microflow to execute to retrieve selectable options for collections present in the specified consumed knowledge base. |
+| `GetDeployedKnowledgeBaseMicroflow` | The microflow to execute to retrieve the connector-specific `DeployedKnowledgeBase` instance for the chosen collection identifier. |
+| `IsSelectable` | A boolean to specify if the knowledge base resource is active/usable when defining agents. |
+
#### `DeployedKnowledgeBase` {#deployed-knowledge-base}
-The `DeployedKnowledgeBase` represents a GenAI knowledge base that can be added to the request when calling an LLM. It contains a display name, a technical name (or identifier), the name of the microflow to be executed for the specified knowledge base specialization, and other relevant information to connect to the knowledge base. These objects are created by the connectors themselves (see their specializations), allowing admins to configure them at runtime.
+The `DeployedKnowledgeBase` represents a GenAI knowledge base collection that can be added to the request when calling an LLM. It refers to a discrete dataset that is part of the [ConsumedKnowledgeBase](#consumed-knowledge-base). It contains a display name, a technical name (or identifier), the name of the microflow to be executed for the specified knowledge base specialization, and other relevant information to connect to the knowledge base. These objects are created by the connectors themselves (see their specializations), allowing admins to configure them at runtime.
The `DeployedKnowledgeBase` entity replaces the capabilities covered by the `Connection` entity for knowledge base interaction in earlier versions of GenAI Commons.
@@ -117,6 +135,7 @@ The data stored in this entity is to be used later on for token consumption moni
| Attribute | Description |
| --- | --- |
+| `UsageId` | The usage ID is set internally to identify a usage based on the conversation ID. |
| `Architecture` | The architecture of the used deployed model; e.g. OpenAI or Amazon Bedrock. |
| `DeployedModelDisplayName` | DisplayName of the DeployedModel. |
| `InputTokens` | The amount of tokens consumed by an LLM call that is related to the input. |
@@ -124,6 +143,7 @@ The data stored in this entity is to be used later on for token consumption moni
| `TotalTokens` | The total amount of tokens consumed by an LLM call. |
| `DurationMilliseconds` | The duration in milliseconds of the technical part of the call to the system of the LLM provider. This excludes custom pre and postprocessing but corresponds to a complete LLM interaction. |
| `_DeploymentIdentifier` | Internal object used to identify the DeployedModel used. |
+| `EndTime` | The end time after the final model invocation is completed. |
#### `Trace` {#trace}
@@ -182,6 +202,7 @@ A tool span is created for each tool call requested by the LLM. The tool call is
| `ToolName` | The name of the tool that was called. |
| `ToolDescription` | The description of the tool. |
| `_ToolCallId` | The ID of the tool call used by the model to map an assistant message containing a tool call with the output of the tool call (tool message). |
+| `ToolCallStatus` | The current status of the ToolCall. |
`ToolSpan` was introduced in version 5.3.0.
@@ -221,6 +242,7 @@ The `Request` is an input object for the chat completions operations defined in
| `TopP` | `TopP` is an alternative to temperature for controlling the randomness of the model response. `TopP` defines a probability threshold so that only words with probabilities greater than or equal to the threshold will be included in the response. It is recommended to steer either the temperature or `TopP`, but not both. |
| `ToolChoice` | Controls which (if any) tool is called by the model. For more information, see the [ENUM_ToolChoice](#enum-toolchoice) section containing a description of the possible values. |
| `_AgentVersionId` | The `AgentVersionId` is set if the execution of the request was called from an Agent. |
+| `SaveToolCallHistory` | Indicates whether the tool calls should be stored for later continuation (the continuation logic must be implemented separately). |
#### `Message` {#message}
@@ -263,9 +285,12 @@ A tool in the tool collection. This is sent along with the request to expose a l
| `Name` | The name of the tool to call. This is used by the model in the response to identify which function needs to be called. |
| `Description` | An optional description of the tool, used by the model in addition to the name attribute to choose when and how to call the tool. |
| `ToolType` | The type of the tool. Refer to the documentation supplied by your AI provider for information about the supported types. |
-| `Microflow` | The name (string) of the microflow that this tool represents. |
+| `Microflow` | The name (string) of the microflow that this tool represents. Note that tool microflows do not respect entity access of the current user. Make sure that you only return information that the user is allowed to view, otherwise confidential information may be visible to the current user in the assistant's response. |
| `MCPServerName` | The name of the MCP server (only applicable for MCP Tools). |
| `Schema` | The schema represents the raw JSON schema defined by the tool. This is typically the case when the tool is external and not a Mendix microflow. |
+| `DisplayDescription` | (Optional) A description meant for users if tools are shown in the UI. |
+| `DisplayTitle` | (Optional) A title meant for users if tools are shown in the UI. |
+| `UserAccessApproval` | Controls how the tool calling should behave. HiddenForUser (default): automatic tool approval, tools are not shown to users. VisibleForUser: automatic tool approval, tools are visible to users. UserConfirmationRequired: the user decides whether tools are called. |
#### `Function` {#function}
@@ -316,15 +341,9 @@ A tool call object may be generated by the model in certain scenarios, such as a
| `ToolType` | The type of the tool. View AI provider documentation for supported types. |
| `ToolCallId` | This is a model-generated ID of the proposed tool call. It is used by the model to map an assistant message containing a tool call with the output of the tool call (tool message). |
| `Input` | The input is the raw tool JSON input generated by the model, usually passed for external tools where no mapping to a microflow is required. |
-
-#### `Argument` {#argument}
-
-The arguments are used to call the tool, generated by the model in JSON format. Note that the model does not always generate valid JSON and may hallucinate parameters that are not defined by your tool's schema. Mendix recommends validating the arguments in the code before calling the tool. One argument is generated for each primitive input parameter of the selected microflow.
-
-| Attribute | Description |
-| --- | --- |
-| `Key` | The name of the input parameter as given in the microflow. |
-| `Value` | The value that is passed to the input parameter. |
+| `Status` | The current status of the tool call, used to determine next steps and UI display. |
+| `ToolResult` | The result of the tool call. |
+| `IsError` | Indicates whether the tool call failed. |
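+
+To make the mapping concrete, a tool call generated by the model typically arrives in a shape like the following (OpenAI-style wire format; the ID, name, and arguments are illustrative). The `id` corresponds to `ToolCallId` and the raw `arguments` string to `Input`:
+
+```json
+{
+  "id": "call_abc123",
+  "type": "function",
+  "function": {
+    "name": "retrieve_product_docs",
+    "arguments": "{\"query\": \"return policy\"}"
+  }
+}
+```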
#### `Reference` {#reference}
@@ -433,7 +452,7 @@ It is recommended that you adapt to the same interface when developing custom ch
##### Chat Completions (with history) {#chat-completions-with-history}
-The `Chat Completions (with history)` operation supports more complex use cases where a list of (historical) messages (for example, comprising the conversation or context so far) is sent as part of the request to the LLM.
+The `Chat Completions (with history)` operation supports more complex use cases where a list of (historical) messages (for example, comprising the conversation or context so far) is sent as part of the request to the LLM. Note that the response might not be complete if tools with a [UserAccessApproval](#enum-useraccessapproval) other than `HiddenForUser` are added, or if the request specifies that the tool messages should be stored ([SaveToolCallHistory](#request)). In those cases, logic needs to be implemented to call the action again, appending the [tool calls](#toolcall) to the assistant's message and adding messages of role `tool` to the request. If you are using the [ConversationalUI](/appstore/modules/genai/genai-for-mx/conversational-ui/#human-in-the-loop) module, this is handled automatically.
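+
+For reference, this continuation follows the common provider pattern of replaying the assistant's tool call together with a `tool` message carrying the result. A hedged sketch in the OpenAI-style wire format (IDs and content are illustrative, not the connector's exact payload):
+
+```json
+[
+  { "role": "user", "content": "What is the return policy?" },
+  {
+    "role": "assistant",
+    "content": null,
+    "tool_calls": [
+      {
+        "id": "call_abc123",
+        "type": "function",
+        "function": { "name": "retrieve_product_docs", "arguments": "{\"query\": \"return policy\"}" }
+      }
+    ]
+  },
+  { "role": "tool", "tool_call_id": "call_abc123", "content": "Items can be returned within 30 days of purchase." }
+]
+```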
###### Input Parameters
@@ -450,7 +469,7 @@ The `Chat Completions (with history)` operation supports more complex use cases
##### Chat Completions (without history) {#chat-completions-without-history}
-The `Chat Completions (without history)` operation supports scenarios where there is no need to send a list of (historic) messages comprising the conversation so far as part of the request.
+The `Chat Completions (without history)` operation supports scenarios where there is no need to send a list of (historic) messages comprising the conversation so far as part of the request. Note that the response might not be complete if tools with a [UserAccessApproval](#enum-useraccessapproval) other than `HiddenForUser` are added, or if the request specifies that the tool messages should be stored ([SaveToolCallHistory](#request)). In those cases, logic needs to be implemented to call the action again, appending the [tool calls](#toolcall) to the assistant's message and adding messages of role `tool` to the request. For more information, see the [ConversationalUI documentation](/appstore/modules/genai/genai-for-mx/conversational-ui/#human-in-the-loop).
###### Input Parameters
@@ -540,7 +559,9 @@ This microflow can add a new [Message](#message) to the [Request](#request) obje
###### Return Value
-This microflow does not have a return value.
+| Name | Type | Description |
+| --- | --- | --- |
+| `Message` | [Message](#message) | The message that was created and added to the request. |
##### Create Request {#chat-create-request}
@@ -608,8 +629,11 @@ Adds a new Function to a [ToolCollection](#toolcollection) that is part of a Req
|---|---|---|---|
| `Request` | [Request](#request) | mandatory | The request to add the function to. |
| `ToolName` | String | mandatory | The name of the tool to use/call. |
-| `ToolDescription` | String | optional | An optional description of what the tool does, used by the model to choose when and how to call the tool. |
-| `FunctionMicroflow` | Microflow | mandatory | The microflow that is called within this function. |
+| `ToolDescription` | String | optional | A description of what the tool does, used by the model to choose when and how to call the tool. |
+| `FunctionMicroflow` | Microflow | mandatory | The microflow that is called within this function. A function microflow can have zero or more primitive input parameters. Additionally, a Request and/or Tool object can be added as input. The microflow needs to return a String. Note that function microflows do not respect entity access of the current user. Make sure that you only return information that the user is allowed to view, otherwise confidential information may be visible to the current user in the assistant's response. |
+| `UserAccessApproval` | [Enumeration GenAICommons.ENUM_UserAccessApproval](#enum-useraccessapproval) | optional | Controls how the tool calling should behave. HiddenForUser (default): automatic tool approval, tools are not shown to users. VisibleForUser: automatic tool approval, tools are visible to users. UserConfirmationRequired: the user decides whether tools are called. |
+| `DisplayTitle` | String | optional | A title meant for users if tools are shown in the UI. |
+| `DisplayDescription` | String | optional | A description meant for users if tools are shown in the UI. |
{{% alert color="info" %}}
Since this microflow runs in the context of the user, you can make sure that it only shows data that is relevant to the current user.
@@ -641,9 +665,9 @@ This microflow does not have a return value.
##### Tools: Add Knowledge Base {#add-knowledge-base-to-request}
-This tool adds a function that performs a retrieval from a knowledge base to a [ToolCollection](#toolcollection) that is part of a Request. Use this microflow when you have knowledge bases in your application that may be called to retrieve the required information as part of a GenAI interaction. If you want the model to be aware of these microflows, you can use this operation to add them as functions to the request. If supported by the LLM connector, the chat completion operation calls the appropriate knowledge base function based on the LLM response and continue the process until the assistant's final response is returned.
+This tool adds a function that performs a retrieval from a knowledge base to a [ToolCollection](#toolcollection) that is part of a Request. Use this microflow when you have knowledge bases in your application that may be called to retrieve the required information as part of a GenAI interaction. If you want the model to be aware of these knowledge bases, you can use this operation to add them as functions to the request. If supported by the LLM connector, the chat completion operation calls the appropriate knowledge base function based on the LLM response and continues the process until the assistant's final response is returned.
-`DeployedKnowledgeBase` objects have provider-specific specializations, for example, `Collection` for Mendix Cloud.
+`ConsumedKnowledgeBase` objects have provider-specific specializations, for example, `MxCloudKnowledgeBaseResource` for Mendix Cloud.
###### Input Parameters
@@ -652,10 +676,13 @@ This tool adds a function that performs a retrieval from a knowledge base to a [
| `Request` | [Request](#request) | mandatory | The request to which the knowledge base should be added. |
| `Name` | String | mandatory | The name of the knowledge base to use or call. Technically, this is the name of the tool that is passed to the LLM. This needs to be unique per request (if multiple tools/knowledge base retrievals are added). |
| `Description` | String | optional | A description of the knowledge base's purpose, used by the model to determine when and how to invoke it. |
-| `DeployedKnowledgeBase` | Object | mandatory | The knowledge base that is called within this tool. This object includes a `microflow`, which is executed when the knowledge base is invoked. |
+| `ConsumedKnowledgeBase` | Object | mandatory | The knowledge base resource that is called within this tool. This also determines which provider (and connector) is used. Only specialization objects are allowed, not a generalized GenAICommons object. |
+| `CollectionIdentifier` | String | mandatory | A string reference to the dataset (collection) that is part of the consumed knowledge base and contains the relevant data for the LLM. For example, for Mendix Cloud knowledge base resources, this corresponds to the name of a `Collection`. Refer to the documentation of the specific connector to learn more. |
| `MaxNumberOfResults` | Integer | optional | This can be used to limit the number of results that should be retrieved. |
| `MinimumSimilarity` | Decimal | optional | Filters the results to retrieve only chunks with a similarity score greater than or equal to the specified value. The score ranges from 0 (no similarity) to 1.0 (the same vector). |
| `MetadataCollection` | Object | optional | Optional: This contains a list for additional filtering in the retrieve. Only chunks that comply with the metadata labels will be returned. |
+| `DisplayTitle` | String | optional | A title meant for users if knowledge base retrievals are shown in the UI. |
+| `IsVisible` | Boolean | optional | If set to true, the knowledge base retrieval is visible to the user in the chat. |
###### Return Value
@@ -937,6 +964,16 @@ This microflow creates a new [MetadataCollection](#metadatacollection-entity) an
| `any` | **Any** | Any function will be called. Not available for all providers and might be changed to auto. |
| `tool` | **Tool** | A particular tool needs to be called, which is the one specified over association `ToolCollection_ToolChoice`. |
+#### `ENUM_UserAccessApproval` {#enum-useraccessapproval}
+
+`ENUM_UserAccessApproval` provides a list of ways to control how tool calling should behave in relation to user visibility and approval.
+
+| Name | Caption | Description |
+| --- | --- | --- |
+| `HiddenForUser` | **HiddenForUser** | Automatic tool approval; tools are not shown to users. |
+| `VisibleForUser` | **VisibleForUser** | Automatic tool approval; tools are visible to users. |
+| `UserConfirmationRequired` | **UserConfirmationRequired** | User decides if tools are called or not. |
+
#### `ENUM_SourceType` {#enum-sourcetype}
`ENUM_SourceType` provides a list of source types, which describes how the pointer to the `Source` attribute on the [Reference](#reference) object should be interpreted to get the source location. Currently, only `Url` is supported.
diff --git a/content/en/docs/marketplace/genai/reference-guide/migration-guide.md b/content/en/docs/marketplace/genai/reference-guide/migration-guide.md
new file mode 100644
index 00000000000..9c41bf30e09
--- /dev/null
+++ b/content/en/docs/marketplace/genai/reference-guide/migration-guide.md
@@ -0,0 +1,218 @@
+---
+title: "Release and Migration Guide for GenAI Modules"
+url: /appstore/modules/genai/genai-for-mx/migration-guide/
+linktitle: "Release and Migration Guide"
+description: "Describes the combined releases of various GenAI-related modules and their inter-module dependencies. It also includes migration steps and notices about deprecations and removals."
+weight: 1
+---
+
+During most regular release cycles, upgrading GenAI modules is seamless and does not require manual actions. However, breaking database or code changes are sometimes unavoidable to allow for future improvements.
+
+This document is intended for consumers of any of the GenAI modules. For each impactful release, it describes the affected module versions, the nature of the changes, and the actions that must be performed when upgrading to the newer versions.
+
+{{% alert color="warning" %}}
+Do not skip any of the major versions listed below that are indicated as containing deprecations and/or requiring migration.
+
+In each major release, modules remove the **entities, associations, and attributes** that were marked as deprecated in the previous major release (indicated by the `_DEPR` suffix on the domain model element). This means:
+
+If you are on **v3.x.x** and want to reach **v5.0.0**, you must first upgrade to **v4.0.0**, do a deployment, and perform all required migration steps before proceeding further.
+Skipping a major version may result in data loss, broken logic, and/or failed deployments.
+
+Correct upgrade path: v3.x.x → v4.0.0 (migrate) → v5.0.0 (removes deprecated elements)
+
+Unsupported path: v3.x.x → v5.0.0
+
+{{% /alert %}}
+
+## General recommendations
+
+Mendix recommends taking the following steps per release to ensure a smooth upgrade without data loss. For the details of each release, refer to the sections below.
+
+- Read the full migration guide for the specific release and make sure you cover each module that is used in your app
+- Perform the upgrade first in a non-production environment
+- Back up your database before starting
+- Upgrade all modules to the versions listed in the upgrade matrix for the release
+- Update any custom application logic referencing deprecated entities, associations, attributes, microflows, or enumerations
+- Run all required migration microflows upon starting the application (for example, as part of the after-startup microflow)
+- Verify migration results in the running app
+- Test your application thoroughly
+- Perform the upgrade and migrations in production
+
+
+## Releases
+
+The sections below cover each release increment of a set of modules that are released at the same moment in time. For upgrade paths that do not cover any of the below-mentioned module releases, no additional actions are required.
+
+### Release March 2026 {#march-2026}
+
+This section describes breaking changes and required actions for a number of GenAI modules released in early March 2026. The changes prepare the domain models for future improvements to agent definitions that use MCP tools and knowledge bases.
+
+
+{{% alert color="warning" %}}
+This release contains **breaking changes** across several modules. Skipping these major versions completely is **not supported**: a number of migrations must be performed to avoid data loss or application failure in a later release. Please read this guide carefully before upgrading.
+{{% /alert %}}
+
+
+#### Affected modules and versions
+
+The following module versions are released as **compatible** with each other and should be upgraded together.
+
+| Module | Previous Versions | New Version | Contains deprecations | Requires migration |
+|---------------------|-----------------|-------------|-----------------|----|
+| GenAI Commons | 5.x.x | 6.0.0 | No | Yes, as part of dependent modules |
+| Agent Commons | 2.x.x | 3.0.0 | Yes | Yes |
+| MCP Client | 2.x.x | 3.0.0 | Yes | No, but update required for other migrations to work |
+| OpenAI Connector | 7.x.x | 8.0.0 | Yes | Yes |
+| Amazon Bedrock Connector | 9.x.x | 10.0.0 | No | Yes |
+| PgVector Knowledge Base | 5.x.x | 6.0.0 | Yes | Yes |
+| Mendix Cloud GenAI Connector | 5.x.x | 6.0.0 | No | Yes |
+
+{{% alert color="info" %}}
+Even if a module has no deprecations, Mendix strongly recommends upgrading all modules together according to the above table. This ensures that the migrations of other modules can work properly.
+{{% /alert %}}
+
+
+#### Migration Guide per Topic
+
+Migration steps are grouped by topic rather than by module, as some changes span multiple modules.
+
+##### Single MCP Tools used by Agent definitions
+- **Agent Commons**: v2.x.x → v3.0.0
+- **MCP Client**: v2.x.x → v3.0.0
+
+###### What Changed
+- The association from entity `SingleMCPTool` towards the entity `MCPTool` has been deprecated.
+- Entity `SingleMCPTool` has a new association `SingleMCPTool_ConsumedMCPService` and a new attribute `Tool`.
+
+###### Impact
+Agent definitions containing Single MCP tools require migration to prevent failing agent calls at runtime.
+
+Migration is only required if your app uses Agent definitions containing Single MCP tools.
+
+###### Required Actions
+
+To prevent having to recreate existing data concerning Agent definitions, perform the following steps:
+
+1. Upgrade the **MCP Client** to v3.0.0 in your Mendix project.
+1. Upgrade the **Agent Commons module** to v3.0.0 in your Mendix project.
+1. **Run the data migration microflow** upon starting your application (e.g. include it in the after-startup):
+ ```
+ AgentCommons > USE_ME > Migration > SingleMCPTool_Migrate
+ ```
+
+ This microflow will set the new association and attribute on existing `SingleMCPTool` records.
+
+1. **Update any custom logic or pages** in your application that reference:
+    - The old `MCPTool_DEPR` entity or its attributes in the `MCPClient` module. Available tools are no longer cached; in cases where the actual list of available tools is required, refer to the microflow `ConsumedMCPService_ListTools`.
+1. **Verify** your application compiles and runs correctly before deploying to cloud environments.
+
+
+{{% alert color="info" %}}
+The deprecated `MCPTool_DEPR` entity and the related attributes and association will be **permanently removed** in the next major version of the **MCP Client** module, which will be **v4.0.0**.
+
+Ensure the migration microflow has been run before upgrading to the next major version.
+{{% /alert %}}
+
+
+
+##### Consumed Knowledge Bases
+
+- **GenAI Commons**: v5.x.x → v6.0.0
+- **Amazon Bedrock Connector**: v9.x.x → v10.0.0
+- **Mendix Cloud GenAI Connector**: v5.x.x → v6.0.0
+- **OpenAI Connector**: v7.x.x → v8.0.0
+- **PgVector Knowledge Base**: v5.x.x → v6.0.0
+
+###### What Changed
+- A new entity `ConsumedKnowledgeBase` has been added to the domain model of GenAI Commons. Each connector that provides logic to interact with Deployed Knowledge Bases now provides a specialization for this new entity.
+- In the **Amazon Bedrock Connector** module, entity `BedrockConsumedKnowledgeBase` was added as a specialization of `ConsumedKnowledgeBase`.
+- In the **Mendix Cloud GenAI Connector** module, existing entity `MxCloudKnowledgeBaseResource` is now a specialization of `ConsumedKnowledgeBase`.
+- In the **OpenAI Connector** module, existing entity `AzureAISearchResource` is now a specialization of `ConsumedKnowledgeBase`. The `DisplayName` attribute has been deprecated and replaced by the attribute on the generalization.
+- In the **PgVector Knowledge Base** module, existing entity `DatabaseConfiguration` is now a specialization of `ConsumedKnowledgeBase`. The `DisplayName` attribute has been deprecated and replaced by the attribute on the generalization.
+
+###### Impact
+Agent definitions that use knowledge bases require migration to prevent agent calls from failing at runtime.
+Existing knowledge base configurations in any of the mentioned connector modules also require migration to prevent knowledge base calls from failing at runtime.
+
+Migration is only required if your app interacts with knowledge bases from any of the mentioned modules, or contains existing data for such knowledge base configurations.
+
+###### Required Actions
+
+To avoid having to recreate existing agent definitions and knowledge base configurations, perform the following steps:
+
+1. Upgrade the **GenAI Commons** module to v6.0.0 in your Mendix project.
+1. If present, upgrade the **Agent Commons** module to v3.0.0 in your Mendix project.
+
+1. If your app has the **Amazon Bedrock Connector** module:
+
+ 1. Upgrade the **Amazon Bedrock Connector** module to v10.0.0 in your Mendix project.
+ 1. Include logic to **run the data migration microflow** when your application starts (for example, include it in the after-startup microflow):
+ ```
+ AmazonBedrockConnector > USE_ME > Migration > ConsumedKnowledgeBase_Migrate
+ ```
+ This microflow ensures that the new attributes on the generalization are set properly and that the `DisplayName` field is migrated.
+ 1. If the **Agent Commons** module is part of your app as well and there are agents defined that use knowledge bases, include the following initially excluded submicroflow in the project and add a call to it as described in the annotation in the above-mentioned microflow:
+ ```
+ AmazonBedrockConnector > USE_ME > Migration > AmazonBedrock_KnowledgeBase_Migrate
+ ```
+ This microflow sets the `CollectionIdentifier` field on the `KnowledgeBase` entity, as well as the outgoing reference to the `ConsumedKnowledgeBase`.
+
+1. If your app has the **Mendix Cloud GenAI Connector** module:
+
+ 1. Upgrade the **Mendix Cloud GenAI Connector** module to v6.0.0 in your Mendix project.
+ 1. Include logic to **run the data migration microflow** when your application starts (for example, include it in the after-startup microflow):
+ ```
+ MxGenAIConnector > USE_ME > Migration > ConsumedKnowledgeBase_Migrate
+ ```
+ This microflow ensures that the new attributes on the generalization are set properly and that the `DisplayName` field is migrated.
+ 1. If the **Agent Commons** module is part of your app as well and there are agents defined that use knowledge bases, include the following initially excluded submicroflow in the project and add a call to it as described in the annotation in the above-mentioned microflow:
+ ```
+ MxGenAIConnector > USE_ME > Migration > MxGenAI_KnowledgeBase_Migrate
+ ```
+ This microflow sets the `CollectionIdentifier` field on the `KnowledgeBase` entity, as well as the outgoing reference to the `ConsumedKnowledgeBase`.
+
+1. If your app has the **OpenAI Connector** module:
+
+ 1. Upgrade the **OpenAI Connector** module to v8.0.0 in your Mendix project.
+ 1. Include logic to **run the data migration microflow** when your application starts (for example, include it in the after-startup microflow):
+ ```
+ OpenAIConnector > USE_ME > Migration > ConsumedKnowledgeBase_Migrate
+ ```
+ This microflow ensures that the new attributes on the generalization are set properly and that the `DisplayName` field is migrated.
+ 1. If the **Agent Commons** module is part of your app as well and there are agents defined that use knowledge bases, include the following initially excluded submicroflow in the project and add a call to it as described in the annotation in the above-mentioned microflow:
+ ```
+ OpenAIConnector > USE_ME > Migration > Azure_KnowledgeBase_Migrate
+ ```
+ This microflow sets the `CollectionIdentifier` field on the `KnowledgeBase` entity, as well as the outgoing reference to the `ConsumedKnowledgeBase`.
+
+1. If your app has the **PgVector Knowledge Base** module:
+
+ 1. Upgrade the **PgVector Knowledge Base** module to v6.0.0 in your Mendix project.
+ 1. Include logic to **run the data migration microflow** when your application starts (for example, include it in the after-startup microflow):
+ ```
+ PgVectorKnowledgeBase > USE_ME > Migration > ConsumedKnowledgeBase_Migrate
+ ```
+ This microflow ensures that the new attributes on the generalization are set properly and that the `DisplayName` field is migrated.
+ 1. If the **Agent Commons** module is part of your app as well and there are agents defined that use knowledge bases, include the following initially excluded submicroflow in the project and add a call to it as described in the annotation in the above-mentioned microflow:
+ ```
+ PgVectorKnowledgeBase > USE_ME > Migration > PgVector_KnowledgeBase_Migrate
+ ```
+ This microflow sets the `CollectionIdentifier` field on the `KnowledgeBase` entity, as well as the outgoing reference to the `ConsumedKnowledgeBase`.
+
+1. **Update any custom logic or pages** in your application that reference:
+ 1. The `DisplayName_DEPR` attributes on the `DatabaseConfiguration` and `AzureAISearchResource` entities. Instead, use the `DisplayName` attribute that is part of the generalization.
+ 1. The `KnowledgeBase_DeployedModel_DEPR` association. Instead, use the `CollectionIdentifier` attribute on the `KnowledgeBase` entity, if needed in combination with the `KnowledgeBase_ConsumedKnowledgeBase` association.
+1. **Verify** your application compiles and runs correctly before deploying to cloud environments.
+1. **Remove** the migration logic from your app once it has run **at least once in every impacted environment**. It can, however, safely be triggered multiple times.
+
+{{% alert color="info" %}}
+The `KnowledgeBase_DeployedModel_DEPR` association will be **permanently removed** in the next major version of the **Agent Commons** module, which will be **v4.0.0**.
+
+The `DisplayName_DEPR` attribute will be **permanently removed** in the next major version of the **OpenAI Connector** module, which will be **v9.0.0**.
+
+The `DisplayName_DEPR` attribute will be **permanently removed** in the next major version of the **PgVector Knowledge Base** module, which will be **v7.0.0**.
+
+Ensure the migration microflow has been run before upgrading to the next major version.
+{{% /alert %}}
\ No newline at end of file
diff --git a/content/en/docs/marketplace/platform-supported-content/modules/aws/amazon-bedrock.md b/content/en/docs/marketplace/platform-supported-content/modules/aws/amazon-bedrock.md
index 464fcc9db5a..4a71fc714e4 100644
--- a/content/en/docs/marketplace/platform-supported-content/modules/aws/amazon-bedrock.md
+++ b/content/en/docs/marketplace/platform-supported-content/modules/aws/amazon-bedrock.md
@@ -256,7 +256,8 @@ The [ChatCompletions (with history)](/appstore/modules/genai/genai-for-mx/common
Some capabilities of the chat completions operations are currently only available for specific models:
-* **Function Calling** - You can use function calling in all chat completions operations. To do this, use a [supported model](https://docs.aws.amazon.com/bedrock/latest/userguide/conversation-inference-supported-models-features.html) by adding a `ToolCollection` with a `Tool` via the [Tools: Add Function to Request](/appstore/modules/genai/genai-for-mx/commons/#add-function-to-request) operation. You can also first retrieve data from a knowledge base and then call `ChatCompletions` with the information required using the connector's function calling properties. In order to use this function, add a knowledge base to your Request using [Tools: Add Knowledge Base](/appstore/modules/genai/genai-for-mx/commons/#add-knowledge-base-to-request). For more information about function calling, see [Function Calling](/appstore/modules/genai/function-calling/).
+* **Function Calling** - You can use function calling in all chat completions operations. To do this, use a [supported model](https://docs.aws.amazon.com/bedrock/latest/userguide/conversation-inference-supported-models-features.html) by adding a `ToolCollection` with a `Tool` via the [Tools: Add Function to Request](/appstore/modules/genai/genai-for-mx/commons/#add-function-to-request) operation. You can also first retrieve data from a knowledge base and then call `ChatCompletions` with the required information using the connector's function calling properties. To use a function calling pattern with knowledge bases, add a knowledge base to your Request using [Tools: Add Knowledge Base](/appstore/modules/genai/genai-for-mx/commons/#add-knowledge-base-to-request). In this case, the collection identifier to pass is the `KnowledgeBaseID`.
+For more general information about function calling, see [Function Calling](/appstore/modules/genai/function-calling/).
**Function calling microflows**: A microflow used as a tool for function calling must satisfy the following conditions:
diff --git a/static/attachments/appstore/platform-supported-content/modules/genai/genai-howto-byo/ChatCompletions_WithHistory.png b/static/attachments/appstore/platform-supported-content/modules/genai/genai-howto-byo/ChatCompletions_WithHistory.png
deleted file mode 100644
index b05fe4bd006..00000000000
Binary files a/static/attachments/appstore/platform-supported-content/modules/genai/genai-howto-byo/ChatCompletions_WithHistory.png and /dev/null differ
diff --git a/static/attachments/appstore/platform-supported-content/modules/genai/genai-howto-byo/GenAICommons_TextFiles_DomainModel.png b/static/attachments/appstore/platform-supported-content/modules/genai/genai-howto-byo/GenAICommons_TextFiles_DomainModel.png
index bd66f6c006c..0be494e28c0 100644
Binary files a/static/attachments/appstore/platform-supported-content/modules/genai/genai-howto-byo/GenAICommons_TextFiles_DomainModel.png and b/static/attachments/appstore/platform-supported-content/modules/genai/genai-howto-byo/GenAICommons_TextFiles_DomainModel.png differ
diff --git a/static/attachments/appstore/platform-supported-content/modules/genai/genai-howto-goundllm/chatcontext-microflow-example.png b/static/attachments/appstore/platform-supported-content/modules/genai/genai-howto-goundllm/chatcontext-microflow-example.png
index a7e969c1f67..21c7dc40dde 100644
Binary files a/static/attachments/appstore/platform-supported-content/modules/genai/genai-howto-goundllm/chatcontext-microflow-example.png and b/static/attachments/appstore/platform-supported-content/modules/genai/genai-howto-goundllm/chatcontext-microflow-example.png differ
diff --git a/static/attachments/appstore/platform-supported-content/modules/genai/genai-howto-goundllm/tool-addknowledgebase-example.png b/static/attachments/appstore/platform-supported-content/modules/genai/genai-howto-goundllm/tool-addknowledgebase-example.png
index ec3484f107e..8146ea20207 100644
Binary files a/static/attachments/appstore/platform-supported-content/modules/genai/genai-howto-goundllm/tool-addknowledgebase-example.png and b/static/attachments/appstore/platform-supported-content/modules/genai/genai-howto-goundllm/tool-addknowledgebase-example.png differ
diff --git a/static/attachments/appstore/platform-supported-content/modules/genai/genai-howto-singleagent/Microflow_GenAICommons.png b/static/attachments/appstore/platform-supported-content/modules/genai/genai-howto-singleagent/Microflow_GenAICommons.png
index 934a9d5ce7d..a6247a315f6 100644
Binary files a/static/attachments/appstore/platform-supported-content/modules/genai/genai-howto-singleagent/Microflow_GenAICommons.png and b/static/attachments/appstore/platform-supported-content/modules/genai/genai-howto-singleagent/Microflow_GenAICommons.png differ
diff --git a/static/attachments/appstore/platform-supported-content/modules/genai/genaicommons/GenAICommons_domain_model.png b/static/attachments/appstore/platform-supported-content/modules/genai/genaicommons/GenAICommons_domain_model.png
index f2f25070aac..e5275ce4131 100644
Binary files a/static/attachments/appstore/platform-supported-content/modules/genai/genaicommons/GenAICommons_domain_model.png and b/static/attachments/appstore/platform-supported-content/modules/genai/genaicommons/GenAICommons_domain_model.png differ