6 changes: 1 addition & 5 deletions content/en/docs/marketplace/genai/how-to/byo_connector.md
Original file line number Diff line number Diff line change
@@ -68,11 +68,7 @@ The Echo connector is a module in the [GenAI Showcase App](https://marketplace.m
This section allows you to focus on implementing chat completions, a fundamental capability supported by most LLMs. To make the process more practical, develop an example connector, the Echo Connector. This simple connector returns the same text as output that it received as input, while remaining fully compatible with the chat capabilities of GenAICommons and ConversationalUI.
During development, you will learn the key considerations to keep in mind when creating your own connector. You can either start from scratch and build your own connector, or use the finished Echo Connector from the GenAI Showcase App and modify it to fit your use case.

To enable chat completion, the key microflow to consider is `ChatCompletions_WithHistory`, located in the GenAICommons module.

{{< figure src="/attachments/appstore/platform-supported-content/modules/genai/genai-howto-byo/ChatCompletions_WithHistory.png" >}}

This microflow plays a crucial role as it derives and calls the appropriate microflow from the provided DeployedModel, ensuring that the module remains independent of individual connectors. This is especially important for modules like ConversationalUI, which should work seamlessly with any connector following the same principles.
To enable chat completion, the key microflow to consider is `ChatCompletions_WithHistory`, located in the GenAICommons module. This microflow plays a crucial role as it derives and calls the appropriate microflow from the provided DeployedModel, ensuring that the module remains independent of individual connectors. This is especially important for modules like ConversationalUI, which should work seamlessly with any connector following the same principles.
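To make the dispatch idea concrete, the following Python sketch illustrates the pattern (the names `DeployedModel`, `handler_registry`, and `chat_completions_with_history` are illustrative assumptions, not the actual GenAICommons API):

```python
# Minimal sketch of connector-independent dispatch, as performed by
# ChatCompletions_WithHistory: the generic operation derives the
# connector-specific handler from the DeployedModel and delegates to it.
# All names here are illustrative, not the actual GenAICommons API.
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class DeployedModel:
    name: str
    connector: str  # identifies which connector implements the call

@dataclass
class Request:
    messages: List[str]

# Each connector registers its own chat-completions implementation,
# analogous to a connector module exposing its microflow.
handler_registry: Dict[str, Callable[[DeployedModel, Request], str]] = {}

def register(connector: str):
    def wrap(fn):
        handler_registry[connector] = fn
        return fn
    return wrap

@register("echo")
def echo_chat_completions(model: DeployedModel, request: Request) -> str:
    # The Echo connector simply returns the last user message.
    return request.messages[-1]

def chat_completions_with_history(model: DeployedModel, request: Request) -> str:
    # Derive the handler from the DeployedModel, keeping this function
    # independent of any individual connector.
    handler = handler_registry[model.connector]
    return handler(model, request)

print(chat_completions_with_history(
    DeployedModel("echo-1", "echo"), Request(["Hello"])))  # prints "Hello"
```

The design keeps `chat_completions_with_history` free of connector-specific logic; adding a new connector only means registering a new handler.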

To integrate properly, the microflow must supply two essential input objects:

18 changes: 9 additions & 9 deletions content/en/docs/marketplace/genai/how-to/create-single-agent.md
@@ -305,9 +305,9 @@ Before adding tools via MCP, ensure you have at least one `MCPClient.MCPServerCo

1. Navigate to the agent view page for the IT-Ticket Helper agent and go to the Tools section. Add a new tool of type MCP tools.
2. Select the appropriate MCP server configuration from the available options.
3. Choose your import type:
* `server`: imports all tools exposed by the server
* `tools`: allows you to select specific tools from the server
3. Choose how to add MCP tools:
* **Use all available tools**: imports the entire server, including all the tools it provides. This gives you less control over individual tools, and any tools added to the server later are included automatically on agent execution.
* **Select Tools**: allows you to import specific tools from the server and change specific fields for individual tools.
4. If you selected **Select Tools**, you can choose to enable all available tools or select only the specific ones you need.
5. Click **Save**. The connected server or your selected tools will now appear in the agent's tool section.

Expand All @@ -317,7 +317,7 @@ You will also connect the agent to our knowledge base, so that it can use histor

1. From the agent view page for the `IT-Ticket Helper` agent, under **Knowledge bases**, add a new knowledge base:

* Knowledge base: select the knowledge base created in a previous step. For Mendix Cloud GenAI in particular, look for the collection `HistoricalTickets`. If nothing appears in the list, refer to the documentation of the connector on how to set it up correctly.
* Consumed Knowledge base: select the knowledge base resource created in a previous step. Next, look for the collection `HistoricalTickets`. If nothing appears in the list, refer to the documentation of the connector on how to set it up correctly.
* Name: `RetrieveSimilarTickets` (expression)
* Description: `Similar tickets from the database` (expression)
* MaxNumberOfResults: empty (expression; optional)
@@ -506,13 +506,12 @@ For both approaches, you need an `MCPClient.MCPServerConfiguration` object conta

Finally, you can add a tool for knowledge base retrieval. This allows the agent to query the knowledge base for similar tickets and thus tailor a response to the user based on private knowledge. Note that the knowledge base retrieval is only supported for [Mendix Cloud GenAI Resource Packs](/appstore/modules/genai/mx-cloud-genai/resource-packs/).

1. In the microflow `ACT_TicketHelper_CallAgent`, add a `Retrieve` action, before the request is created, to retrieve a **Deployed Knowledge Base** object:
1. In the microflow `_ACT_TicketHelper_Agent_GenAICommons`, add a `Retrieve` action, before the request is created, to retrieve a **Consumed Knowledge Base** object:

* Source: `From database`
* Entity: `GenAICommons.DeployedKnowledgeBase` (search for *DeployedKnowledgeBase*)
* Xpath: `[Name = 'HistoricalTickets']` (name that was used in the [Ingest Data into Knowledge Base](#ingest-knowledge-base))
* Entity: `GenAICommons.ConsumedKnowledgeBase` (search for *ConsumedKnowledgeBase*)
* Range: `First`
* Object name: `DeployedKnowledgeBase` (default)
* Object name: `ConsumedKnowledgeBase` (default)

2. Add the `Tools: Add Knowledge Base` action after the **Request** creation microflow:

@@ -522,7 +521,8 @@ Finally, you can add a tool for knowledge base retrieval. This allows the agent
* MetadataCollection: empty (expression; optional)
* Name: `RetrieveSimilarTickets` (expression)
* Description: `Similar tickets from the database` (expression)
* DeployedKnowledgeBase: `DeployedKnowledgeBase` (as retrieved in step 1)
* ConsumedKnowledgeBase: `ConsumedKnowledgeBase` (as retrieved in step 1)
* CollectionIdentifier: `'HistoricalTickets'` (name that was used in the [Ingest Data into Knowledge Base](#ingest-knowledge-base))
* Use return value: `no`

You have successfully integrated a knowledge base into your agent interaction. Run the app to see the agent integrated in the use case. Using the **TicketHelper_Agent** page, the user can ask the model questions and receive responses. When it deems it relevant, it will use the functions or the knowledge base. If you ask the agent "How many tickets are open?", a log should appear in your Studio Pro console indicating that the function microflow was executed. Now, when a user submits a request like "My VPN crashes all the time and I need it to work on important documents", the agent will search the knowledge base for similar tickets and provide a relevant solution.
@@ -177,10 +177,9 @@ To use the knowledge in a chat interface, create and adjust certain microflows a
5. After the `Request found` decision, add a `Retrieve` action. In this example, we retrieve the same object as in the insertion microflow.

* **Source**: `From database`
* **Entity**: `GenAICommons.DeployedKnowledgeBase`
* **XPath constraint**: `[Name = 'HistoricalTickets']`
* **Entity**: `GenAICommons.ConsumedKnowledgeBase`
* **Range**: `First`
* **Object name**: `DeployedKnowledgeBase_SimilarTickets`
* **Object name**: `ConsumedKnowledgeBase_SimilarTickets`

6. Add the `Tools: Add Knowledge Base` action with the settings shown in the image below:

@@ -153,6 +153,10 @@ Optionally, you can change the system prompt to provide the model additional ins
4. Update the `System prompt` value to reflect your desired behavior. For example, *`Answer like a Gen Z person. Always keep your answers short.`*
5. Save the changes.

### Optional: Setting User Access and Approval

When adding tools to a request, you can optionally set a [User Access Approval](/appstore/modules/genai/genai-for-mx/commons/#enum-useraccessapproval) value to control whether the user first needs to confirm the tool before execution, or whether the tool is visible to the user at all. To show a different title and description for the tool, you can modify `DisplayTitle` and `DisplayDescription`, which are only used for display and can thus be less technical and detailed than the tool's `Name` and `Description`.

## Testing and Troubleshooting {#testing-troubleshooting}

Before testing, ensure that you have completed the Mendix Cloud GenAI, OpenAI, or Bedrock configuration as described in the [Build a Chatbot from Scratch Using the Blank GenAI App](/appstore/modules/genai/how-to/blank-app/), particularly the [Infrastructure Configuration](/appstore/modules/genai/how-to/blank-app/#config) section.
@@ -136,12 +136,10 @@ For more technical details, see the [Function Calling](/appstore/modules/genai/f

##### Adding tools from MCP servers

Besides microflow tools, tools exposed by MCP servers are also supported. To add MCP tools to an agent version, select an MCP server configuration from the [MCP client module](/appstore/modules/genai/mcp-modules/mcp-client/). You can then choose one of two import types:
Besides microflow tools, tools exposed by MCP servers are also supported. To add MCP tools to an agent version, select an MCP server configuration from the [MCP client module](/appstore/modules/genai/mcp-modules/mcp-client/). You can then choose one of two ways to add MCP tools:

* Server: imports the entire server, including all tools it provides.
* Tools: allows you to import specific tools from the server.

Once the agent is called, all tools currently available from the server are added to the request and are available to the model.
* Use all available tools: imports the entire server, including all the tools it provides. This gives you less control over individual tools, and any tools added to the server later are included automatically on agent execution.
* Select Tools: allows you to import specific tools from the server and change specific fields for individual tools.

#### Adding Knowledge Bases

@@ -154,6 +152,8 @@ For supported knowledge bases registered in your app, you can connect them to ag

To allow an agent to perform semantic searches, add the knowledge base to the agent definition and configure the retrieval parameters, such as the number of chunks to retrieve and the similarity threshold. Multiple knowledge bases can be added to the agent to pick from. Give each knowledge base a name and description (in human language) so that the model can decide which retrievals are necessary based on the input it gets.

Note that [user access approval](#enum-useraccessapproval) can only be set to `HiddenForUser` or `VisibleForUser` for knowledge base retrievals.
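Conceptually, each knowledge base added to the agent is presented to the model much like a retrieval tool, described by its name and description so the model can pick the right one. The following Python sketch illustrates this idea under assumed names; it is not the actual Agents Kit data model:

```python
# Illustrative sketch (not the Agents Kit API): each knowledge base
# attached to an agent is exposed to the model as a retrieval tool,
# described by a name and a human-language description so the model can
# decide which knowledge base to query for a given input.
from dataclasses import dataclass
from typing import List

@dataclass
class KnowledgeBaseTool:
    name: str
    description: str
    max_chunks: int = 5          # number of chunks to retrieve
    min_similarity: float = 0.7  # similarity threshold

def to_tool_descriptor(kb: KnowledgeBaseTool) -> dict:
    # The descriptor is what the model sees when deciding on retrievals.
    return {
        "type": "function",
        "name": kb.name,
        "description": kb.description,
        "parameters": {"query": "string"},
    }

kbs: List[KnowledgeBaseTool] = [
    KnowledgeBaseTool("RetrieveSimilarTickets",
                      "Similar tickets from the database"),
    KnowledgeBaseTool("RetrievePolicyDocs",
                      "Company IT policies and procedures"),
]
descriptors = [to_tool_descriptor(kb) for kb in kbs]
print(descriptors[0]["name"])  # RetrieveSimilarTickets
```

This is why the name and description matter: they are the only signal the model has for choosing between multiple knowledge bases.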

#### Testing and Refining the Agent

While writing the system prompt (for both conversational and single-call types) or the user prompt (only for the single-call type), the prompt engineer can include variables by enclosing them in double braces, for example, `{{variable}}`. The actual values of these placeholders are typically known at runtime based on the user's page context.
@@ -178,7 +178,7 @@ For most use cases, a `Call Agent` microflow activity can be used. You can find

##### Call Agent with History {#call-agent-with-history}

This action uses all defined settings, including the selected model, system prompt, tools, knowledge base, and model parameters to call the Agent using the specified `Request` and execute a `Chat Completions` operation. If a `Request` object is passed that already contains a system prompt, or a value for the parameters temperature, top P or max tokens, those values have priority and will not be overwritten by the agent configurations. If a context entity is configured, the corresponding context object must be passed so that variables in the system prompt can be replaced. The operation returns a `Response` object containing the assistant’s final message, consistent with the chat completions operations from GenAI Commons.
This action uses all defined settings, including the selected model, system prompt, tools, knowledge base, and model parameters to call the Agent using the specified `Request` and execute a `Chat Completions` operation. If a `Request` object is passed that already contains a system prompt, or a value for the parameters temperature, top P or max tokens, those values have priority and will not be overwritten by the agent configurations. If a context entity is configured, the corresponding context object must be passed so that variables in the system prompt can be replaced. The operation returns a `Response` object containing the assistant’s final message, consistent with the chat completions operations from GenAI Commons, unless tool calls are requested by the model.
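The precedence rule for request values over agent configuration can be sketched as follows (a Python illustration under assumed field names, not the module's actual implementation):

```python
# Sketch of how Call Agent with History merges settings: values already
# present on the Request take priority over the agent's configuration.
# Names are illustrative, not the actual Agents Kit implementation.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Settings:
    system_prompt: Optional[str] = None
    temperature: Optional[float] = None
    top_p: Optional[float] = None
    max_tokens: Optional[int] = None

def merge(request: Settings, agent: Settings) -> Settings:
    # For each field, keep the request's value if set; otherwise fall
    # back to the agent configuration.
    return Settings(*(
        r if r is not None else a
        for r, a in zip(
            (request.system_prompt, request.temperature,
             request.top_p, request.max_tokens),
            (agent.system_prompt, agent.temperature,
             agent.top_p, agent.max_tokens),
        )
    ))

merged = merge(Settings(temperature=0.2),
               Settings(system_prompt="You are helpful.", temperature=0.7))
print(merged.temperature)    # 0.2 (the request value wins)
print(merged.system_prompt)  # "You are helpful." (filled from the agent)
```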

To use it:

@@ -213,6 +213,28 @@ The following operations are used in a (custom) action microflow:
* `Get Current User Prompt` gets the current user prompt. It can be used in the [action microflow](#action-microflow) because the `CurrentUserPrompt` from the chat context is no longer available.
* `Update Assistant Response` processes the response of the model and adds the new message and any sources to the UI. This is typically one of the last steps of the logic in an [action microflow](#action-microflow). It only needs to be included at the end of the happy flow of an action microflow. Make sure to pass the response object.

##### Using Tool or Knowledge Base calling {#action-microflow-tool-calling}

Since version 6.0.0, the module stores tool-calling messages persistently in the database and sends them along with subsequent chat messages. This makes the model aware of previously called tools (and their results). Additionally, if a tool is visible to the user or needs user confirmation before execution, the `ToolMessage` entity is used to display it. Note that this may increase token consumption, as all information sent to an LLM usually counts as input tokens.

This changes how action microflows behave, because they are called each time a tool is invoked and the UI changes for the user, for example to display a tool call or to wait for a user decision on whether a tool may be executed. Logic that only needs to happen right after the user sends their message (preprocessing) or after the final assistant message is returned (postprocessing) should therefore only be executed in those cases.

If no [user-visibility](#enum-useraccessapproval) is configured for tools and you do not want to store tool messages (and therefore want to retain the behavior from versions before 6.0.0), you can set the boolean `SaveToolCallHistory` to false on the [Request](/appstore/modules/genai/genai-for-mx/commons/#request). Note that [knowledge base retrievals](/appstore/modules/genai/genai-for-mx/commons/#add-knowledge-base-to-request) are `VisibleForUser` by default.

### Human in the loop {#human-in-the-loop}

When using the [Function Calling](/appstore/modules/genai/function-calling/) pattern by adding tools to the request, you can control when those tools are executed and whether they are visible to the user by setting [user access approval](#enum-useraccessapproval) per tool. Human in the loop describes a pattern where the AI can perform powerful tasks, but still requires humans to make certain decisions and oversee the agent's behavior. When using the ConversationalUI module with its basic action microflow pattern for executing requests with history and its UI snippets for displaying the chat, human in the loop works out of the box. Note that action microflows are called until there is a final assistant response as described [above](#action-microflow-tool-calling), even if all tools are executed without user interaction.

If you are not using the ConversationalUI module for [chat with history executions](/appstore/modules/genai/genai-for-mx/commons/#chat-completions-with-history), or your use case does not contain a chat history but is [task-focused (without history)](/appstore/modules/genai/genai-for-mx/commons/#chat-completions-without-history), you need to implement the following actions:

1. Store the tool calls from the returned [Response](/appstore/modules/genai/genai-for-mx/commons/#response) in your database. You can either use your own entities or reuse `ToolMessage` from ConversationalUI. The microflow `Response_CreateOrUpdateMessage` updates or creates a `Message` object with its corresponding tool messages, based on the response from the LLM.
2. If `UserConfirmationRequired` was enabled for a tool in the [user access approval](/appstore/modules/genai/genai-for-mx/commons/#enum-useraccessapproval) setting, you can use the tool messages to display the information and wait for the user to decide. The `pending` status of the tool message indicates that a user needs to take action. The `ToolMessage_UserConfirmation_Example` page shows an example as a popup. You can duplicate the page and modify it to your needs. The buttons for confirmation or rejection should recall the whole action.
3. The content of the tool messages needs to be added to the request. [Add a message](/appstore/modules/genai/genai-for-mx/commons/#chat-add-message-to-request) with role `assistant` that contains the tool call information, and messages with role `tool` for the tool results. You can use the `Request_AddMessage_ToolMessages` microflow, passing the same message from step 1, which takes care of this.
4. Call the chat completions action again. Be aware that the response might contain new tool calls rather than the final message, so you need to follow the steps above again. A recursive loop might be helpful.

For an example of a task-based (without history) use case, you can review the [GenAI Showcase App's](https://marketplace.mendix.com/link/component/220475) function calling example, especially the microflows `Task_ProcessWithFunctionCalling` and `Task_CallWithoutHistory`.
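The steps above can be sketched as a loop (Python for illustration; all names and message shapes are assumptions, not the GenAI Commons API):

```python
# Hedged sketch of the task-based (no history) tool-call loop described
# above: call the model, persist tool calls, pause for user confirmation
# where required, feed tool results back, and repeat until a final
# assistant message arrives. All names are illustrative.

def run_task(request, call_llm, execute_tool, ask_user_confirmation):
    while True:
        response = call_llm(request)        # step 4: (re)call chat completions
        tool_calls = response.get("tool_calls", [])
        if not tool_calls:                  # final assistant message reached
            return response["content"]
        # Step 1: store the tool calls (here: appended to the message log).
        request["messages"].append(
            {"role": "assistant", "tool_calls": tool_calls})
        for call in tool_calls:
            # Step 2: wait for human approval when configured per tool.
            if call.get("requires_confirmation") and not ask_user_confirmation(call):
                result = "Tool execution rejected by the user."
            else:
                result = execute_tool(call)
            # Step 3: add the tool result with role `tool`.
            request["messages"].append(
                {"role": "tool", "tool_call_id": call["id"], "content": result})

# Demo with stub implementations: the first model call returns a tool
# call, the second returns the final answer.
_responses = iter([
    {"tool_calls": [{"id": "1", "name": "count_open_tickets"}]},
    {"tool_calls": [], "content": "There are 3 open tickets."},
])
answer = run_task(
    {"messages": []},
    call_llm=lambda req: next(_responses),
    execute_tool=lambda call: "3",
    ask_user_confirmation=lambda call: True,
)
print(answer)  # There are 3 open tickets.
```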


### Customizing Styling {#customize-styling}

The ConversationalUI module comes with stylesheets that are intended to work on top of Atlas Core. You can use variables and custom classes to modify the default rendering and think of colors, sizes, and positions. To learn more about customizing styling in a Mendix app in general and targeting elements using SCSS selectors, refer to the [how-to](/howto/front-end/customize-styling-new/#add-custom-styling) page.
@@ -121,7 +121,7 @@ Gemini does not directly connect to the knowledge resources. The model returns a

This functionality is part of the implementation executed by the GenAI Commons Chat Completions operations mentioned earlier. As a developer, make the system aware of your indexes and their purpose by registering them with the request. This is done using the GenAI Commons operation [Tools: Add Knowledge Base](/appstore/modules/genai/genai-for-mx/commons/#add-knowledge-base-to-request), which must be called once per knowledge resource before passing the request to the Chat Completions operation.

Note that the retrieval process is independent of the model provider and can be used with any model that supports function calling, as it relies on the generalized `GenAICommons.DeployedKnowledgeBase` input parameter.
Note that the retrieval process is independent of the model provider and can be used with any model that supports function calling, as it relies on the generalized `GenAICommons.ConsumedKnowledgeBase` input parameter.

#### Vision {#chatcompletions-vision}

@@ -129,7 +129,7 @@ Mistral does not directly connect to the knowledge resources. The model returns

This functionality is part of the implementation executed by the GenAI Commons Chat Completions operations mentioned earlier. As a developer, you need to make the system aware of your indexes and their purpose by registering them with the request. This is done using the GenAI Commons operation [Tools: Add Knowledge Base](/appstore/modules/genai/genai-for-mx/commons/#add-knowledge-base-to-request), which must be called once per knowledge resource before passing the request to the Chat Completions operation.

Note that the retrieval process is independent of the model provider and can be used with any model that supports function calling, as it relies on the generalized `GenAICommons.DeployedKnowledgeBase` input parameter.
Note that the retrieval process is independent of the model provider and can be used with any model that supports function calling, as it relies on the generalized `GenAICommons.ConsumedKnowledgeBase` input parameter.

#### Vision {#chatcompletions-vision}
