Table of contents

50% Faster Setups, 91% Code Accuracy: Scaling Integrations with Exalate’s AI Assist
Executive summary
Who is Exalate?
AI journey
Goal of the project: Enable natural-language-based integration setup
Solution: LLM-based assistant with scripting tools
Case study
The challenge
The AI journey of Exalate
The solution: Architecture and Features of the Integration Assistant
Examples
Results
The future

50% Faster Setups, 91% Code Accuracy: Scaling Integrations with Exalate’s AI Assist


50%
faster integration setup for customers

400+
integrations created monthly using the assistant

91%
code accuracy

Executive summary

Who is Exalate?

Exalate is a powerful integration platform that connects work management systems by syncing data and workflows across teams. It helps organizations tackle complex integration challenges with the right level of IT involvement—avoiding the need for rigid, costly workarounds while maintaining control and security.

AI journey

This project addressed a fundamental industry challenge: IT teams face a huge backlog as integration demands rise, slowing business and draining resources.

By developing a custom AI-powered assistant that is deeply embedded in Exalate’s core platform, we have transformed years of Exalate’s expert and scripting know-how into a powerful, easy-to-use tool for all users.

This not only creates a competitive advantage, but, most importantly, lowers the technical barrier for “citizen integrators”. Now, non-technical users can set up, customize, and maintain integrations on their own, without depending on busy IT teams.

The impact for Exalate’s customers is clear:

  • Faster integration set-up: Customers now set up integrations 50% faster, accelerating their own business processes and reducing project delays.

  • High-quality output: An acceptance rate of over 80% shows that the assistant consistently produces useful, production-ready integration scripts.

  • Greater reliability: 91% code accuracy minimizes disruptions and reduces the need for troubleshooting or support.

  • Empowered teams: Over 400 integrations are now set up each month using the assistant, enabling more users across organizations to independently solve their integration challenges.

Exalate’s customers are able to unlock more value from the platform, accelerate their own digital transformation, and realize the benefits of seamless, scalable integration. This progress brings us closer to the Exaverse vision: a fully connected network of businesses.

"AI Assist isn’t just another trendy feature. It’s grounded in over a decade of real integration pain points. It’s not here to replace human input. It clears the path so more users can shape integration logic without hitting a wall of code."

Bruno Dauwe, Product Manager at Exalate.


As we continue to improve accuracy and user experience, we move closer to a future where integrations are fully descriptive—removing the need for coding entirely and making seamless connectivity accessible to all. By lowering the technical barrier, Exalate empowers more organizations to join the Exaverse and unlock new opportunities for collaboration and growth.

Goal of the project: Enable natural-language-based integration setup

Make it easier for Exalate’s customers to set up and modify integrations between their task management systems. Instead of manually writing code scripts, users should be able to converse with an LLM-based assistant that writes the code for them.

Solution: LLM-based assistant with scripting tools

An LLM-based assistant processes the user's requirements and converts them into code. When necessary, the assistant first asks follow-up questions to clarify the request. It then generates the corresponding code by drawing on an extensive knowledge base of diverse code snippets. The solution is integrated into the Exalate platform and streamlines the configuration process. Non-technical users can now set up more complex integrations as well, unlocking the full value of the product.

Case study

The challenge

The Exalate platform allows users to set up integrations between work management systems like Jira, Salesforce, and ServiceNow. A common use case is the escalation of tickets between customer service and the IT department. For example, a customer service employee logs a new client issue in Salesforce and tags it with an "IT" label and "high" priority. Because of the label, this issue is synchronized automatically to the IT department's Jira. The priority and other (customizable) metadata are transferred as well, and someone from the IT team picks up the ticket to start working on it. Once resolved, the status of the Jira issue is moved to "completed", which triggers the escalation back to Salesforce. The status in Salesforce changes to "pending", prompting the customer service employee to loop back to the customer and close the ticket in Salesforce as well.

The above example offers a glimpse into the power of Exalate. Users can customize when and what to sync to match their specific needs. The customizability of these integration rules results from the integration being defined by code. Unfortunately, this means non-technical users struggle to set up anything more complex than the basic use cases. Consequently, they might not understand the full value of the product.
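To make the code-defined nature of these rules concrete, here is an illustrative sketch only: real Exalate rules are Groovy scripts, but the escalation logic described above boils down to a sync predicate plus a field mapping. All names below are hypothetical.

```python
def should_sync(issue: dict) -> bool:
    """Sync a Salesforce case to Jira only when it carries the 'IT' label."""
    return "IT" in issue.get("labels", [])

def map_fields(issue: dict) -> dict:
    """Transfer the priority and other metadata to the Jira issue."""
    return {
        "summary": issue["summary"],
        "priority": issue["priority"],
        "labels": issue.get("labels", []),
    }

case = {"summary": "Client login fails", "priority": "high", "labels": ["IT"]}
if should_sync(case):
    jira_issue = map_fields(case)
```

Customizing an integration means editing exactly this kind of predicate and mapping, which is where non-technical users hit a wall.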

The AI journey of Exalate

Exalate’s AI journey is a textbook example of how AI can be leveraged to create impact within a company. It’s a real-world example of how to turn bold ideas into tangible outcomes.

Together, we have advanced through the following stages:

  1. Applied AI Discovery: Starting with an inspiring workshop, we brainstormed together on potential AI use cases within Exalate. Ranking them by effort and impact put an integration assistant at the top of the list.

  2. Proof of Value: We researched the capabilities of LLMs to generate Exalate configuration (Groovy scripts) from natural language requirements.

  3. Proof of Concept: We added retrieval of reference code snippets for improved results, and designed a custom evaluation framework to more easily compare the performance of different experiments.

  4. Minimum Viable Product: We deployed a working prototype for internal use and collected valuable feedback.

  5. Sustainable Solution: We increased the scope, incorporated the feedback, and released the solution to end users. We have now entered the phase of continuous improvement.

The solution: Architecture and Features of the Integration Assistant

Our solution is an elegant and extensible LLM-based assistant that interacts with the user via chat and has access to tools to modify the user’s integration scripts directly. We’ll now dive into the most important features:

> Inputs and outputs

Unlike a generic chatbot, this assistant takes as input not only the user's question but also the current integration script and additional metadata, such as the platform (Jira, Salesforce, Azure DevOps, …) the user is working with. Similarly, it not only answers the user in the chat but can also update the integration script in real time. When the assistant suggests a change to the integration script, the user receives visual feedback on the changes and can accept or decline them.

> An LLM-based assistant with tools

The assistant, at its core, is an API call to an LLM, with the input described above as well as a carefully crafted set of instructions (system prompt). To give the assistant the power to modify integration scripts, we provide it access to certain tools. For example, when a user wants to start anew and revert the integration to the defaults, the assistant can call the “load_default_configuration” tool. This tool loads the default template for the platform of the user.
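The tool-calling loop described above can be sketched as follows. This is a minimal sketch, not Exalate's implementation: the "load_default_configuration" tool name comes from the text, but the template contents, the `call_llm` helper, and its response shape are all assumptions.

```python
def load_default_configuration(platform: str) -> str:
    """Return the default integration template for the user's platform (stub data)."""
    templates = {
        "jira": "// default Jira sync rules",
        "salesforce": "// default Salesforce sync rules",
    }
    return templates.get(platform, "// generic default sync rules")

# Registry of tools the LLM is allowed to invoke.
TOOLS = {"load_default_configuration": load_default_configuration}

def run_assistant_turn(user_message, current_script, platform, call_llm):
    """One assistant turn: the LLM sees the question, the current script, and
    platform metadata, and may either answer directly or request a tool call."""
    response = call_llm(
        system="You are Exalate's integration assistant.",
        context={"script": current_script, "platform": platform},
        user=user_message,
    )
    tool_name = response.get("tool")
    if tool_name in TOOLS:
        # Execute the requested tool and return its result as the new script.
        return {"script": TOOLS[tool_name](platform), "message": "Reverted to defaults."}
    return {"script": current_script, "message": response["answer"]}
```

In production, `call_llm` would be an API call to an LLM with the tool schemas declared, so the model itself decides when a tool is appropriate.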

> The configuration generation tool: explained

The most important tool is the “generate_configuration” tool. This tool takes in the current integration script as well as a list of new requirements. It outputs a new integration script in which the requirements have been implemented. The assistant can only call this tool when it has a clear idea of what the user wants to achieve, i.e. the listed requirements need to be very concrete. It is instructed to ask the user for more information when there is some ambiguity in the request.

When the “generate_configuration” tool is called it triggers a workflow that will first retrieve the most relevant code snippets per requirement from a carefully curated knowledge base. Subsequently, another LLM call is triggered with the instructions to update the given integration script taking the new requirements and related code snippets into account. The resulting script is sent to the user and the assistant can provide additional information in the chat.
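The retrieve-then-generate workflow can be sketched like this. It is a simplified stand-in, not the actual implementation: the keyword-overlap retrieval and the `call_llm` helper are assumptions, and a production system would typically use embedding-based retrieval over the curated knowledge base.

```python
def retrieve_snippets(requirement, knowledge_base, top_k=2):
    """Rank snippets by naive word overlap with the requirement (illustrative only)."""
    req_words = set(requirement.lower().split())

    def overlap(snippet):
        return len(req_words & set(snippet["description"].lower().split()))

    return sorted(knowledge_base, key=overlap, reverse=True)[:top_k]

def generate_configuration(current_script, requirements, knowledge_base, call_llm):
    """Retrieve reference snippets per requirement, then ask an LLM to rewrite
    the integration script with those requirements implemented."""
    context = [
        {"requirement": req, "snippets": retrieve_snippets(req, knowledge_base)}
        for req in requirements
    ]
    return call_llm(
        instructions=(
            "Update the integration script so that each requirement is "
            "implemented, using the reference snippets where relevant."
        ),
        script=current_script,
        context=context,
    )
```

Grounding the second LLM call in retrieved snippets is what lets the generated scripts reflect Exalate's accumulated scripting know-how rather than the model's generic prior.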

> Evaluation and monitoring

To guard the quality of our solution and facilitate continuous improvement we actively evaluate and monitor the integration assistant. Evaluating LLM-based applications on a test set ensures that new versions never perform worse than previous ones. It also allows us to easily experiment with new techniques and settings as we can immediately see the impact on the metrics. Here are some of the properties we track on our test set:

  • Is the generated configuration script functionally the same as the reference script? (assessed by an LLM-as-a-judge)

  • Does the assistant ask a follow-up question when it needs to (i.e. when the user input is ambiguous)?

  • Does the assistant ask the right follow-up questions? (assessed by an LLM-as-a-judge)

  • Does the assistant deal correctly with jailbreak attempts and out-of-scope questions?
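The second check above can be automated without an LLM judge at all; a rough sketch, assuming a simple heuristic for detecting clarification questions and a stubbed-out assistant:

```python
def looks_like_follow_up(reply: str) -> bool:
    """Crude heuristic: treat any reply ending in '?' as a clarification question."""
    return reply.strip().endswith("?")

def follow_up_accuracy(assistant, test_cases):
    """Each test case marks whether a clarification is expected; return the
    fraction of cases where the assistant's behavior matches."""
    correct = sum(
        looks_like_follow_up(assistant(case["input"])) == case["expects_follow_up"]
        for case in test_cases
    )
    return correct / len(test_cases)
```

The other checks (functional equivalence of scripts, quality of the follow-up questions) are graded by an LLM-as-a-judge, since they require semantic comparison rather than a surface heuristic.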

In addition to evaluation, monitoring the feedback from end users is crucial to ensure continuous improvement. Via a dedicated tool, Exalate reviews the suggested code changes that have been rejected by the user. This negative feedback is labeled into categories for follow-up and, if applicable, the knowledge base is updated to prevent similar issues in the future. Issues that are harder to resolve are grouped and prioritized for the next development cycle. In this way, we have steadily improved performance over time.


Examples

Within the Exalate UI, the user has the option to directly modify the integration configuration or chat with the Integration Assistant instead.

When the user shares an ambiguous requirement, the assistant first asks a follow-up question to acquire more information.

Once the requirement is sufficiently clear, the Integration Assistant updates the code accordingly. The user gets visual feedback on the applied changes and has the option to insert or discard them.

Results

With the explicit feedback of the end user (insert/discard) we can compute the acceptance rate as follows: # inserts / (# inserts + # discards). This metric allows us to monitor solution quality and compare improvements over time.
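The metric from the text is trivial to compute, but worth pinning down; a minimal sketch (the guard against zero feedback is our addition):

```python
def acceptance_rate(inserts: int, discards: int) -> float:
    """# inserts / (# inserts + # discards), from explicit end-user feedback."""
    total = inserts + discards
    if total == 0:
        raise ValueError("no feedback recorded yet")
    return inserts / total

# e.g. 82 accepted and 18 rejected suggestions give an acceptance rate of 0.82
rate = acceptance_rate(82, 18)
```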

Thanks to our continuous improvement actions, the acceptance rate rose from the initial 50% to over 80%. Additionally, setup times dropped by 50% and script errors fell, bringing code accuracy to 91%. The Integration Assistant is being used in 400+ integrations, generating more than 300 AI-driven code suggestions per month, thus lowering the setup barrier and increasing the perceived value of Exalate.

The future

Exalate and Superlinear continue their collaboration to tackle even more industry challenges and accelerate growth toward realizing the Exaverse vision. The AI-Assist project isn't just a one-off success; it's a cornerstone of Exalate's corporate strategy. Betting on AI isn't optional; it's the only path to achieving their vision. Their AI roadmap is built not just for efficiency and productivity, but to secure competitive advantage and drive long-term growth.

While the Integration Assistant focuses on simplifying one-to-one integrations, our next milestone is unlocking the management of interconnected systems: networks. This new capability directly builds upon the foundation laid by AI-Assist: the same AI-driven expertise, scripting knowledge, and seamless user experience that power individual integrations will now enable users to easily create, manage, and control entire networks of connections.

By reusing the infrastructure, data, and learnings from AI-Assist, we’re making the shift from single integrations to networked systems. Our goal is to leverage AI to make managing these complex networks just as effortless as setting up a single integration, increasing value for all Exalate users.


Curious to hear what AI could do for your specific business? Contact us.


Locations

Brussels HQ
Central Gate
Cantersteen 47
1000 Brussels

Ghent
Planet Group Arena
Ottergemsesteenweg-Zuid 808 b300
9000 Gent

© 2024 Superlinear. All rights reserved.
