Regnology

Regnology Chatbot: answering complex regulatory questions

Impact

The Regnology Chatbot helps answer complex questions in the fast-moving and time-sensitive regulatory world. By answering customer questions faster and with high quality, this solution ultimately accelerates Regnology’s ability to scale their operations and serve more customers.

Intro to the customer

As a leading global provider of software for regulatory reporting and management, Regnology is helping both regulators and companies to ensure their regulatory processes run as efficiently as possible.

Challenge

The regulatory environment is complex, and this complexity is only increasing as Regnology continues to scale globally, integrating new and evolving policies. Helping clients navigate these frequent updates is a challenge for Regnology's support team, who need to consult the platform's ever-growing documentation to answer questions. Because regulatory questions are often time-sensitive, the pressure is even greater.

By integrating ML6's Retrieval-Augmented Generation (RAG) expertise, we have significantly enhanced the accuracy and relevance of our responses, ensuring our clients receive precise and up-to-date information. This advanced approach has enabled us to swiftly and seamlessly provide our clients with AI-augmented access to our knowledge base, thereby enhancing their ability to navigate the complex regulatory landscape efficiently.

by

Steffen Dangmann (Director Cloud & AI Engineering)

Solution

Therefore, Regnology decided to invest in an AI-powered chatbot to support both their internal support team and their customers. The Regnology Chatbot aims to provide high-quality responses to these complex regulatory questions, faster.

The solution leverages the Retrieval-Augmented Generation (RAG) approach to ensure responses are grounded in the large corpus of Regnology documentation. To build a production-grade application, the project focused on:

  • Scalability: the Regnology Chatbot needs to serve many internal and external users concurrently, with low latency.
  • Guardrails: as an external-facing product, it is crucial to catch potential misuse of the application. To mitigate this risk, the team trained custom input guardrail models that detect irrelevant or malicious questions (see the sketch after this list).
  • Data: Regnology has a large, fast-evolving set of documentation. To ensure trustworthy and up-to-date responses, the team set up automated data pipelines that ingest the documentation into the search database.
  • Evaluation and monitoring: to track the application's accuracy, the team set up automated benchmarking against a ground-truth dataset. This allows Regnology to roll out iterative improvements in a controlled way. In addition, other KPIs such as user adoption and feedback rates are continuously monitored.
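
To illustrate the guardrail step, here is a minimal sketch of how an input check could sit in front of retrieval and generation. It assumes a trained text classifier behind a simple callable interface; the labels, threshold, refusal message, and helper names are illustrative and not the actual models used for the Regnology Chatbot.

```python
# Minimal sketch of an input guardrail in front of a RAG pipeline.
# Labels, threshold and callables are illustrative placeholders,
# not the actual guardrail models trained for the Regnology Chatbot.
from dataclasses import dataclass
from typing import Callable, Tuple

ALLOWED_LABEL = "regulatory_question"  # assumed category name
REFUSAL = "Sorry, I can only help with questions about Regnology products."

@dataclass
class GuardrailVerdict:
    label: str
    score: float

    def is_allowed(self, threshold: float = 0.5) -> bool:
        return self.label == ALLOWED_LABEL and self.score >= threshold

def check_input(question: str,
                classifier: Callable[[str], Tuple[str, float]]) -> GuardrailVerdict:
    """Run the input guardrail model before any retrieval or generation.

    `classifier` stands in for a trained text classifier (for example a small
    fine-tuned model served as an endpoint) returning (label, confidence).
    """
    label, score = classifier(question)
    return GuardrailVerdict(label=label, score=score)

def handle_question(question: str,
                    classifier: Callable[[str], Tuple[str, float]],
                    rag_pipeline: Callable[[str], str]) -> str:
    verdict = check_input(question, classifier)
    if not verdict.is_allowed():
        # Irrelevant or malicious questions never reach the LLM.
        return REFUSAL
    return rag_pipeline(question)
```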

The full architecture is built on the Google Cloud Platform.
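
As a rough sketch of how such a RAG request could be served on Google Cloud, the snippet below retrieves documentation chunks and grounds a Gemini model's answer in them via the Vertex AI SDK. The project ID, model name, and the search_documentation helper are assumptions for illustration; the actual retrieval backend and prompts are specific to the Regnology deployment.

```python
# Minimal RAG sketch on Google Cloud using the Vertex AI SDK
# (pip install google-cloud-aiplatform). Project, location, model name and
# the retrieval helper are illustrative assumptions.
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-gcp-project", location="europe-west1")
model = GenerativeModel("gemini-1.5-pro")  # any available Gemini model works

def search_documentation(question: str, top_k: int = 5) -> list[str]:
    """Placeholder for the retrieval step against the documentation search
    database (kept up to date by the automated data pipelines)."""
    raise NotImplementedError("wire this to your own search backend")

def answer_with_rag(question: str) -> str:
    # 1. Retrieve the most relevant documentation chunks.
    chunks = search_documentation(question)
    context = "\n\n".join(chunks)
    # 2. Ground the generation in the retrieved context.
    prompt = (
        "Answer the regulatory question using only the documentation below. "
        "If the documentation does not contain the answer, say so.\n\n"
        f"Documentation:\n{context}\n\nQuestion: {question}"
    )
    return model.generate_content(prompt).text
```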

While best practices exist for customizing LLMs to unique challenges, guiding them to tackle regulatory questions is no simple task. We had to ensure high precision while navigating the complexities of searching through large amounts of unstructured data, handling regulatory content, and working with custom domain-specific programming languages.

by

ML6 team

Results

As a result of our focus on building a production-ready application, the Regnology Chatbot has been successfully deployed and has demonstrated its capability to handle a substantial volume of concurrent questions.

Next steps include:

  • Continuously monitoring and improving the Regnology Chatbot quality.
  • Supporting additional Regnology products.
  • Adding Premium features such as explaining the regulatory source code directly.
  • Adding customer context, enabling the Regnology Chatbot to take into account knowledge about the customer or their data.

This project is the result of a close collaboration between Regnology, Google Cloud, and ML6. We worked in a co-creation setting with a cross-functional team, which brought many benefits: each party contributed crucial experience and knowledge, covering the Regnology product suite, best practices for building production-ready RAG applications, and of course the latest generative AI capabilities of the Google Cloud Platform.

Over the course of the collaboration, Regnology has further grown their in-house AI team and capabilities, setting them up for success in their broader AI strategy. As their partner, we are beyond happy to see and support this growth.