Bolt Energie

Transforming Bolt’s customer service with a Large Language Model

Impact

With the AI solution built by ML6, customer service agents now have an accurate indication of the language and category of each incoming customer service ticket. This lets agents focus on what really matters, helping customers, instead of sorting tickets first, saving time and increasing efficiency.

Intro to the customer

Bolt Energie is the first Belgian energy platform that directly connects energy producers and consumers by supplying green electricity and natural gas. Customers choose which local producer they buy their green energy from: from a farmer in Zulte to floating hydroelectric power stations on the Maas.

Challenge

Customer care represents a significant workload at Bolt Energie, and as Bolt grows, that workload grows with it. With Generative AI (Large Language Models), customer support can be improved considerably, lowering the cost per request and letting the same team deliver even better customer care.

“ML6 provided us with a fast solution to bring the workload on our customer care significantly down. Communication was clear from the start & they ensured a smooth handover.”

Jasper Verschuere, Development Manager at Bolt Energie

Solution

ML6 leveraged Large Language Models and Generative AI to assign categories to customer questions, eliminating the need for a labelled dataset or a task-specific model trained on it. An open-source model was used to detect the language of each customer service ticket.
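To make this concrete, the sketch below shows one way zero-shot categorisation and language detection could look. The category list, the call_llm helper and the use of the open-source langdetect package are illustrative assumptions, not ML6’s actual implementation.

```python
# Minimal sketch of zero-shot ticket categorisation with an LLM.
# CATEGORIES and call_llm() are hypothetical placeholders; language detection
# uses the open-source `langdetect` package purely as an example.
from langdetect import detect  # pip install langdetect

CATEGORIES = ["billing", "meter readings", "moving house", "contract", "other"]  # assumed examples

def categorise_ticket(ticket_text: str, call_llm) -> dict:
    """Return the detected language and an LLM-assigned category for a ticket."""
    language = detect(ticket_text)  # e.g. 'nl', 'fr', 'en'

    prompt = (
        "You are a customer service assistant for an energy supplier.\n"
        f"Classify the ticket below into exactly one of these categories: {', '.join(CATEGORIES)}.\n"
        "Reply with the category name only.\n\n"
        f"Ticket:\n{ticket_text}"
    )
    category = call_llm(prompt).strip().lower()

    # Guard against the model answering outside the allowed set.
    if category not in CATEGORIES:
        category = "other"
    return {"language": language, "category": category}
```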

For answering customer questions, generic answers from models such as ChatGPT are not accurate enough: they have no access to Bolt Energie’s domain- and company-specific information. Using Retrieval Augmented Generation (RAG), ML6 gave the model access to Bolt Energie’s internal knowledge base so it can generate more accurate and specific answers.
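The sketch below illustrates the RAG idea under simple assumptions: knowledge-base entries are ranked by embedding similarity to the question, and the best matches are injected into the prompt. The embed() and call_llm() helpers are hypothetical placeholders, not Bolt Energie’s actual stack.

```python
# Minimal RAG sketch: retrieve knowledge-base entries relevant to the question
# and let the LLM answer using only that context.
import numpy as np

def retrieve(question: str, kb_entries: list[str], embed, top_k: int = 3) -> list[str]:
    """Rank knowledge-base entries by cosine similarity to the question."""
    q = np.array(embed(question))
    scored = []
    for entry in kb_entries:
        e = np.array(embed(entry))
        score = float(q @ e / (np.linalg.norm(q) * np.linalg.norm(e)))
        scored.append((score, entry))
    return [entry for _, entry in sorted(scored, reverse=True)[:top_k]]

def answer_with_rag(question: str, kb_entries: list[str], embed, call_llm) -> str:
    """Build a prompt augmented with retrieved context and ask the LLM."""
    context = "\n\n".join(retrieve(question, kb_entries, embed))
    prompt = (
        "Answer the customer question using only the context below. "
        "If the context is insufficient, say so.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}"
    )
    return call_llm(prompt)
```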

ML6 set up this solution on AWS, leveraging services such as Amazon OpenSearch, Amazon Bedrock and other AWS infrastructure. By using managed LLM services, the cost of keeping a self-hosted model available at all times is avoided.
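As a rough illustration of how these managed services can fit together, the snippet below sketches retrieval with Amazon OpenSearch and generation with Amazon Bedrock. The endpoint, index name, field names and model IDs are assumptions for illustration only, not Bolt Energie’s configuration.

```python
# Sketch of the managed-service setup: Bedrock for embeddings and generation,
# OpenSearch for vector retrieval. All identifiers below are assumed examples.
import json
import boto3
from opensearchpy import OpenSearch  # pip install opensearch-py

bedrock = boto3.client("bedrock-runtime", region_name="eu-west-1")
opensearch = OpenSearch(  # authentication omitted for brevity
    hosts=[{"host": "my-domain.eu-west-1.es.amazonaws.com", "port": 443}],
    use_ssl=True,
)

def embed(text: str) -> list[float]:
    """Embed text with a Bedrock embedding model (model ID assumed)."""
    response = bedrock.invoke_model(
        modelId="amazon.titan-embed-text-v2:0",
        body=json.dumps({"inputText": text}),
    )
    return json.loads(response["body"].read())["embedding"]

def retrieve(question: str, k: int = 3) -> list[str]:
    """k-NN search over a vector index holding knowledge-base entries."""
    hits = opensearch.search(
        index="knowledge-base",  # assumed index name
        body={"size": k, "query": {"knn": {"embedding": {"vector": embed(question), "k": k}}}},
    )
    return [hit["_source"]["text"] for hit in hits["hits"]["hits"]]

def answer(question: str) -> str:
    """Generate an answer grounded in retrieved context via Bedrock's Converse API."""
    context = "\n\n".join(retrieve(question))
    response = bedrock.converse(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed model choice
        messages=[{"role": "user", "content": [{
            "text": f"Context:\n{context}\n\nAnswer the question using only this context.\nQuestion: {question}"
        }]}],
    )
    return response["output"]["message"]["content"][0]["text"]
```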


Results

By letting a Large Language Model determine the category of an incoming customer service ticket, the ticket can automatically be routed to a customer service agent. Agents can now prioritise actual customer interactions over ticket sorting: 80% of tickets are assigned to the correct queue automatically, saving Bolt four days per week in sorting time.

ML6 also experimented with generating answer suggestions to give customer care agents a head-start in replying to tickets. These answer suggestions are based on information in Bolt Energie’s knowledge base.

The usefulness of the generated answers could be further improved in several ways. Adding previously processed customer service tickets would give the LLM more context to answer questions, but comes with the additional challenge of removing personal data; changes in Bolt Energie’s way of working might also make information drawn from older tickets redundant, potentially resulting in wrong answers.

Instead, ML6 and Bolt Energie are now focusing on growing the knowledge base by adding more question-answer pairs to cover more incoming questions, and on continued prompt engineering with feedback from the customer service agents to further tailor the generated answers to Bolt Energie’s style.