We all want more accurate and more complete data. But sometimes it is difficult to turn unstructured text into processed, analyzed, and structured data. This is a particular problem for companies that handle a large number of customer-facing phone calls from either their sales or customer support teams.
How do you turn all of that information into actionable insights? If that sounds like a familiar problem then you’re in the right place to find the answer. Lettria’s natural language processor (NLP) allows you to import transcripts of your phone calls, other recorded spoken interactions, and voice-to-text directly into the platform and then automatically enrich your Customer Relationship Management (CRM) software with invaluable information.
It is the most efficient way to identify key trends and important pieces of information from thousands (if not hundreds of thousands) of hours of calls. We know that you don’t have the time and resources necessary to manually review each one of these calls or other voice recordings and analyze them for important details, so we’ve made the process as easy as possible.
You get to automate every step. From transcription right up through to when the data is imported into your CRM (or other software). This helps our users to improve their understanding of their customers, get a better sense of their overall business performance, and use the information to improve their sales and customer support processes.
Your phone calls with customers and leads are one of the most important aspects of your business. It is often the best, and occasionally the only, way to communicate directly with a customer or lead. So how can an organization turn those one-to-one interactions into large data sets where they can identify trends, search for key information, and monitor business performance? The answer is through natural language processing.
Interested in finding out more? Over the course of this article we will explain to you how natural language processing can help you to identify key business insights from your call transcripts and other voice-to-text use cases, allow you to enrich your CRM with key data from that unstructured data, and help you to get your NLP project started.
But let’s start with the basics. Just what is natural language processing?
A (quick) beginner’s guide to natural language processing
The first example of natural language processing was demonstrated in 1954 when IBM’s 701 mainframe was used to translate Russian sentences into English. In the nearly 70 years that have followed, natural language processing has become one of the fastest developing and most advanced areas of artificial intelligence.
Natural language processing allows computers to understand human language. We typically refer to the language that computers use as machine code or machine language, and it would appear as total nonsense to anyone unfamiliar with software engineering. You can think of natural language processing as the bridge between this machine code and human languages. The meaning of the language is extracted by breaking text into words, establishing the contextual relationships between different words, and structuring that information into data.
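To make that a little more concrete, here is a minimal sketch using the open-source spaCy library (a generic illustration, not part of the Lettria platform) that breaks a sentence into words, reads the grammatical relationships between them, and turns the result into structured records:

```python
# A minimal sketch using the open-source spaCy library (not Lettria's own
# tooling): break a sentence into words, read the grammatical relationships
# between them, and turn the result into structured records.
# Requires: pip install spacy && python -m spacy download en_core_web_sm
import spacy

nlp = spacy.load("en_core_web_sm")  # small pre-trained English pipeline
doc = nlp("The customer asked for a refund on her last invoice.")

structured = [
    {
        "word": token.text,              # the word itself
        "lemma": token.lemma_,           # its dictionary form
        "part_of_speech": token.pos_,    # noun, verb, adjective, ...
        "relates_to": token.head.text,   # the word it depends on
        "relation": token.dep_,          # the type of grammatical relationship
    }
    for token in doc
]

for row in structured:
    print(row)
```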
Its applications can be seen in a wide variety of areas, and most people come into contact with some form of natural language processing on a virtually daily basis. Whether that is still in the form of a language translator, as IBM first demonstrated, or through chatbots and voice assistants, NLP forms the foundation of these interactive systems.
Natural language processing has advanced since those early days at IBM and is now capable of analyzing huge sets of language-based data in a way that humans could not manage themselves. It’s totally consistent. Unbiased. Unrelenting.
So how exactly does natural language processing even work? If you’re a data scientist or developer this will all come across as pretty basic to you and you can skip ahead to the next section, but if you’re new to the world of NLPs then it is certainly worth understanding a little bit about the technology itself.
Natural language processing includes several different techniques for interpreting human language. These range from statistical approaches to machine learning algorithms, but the end goal is always the same.
By running a series of examples through an Artificial Intelligence (AI) model and gradually teaching it the relationships between different words and phrases, the model eventually learns to break unstructured text down into smaller elements, identify those associations, and understand their meaning.
As NLPs have advanced, deep learning techniques mean that computers can now handle much of the complexity of human language — including grammatical errors, misspellings, abbreviated words, and incorrect punctuation. This is tremendously important for most modern NLP applications, as it turns complicated unstructured text into a numeric structure that can be used by downstream applications.
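As a simple illustration of what that “numeric structure” can look like, here is a sketch using the open-source scikit-learn library (again a generic example, not Lettria’s internal approach) that turns a few invented call snippets into a matrix of numbers a downstream model could consume:

```python
# A generic illustration of turning unstructured text into a numeric structure,
# here with scikit-learn's TF-IDF vectorizer (not Lettria's internal approach).
from sklearn.feature_extraction.text import TfidfVectorizer

call_snippets = [
    "I would like to cancel my subscription.",
    "The delivery arrived late and the box was damaged.",
    "Can you send me a quote for the premium plan?",
]

vectorizer = TfidfVectorizer()
matrix = vectorizer.fit_transform(call_snippets)  # one row per call, one column per word

print(matrix.shape)                        # (3, number of distinct words)
print(vectorizer.get_feature_names_out())  # the words behind each column
```

Modern deep learning models replace these simple word counts with dense vector representations, but the principle is the same: text in, numbers out.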
It is this very process that allows your email provider to identify spam or your home assistant to play the song that you’re desperately yelling at it for. Who would want to live in a world where your smart speaker plays Ryan Adams when you’re really looking to relive your glory days through the nostalgic appreciation of Canadian icon Bryan Adams? I shudder at the thought.
These NLP fundamentals then help more advanced AI models and platforms to perform more complicated tasks, including:
- Content categorization: Search, indexing, and duplication detection
- Topic discovery and modeling: Identifying the meaning and themes within a text
- Corpus analysis: Understanding document structure and preparing it for further models
- Contextual extraction: Automatically pulling structured information from text-based sources
- Machine translation: Translating text or speech from one language to another
- Document summarization: Generating synopses of larger texts
- Speech-to-text and text-to-speech: We probably don’t need to explain that one
- Sentiment analysis: Identifying the mood or emotion expressed within a text (a short example follows this list)
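To give a feel for that last item, here is a minimal sentiment analysis sketch using the open-source Hugging Face transformers library (a generic example rather than the Lettria platform, and the sample sentences are invented):

```python
# A generic sentiment analysis example using the open-source Hugging Face
# transformers library (not the Lettria platform); the sentences are invented.
from transformers import pipeline

classifier = pipeline("sentiment-analysis")  # downloads a default pre-trained model

calls = [
    "Thanks so much, the support agent solved my problem in two minutes.",
    "This is the third time my parcel has been lost, I am cancelling my account.",
]

for call, prediction in zip(calls, classifier(calls)):
    print(f"{prediction['label']} ({prediction['score']:.2f}) - {call}")
```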
And look, we know what you’re thinking. Online translations aren’t that accurate, spam filters miss a lot of bad emails, chatbots can be pretty infuriating to talk to, and your voice assistant often fails to understand you. But that isn’t a problem with the technology, it’s simply a question of how specialized and customized the NLP you’re interacting with is.
That’s why choosing the right NLP and training your AI model for your specific requirements is essential if you’re going to get the results that you need. The functions and examples that we’ve just provided are simply very common use cases, the situations where you’re most likely to encounter natural language processing in the real world.
Each of those more specific functions allows NLPs to carry out the tasks that have made them an integral feature of everyday life. There’s a good chance that your organization is already using natural language processing in some way, shape, or form, but one new way in which companies are benefitting from the technology is by using it to enrich their CRMs with information from call transcripts and voice-to-text.
Using an NLP to enrich your CRM with data from call transcripts
That quick guide to natural language processing mentioned many of the applications for the technology, but also revealed how useful it could be for anyone looking to extract data from their call transcripts.
The issue is, not all off-the-shelf NLPs specialize in textual data processing and building your own requires a great deal of resources — in terms of skills, time, and money. On top of that, automating the workflow can be a challenge and manually inputting the information at any one of the steps in the process is a major hindrance for the project itself.
And we know what you might be thinking: you already use software to automatically transcribe your calls, and these transcripts are monitored and reviewed by managers. On top of that, your sales team might be pretty good at logging most of the information from their calls in your CRM.
We hate to break it to you, but information is definitely going missing from those calls. Not through incompetence or mismanagement, but simply because it’s impossible to manually record all of that information — especially when we are talking about customer support or care interactions.
Roughly 80% of the average CRM’s database is unstructured and a large part of this will be comprised of information from call transcripts. That means that you’re making a lot of sales or customer success decisions based on only 20% of the information that is available to you. And that means that you’re definitely missing insights or trends that could have a huge impact on performance.
But the impact of enriching your CRM with the value from within that unstructured data goes far beyond your ability to make smarter business decisions. In a world where we are constantly looking to automate processes, it also improves your ability to accurately program workflows.
Say you want to automatically send an email to every customer that has expressed a negative experience or opinion in the past month. If you rely on manual call logging then you will miss the vast majority of customers that would fall into this category.
But with the right NLP, you will be able to automate this process by creating a tag for a negative experience, having customers automatically assigned to that tag via the natural language processing results, and including each of these profiles in your next email campaign.
Let’s even say you want to ensure that customers expressing an interest in a certain product or with a specific type of complaint automatically receive a callback. That information will be highlighted, included in your CRM, and your teams will be alerted to the fact that a call needs to be made.
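To show the shape of that kind of automation, here is a rough sketch in Python. The sentiment results, tag names, contact IDs, and CRM endpoint are all hypothetical placeholders, not a documented Lettria or CRM API:

```python
# Hypothetical sketch of the workflow described above. The NLP results, tag
# names, contact IDs, and CRM endpoint are placeholders, not a documented
# Lettria or CRM API.
import requests

nlp_results = [
    {"contact_id": "C-1042", "sentiment": "negative", "topic": "delivery delay"},
    {"contact_id": "C-2077", "sentiment": "positive", "topic": "premium plan"},
]

CRM_API = "https://example-crm.invalid/api/contacts"  # placeholder URL

for result in nlp_results:
    if result["sentiment"] == "negative":
        # Tag the contact so the next email campaign and callback list pick them up
        requests.patch(
            f"{CRM_API}/{result['contact_id']}",
            json={"tags": ["negative_experience"], "needs_callback": True},
            timeout=10,
        )
```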
Enriching your CRM with unstructured data will improve your decision making, streamline your processes, and increase efficiency. It allows your sales and customer success teams to focus on the person on the other end of the phone and not on the notes that they might need to take. It ensures that every call receives exactly the same standard of care, analysis, and attention.
The Lettria solution
That’s why Lettria is uniquely positioned to address this particular use case. Not only does the platform offer the benefits of an off-the-shelf solution combined with the accuracy and strengths of a fully customized NLP, but we’ve also developed specific integrations and functionalities that make enriching your CRM with data from your call transcripts and other recordings…well…sort of easy.
We know. Easy is never a term that you want to hear applied to fairly complex technology or software and it might even be a red flag for some of our readers. But trust us. We’ve really worked hard on this and tried to make this process as simple as possible. We’ve built a platform that allows you to turn a foundation into a customized NLP and found every possible way to save you time without reducing your capabilities.
Don’t believe us just yet? We understand, but keep reading and you’ll find out just how Lettria solves this particular problem and some examples from some use cases that we’ve worked on with clients.
So, let’s get to the bit about how Lettria handles the problem of enriching your CRM with your unstructured data.
By now you should at least have some understanding of how natural language processing works. What you might not yet be aware of are the two major challenges that make enriching a CRM with unstructured data difficult.
The first is creating a specialized NLP that is capable of processing unstructured textual data to your requirements and using that to populate fields within your CRM. The second is having an NLP that automates the other two key steps in this workflow: automatically processing the call transcripts and then transmitting that data to your CRM once processing is complete.
Lettria handles both of these challenges by making the process as simple as possible. For the first issue, our platform approach means that you get to start with a sort of NLP foundation that you can fine-tune for your specifications.
As we’ve already discussed, with any technical software decision you always have to ask yourself whether it is more effective to build or buy what you need. The major advantage of building your own NLP solution is that it would be specialized for your industry, the terms used by your leads and employees, and any specifications that might be unique to your company, project, or sector. The issue is that most companies don’t have the resources necessary to successfully build their own NLP.
That lack of resources forces most companies to look at off-the-shelf solutions. These require fewer in-house specialists and can be much quicker to get up and running, but that sometimes means that you sacrifice some level of specialization.
Lettria’s platform approach means that you get the best of both worlds. With 15 pre-trained multilingual models to start from, you get access to an AI model that you only need to fine-tune in order to get the best results. This makes starting a project with Lettria 4 times faster than building your own, but much more accurate than other off-the-shelf solutions.
The Lettria platform also allows you to connect to third-party software via an API and solve any integration problems. This means that you can automate the importation of call transcripts into your NLP platform and automatically enrich your CRM with the data that it processes.
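Conceptually, that automated workflow looks something like the sketch below. Every function, field, and identifier here is a hypothetical placeholder; the real integrations depend on your transcription provider, your NLP platform, and your CRM:

```python
# A rough sketch of the automated workflow: import transcripts, process them,
# push structured results to the CRM. Every function and field below is a
# hypothetical placeholder; real integrations depend on your own tools.

def fetch_new_transcripts():
    """Pull freshly transcribed calls from your speech-to-text provider."""
    return [{"call_id": "A-17", "text": "Hello, I am calling about my contract..."}]

def process_with_nlp(transcript):
    """Send the raw text to the NLP platform and get structured results back."""
    return {"call_id": transcript["call_id"],
            "labels": ["contract_question"],
            "sentiment": "neutral"}

def push_to_crm(result):
    """Write the extracted fields into the matching CRM record."""
    print(f"Updating CRM record for call {result['call_id']}: {result['labels']}")

def run_pipeline():
    for transcript in fetch_new_transcripts():
        push_to_crm(process_with_nlp(transcript))

if __name__ == "__main__":
    run_pipeline()
```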
In short. You get a specialized NLP platform that requires less time to set up and less time to manage on a daily basis. It’s a win all round.
But the advantages don’t stop there. This might be sounding a bit like an infomercial, but this is the section where I show you that bonus feature that you might not yet have realized that you needed. What is it you ask? Collaboration.
Most NLP projects are managed exclusively by data scientists and developers. For some projects that might be fine, but for many it isn’t. In fact, 85% of NLP projects fail because of the challenge of moving from development into production.
The reality is that, whilst your data scientists and developers might be more interested and comfortable with training an AI model and improving accuracy, your sales or customer success teams will be the ones more interested in the actual results.
That is clearly the case when you look at a problem like getting more value out of the unstructured data from within your CRM. Sure, your data scientists will fall in love with how accurate the Lettria platform is, but it’s your sales and customer support teams that will benefit from the trends that are identified, the processes that are automated, and the time that is saved.
Where other NLPs fail to actively involve non-technical profiles in the project, the Lettria platform encourages participation and makes it simple enough for anyone to use. The no-code approach means that you don’t need to have advanced technical skills in order to use the platform and your non-technical teams will be able to be involved in the project from the very start.
This has a number of benefits:
- It encourages project buy-in, the lack of which is often the downfall of new initiatives. I’m sure that we’ve all seen projects that have failed almost before they’ve begun because of skepticism or an unwillingness to change or adapt from some of the stakeholders.
- Everyone understands what the project aims to do. Non-technical teams are often presented with complicated projects where they are told how incredible the results will be, but they often don’t really understand what is going on or where the data is coming from. By introducing a collaborative platform you ensure that your sales and customer success teams (or whoever it might be) understand what is going on from the very start.
- The project benefits from a range of different skill sets and knowledge. Natural language processing might be the natural domain of data scientists and developers, but they can still benefit from the knowledge that your non-technical teams have. Whether that’s helping them to train the AI model for the right terms or improving how data can be recorded in your CRM, their knowledge can make the project far more efficient and effective.
Collaboration really is the final step in ensuring that you can enrich your CRM using the unstructured data from your transcripts. If you’re looking at NLP solutions that don’t encourage and allow non-technical profiles to take an active role in a project then you are significantly reducing your chances of success.
At Lettria, we’ve not tried to develop an NLP for every use case, but we have tried to develop the best NLP platform for the use cases that we want to address. And we think that we’re pretty good when it comes to handling unstructured textual data and enriching your CRM with the information contained within your call transcripts.
But don’t just take our word for it. Here are a couple of real-world examples from clients that have used Lettria for some very different projects. What do they have in common? The huge challenge of taking large sets of unstructured textual data and extracting key insights from it that can improve processes, increase resource efficiency, and allow them to manage by exception.
Now, before we get there, we’ll address the elephant in the room: both use cases are from French organizations. Lettria is a French company and, whilst we take pride in our advanced multilingual models and our international footprint, some of our most advanced projects are from companies that started using us early on.
Let’s take a closer look.
Solving the problem of unprocessed data in CRMs
Created in 1991, La Poste started as the French national postal service but is now part of Groupe La Poste, which has expanded to include a bank and insurance company (La Banque Postale), a logistics service provider (Geopost), and a mobile network operator (La Poste Mobile).
With annual revenues of over €34 billion, the group now employs around 250,000 people across its various businesses. La Poste has more than 17,000 branches and post offices in France and relies on delivering a consistent level of customer service and experience across its business.
The nature of their business means that they communicate with customers and leads via phone calls on a daily basis. Their call centers and sales teams form the backbone of their business. Like many other organizations, they’ve implemented policies and introduced software that allow them to ensure consistency across those calls, but extracting key data from the calls themselves was a problem that they were struggling to solve.
La Poste’s problem is in no way unique. Any company handling thousands of calls will know how big of a challenge it is to extract key insights from each exchange and incorporate them into their CRM, knowledge base, and business strategy. Only 3% of their calls were being processed and reported to their CRM, so there was little doubt that valuable data was being missed.
When they started to look for a way to solve this problem, they found that natural language processing was not only the only viable option, but actually the perfect solution. After benchmarking various solutions and consulting with Illuin Technology, La Poste decided that they needed the customization typically associated with an open-source solution combined with the advantages offered by a platform approach.
The Lettria platform was exactly what they were looking for and they started their NLP project in September of 2021. Whilst most natural language processing projects take a lot of time to get off the ground, Lettria’s platform helps users to reduce their time-to-market by 75% and La Poste’s proof of concept was released after just 4 months.
Training your own AI model can be incredibly time consuming and resource heavy, but Lettria has focused on optimizing this process by letting you customize its pre-trained models. As a result, the La Poste project required only an average of 2 hours of workshops per week during the development phase and under 100 hours of work to develop a fully customized AI model.
To make things even simpler for La Poste, Lettria developed connectors for both their speech-to-text provider (Allo-Media) and their CRM (Microsoft Dynamics) to simplify integration. This allows Lettria to automatically extract data once it has been processed by Allo-Media and push structured data directly to Microsoft Dynamics.
Although many NLP projects are handled exclusively by data scientists and development teams, the Lettria platform encourages collaboration and the inclusion of business profiles, which helps users to get the most out of their projects.
In the case of La Poste, this meant that project managers, call center supervisors, and salespeople were all included in the process to draw on their knowledge and ensure that the platform was used by every stakeholder that would benefit from the insights that it identified.
In just a matter of weeks, 32 labels had been created and identified within conversations. Those labels refer directly to CRM fields, allowing the AI to train itself on very precise classification patterns. This classification language model, combined with an API, means that raw text can be analyzed and key elements (like topics and entities) can be extracted.
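As a purely illustrative sketch of what “labels that refer directly to CRM fields” can mean in practice, here is a tiny mapping example. The label names and CRM field names are invented; the actual La Poste and Microsoft Dynamics configuration is not public:

```python
# Illustrative only: the labels and CRM field names below are invented, not the
# actual La Poste / Microsoft Dynamics configuration.
LABEL_TO_CRM_FIELD = {
    "address_change": "new_address_requested",
    "complaint_delivery": "open_complaint",
    "interest_insurance": "insurance_lead",
}

def to_crm_update(predicted_labels):
    """Turn the labels predicted for one call into a CRM field update."""
    return {LABEL_TO_CRM_FIELD[label]: True
            for label in predicted_labels
            if label in LABEL_TO_CRM_FIELD}

print(to_crm_update(["complaint_delivery", "interest_insurance"]))
# {'open_complaint': True, 'insurance_lead': True}
```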
Even with the reductions in time-to-market that Lettria offers, it can be easy to focus on the heavy lifting required at the start of any NLP project, but the advantages are clear to see once a project is up and running.
Whereas La Poste were previously only processing 3% of their calls, 75% of calls now have at least one label, and, going forward, Lettria will allow them to automatically process every phone call and input that data directly into their CRM.
This invaluable information has allowed La Poste to develop a more complete and sophisticated understanding of its customers, improve consistency and standardization across phone calls, and adapt its business offerings as a result of business insights that were identified through the analysis.
The La Poste use case is an excellent example of how a business can benefit from enriching their CRM with data from call transcripts. But, more than that, it is also an example of how important it is to choose an NLP option that allows both technical and non-technical profiles to take an active role in the project.
By choosing Lettria, La Poste was able to ensure that sales and customer success teams could contribute to the project, understand its value, and benefit from the insights that were being identified.
After all, as we discussed in the section above, although NLP projects are often thought of as the exclusive domain of data scientists and developers, the end-users of the information that they produce are often those with business profiles.
So that’s one example of how Lettria has been used to extract key information from the unstructured data of call transcripts, but not every company uses our platform for exactly the same use case or challenge.
Improving processes and structuring patient data
The applications for natural language processing to analyze transcripts of spoken interactions actually go far beyond calls from sales teams and customer support. There are instances where processing the unstructured data from other recordings and voice-to-text can also benefit your organization.
Similar to La Poste, Assistance Publique Hôpitaux de Paris (AP-HP) found themselves in a situation where they had more unstructured data than they could handle. With nearly 40 hospitals it is the largest university hospital in Europe and with 100,000 employees, it is also the largest employer in the Paris region.
All of those hospitals add up to a lot of patients. And a lot of patients means a lot of data. All of those patient reports contain crucial information that, in this instance, could very much mean life or death. But you don’t want doctors and nurses tied down generating reports when they could be focusing on the patients themselves.
Since 2020, AP-HP has been running the BoPA innovation chair, which focuses on identifying problems in the operating room in order to find human and technological solutions. This innovative project is based on six major blocks: the Human Factor Block, the Viz Block, the Bot Block, the Light Block, the Touch Block, and the Box Block (a name derived from the black box that is used for flight recordings on airplanes).
Their areas of focus include surgeon-patient communication, surgical image capture, natural language analysis in the operating room, augmented reality using digital twins or fluorescent light, collaborative robotics or cobotics (the design of collaborative robots), and the protection of operating room and patient data.
The project includes collaboration with a number of other organizations and institutions, including INRIA, Institut Mines-Telecom, and Université Paris-Saclay, and brings together leading researchers from the fields of virtual reality, digital twins, and artificial intelligence.
One of the problems that this initiative identified was the fact that patient reports were very difficult to manage. They are generated at every stage of a patient’s journey (pre, peri, and post) and constantly change in format and in their use of technical terms.
Even something as simple as a voice dictation required 3 steps:
- The dictation had to be transcribed by hand by an assistant
- The text then had to be put into a report on the hospital’s letterhead by a separate department
- The report could then be analyzed by a medical secretary to detect key variables about the patient, which could then be added to their Electronic Health Record (EHR)
This process required 25 minutes of human time per report and this was an area where the BoPA innovation chair felt that technology could speed up the process and reduce the amount of resources involved in the analysis and data input.
So, in February 2020, right before the world was about to be plunged into a global pandemic that would put an unprecedented strain on healthcare infrastructure and medical resources, AP-HP decided to work with Lettria to create an AI model specific for the healthcare sector.
This process took only three months, which was an incredible achievement given the circumstances in which it was being developed. With tens of thousands of reports generated each month, natural language processing and automatic language processing (ALP) stand to save AP-HP an incredible amount of time by automating the processing and analysis of each report.
The problem isn’t unique, but the use cases can be
This only goes to show how applications for using natural language processing to enrich CRMs and other software with the information from unstructured data can be found in virtually every industry. This article has very much focused on the call transcript use case as it is the most prevalent, but there are many situations in which the technology can save an organization resources and help it to improve its processes.
It’s actually part of the reason why we create content like this. We know how powerful our solution is, but we don't always know the problems that it can help our users to solve. By putting some examples out there and letting you think about your own business, there’s a good chance that new users will come to us with new use cases that we can help them with.
As with any good software, we’ve tried to create the best tool possible and we’re always happy to see the creative ways in which our users apply it to address their needs. But one thing is for certain: by specializing in the processing of unstructured data and simplifying the integration with all of the third-party tools that you already use, we’ve made the Lettria platform an incredibly powerful solution for finding key information that you’re currently missing.
Start enriching your CRM with information from unstructured data
We’ve explained the problem and shown you the solution, so what are you waiting for? If your company handles a large number of phone calls with leads or customers, then it is almost certain that you can benefit from enriching your CRM with the information from unstructured data.
All too often we fall into the trap of thinking that processes and software solutions are good enough. They’ve got you this far, so why look for something better? But in this instance this is about turning the data that you already have into data that you can actually use.
All of those call transcripts contain business insights and trends that could make the difference for your next campaign, your next quarter, or your next product. And whilst you might already be using some form of natural language processing for some of your business needs, it’s unlikely that it has the capabilities required to make this process simple and automated.
Lettria’s decision to specialize in textual data processing makes it different to many of the off-the-shelf NLPs on the market. Its platform approach and functionalities also make it uniquely suited to use cases like this one.
So if you think that you could benefit from this type of project, then you should be clicking on one of those helpful links that we will have put into this article and getting in touch with our team. We can help you to develop a fully customized NLP platform in a fraction of the time that it would take to develop your own solution or even get a different NLP up and running.
We know, this is a long article with a lot of information in it, but that’s just the kind of thing we specialize in. You’ve handled the analysis on this occasion, let’s automate that process for your call transcripts.