AI (Artificial Intelligence): How could AI change meetings technology for the better?

April 15, 2021 | Rohit Prasad

Artificial Intelligence, or AI, is a key innovation transforming how we think about, build and use technologies. Put simply, AI is the simulation of human intelligence in machines that are programmed to think, learn and solve problems like humans. Interestingly, AI is now being discussed as a way to revolutionise video conferencing tools and applications.

Now that conferencing tools and applications are an essential component of today's more connected, remote-working world, it's natural that people are looking for ways to advance these communication technologies. By integrating AI's smart, human-like capabilities into video-enabled platforms, providers may be able to dramatically improve the user experience, boost the quality of video interactions, eliminate common frustrations and drive efficiencies by autonomously completing the time-consuming administrative tasks associated with virtual meetings.

In this article, we’ll explore how different applications of AI could be added to future video conferencing tools. Specifically, how can natural language processing, computer vision, virtual assistants and analytics turn things up a notch? We also have a personal view from Babl CTO Chris Martin, who talks about what will need to happen for these technologies to work, and how and why people will benefit from smarter meetings technology.


AI natural language processing can be integrated into platforms to improve user experience 

Natural Language Processing (NLP), powered by AI, can offer many benefits when integrated into video meetings technology. For instance, it could allow platforms to recognise an individual by their voice, meaning people can join and even leave meetings with a spoken command: no PINs, typed names or button clicks, all of which can be frustrating. This would give users much-needed convenience.

Another benefit of NLP is its ability to recognise the different languages spoken on a call and translate the conversation into easily followable real-time captions. This allows some participants to speak in their native language while others can still understand and respond without a human translator.

There are other applications of NLP, such as real-time transcription. The system can follow each individual audio channel in a conversation and automatically convert meetings from voice to text, producing full conversational transcripts. It can also recognise what is being discussed on the call and create personalised meeting notes for each attendee, complete with key takeaways and action points.
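To make the transcription idea concrete, here is a minimal Python sketch. It assumes a speech-to-text engine has already produced timestamped text segments for each attendee's audio channel; the `Segment` fields, the function names and the keyword cues are hypothetical simplifications for illustration, not any real platform's API. It merges the channels into one chronological transcript and does a naive keyword pass to pull out possible action points for an attendee.

```python
from dataclasses import dataclass

@dataclass
class Segment:
    speaker: str   # owner of the audio channel (one channel per attendee)
    start: float   # seconds from the start of the meeting
    text: str      # text produced by a speech-to-text engine (assumed)

def build_transcript(segments):
    """Merge per-channel segments into one chronological transcript."""
    ordered = sorted(segments, key=lambda s: s.start)
    return [f"{s.speaker}: {s.text}" for s in ordered]

def extract_actions(segments, attendee):
    """Naive keyword pass: collect lines that seem to assign work to an attendee."""
    cues = ("will", "to do", "action", "follow up")
    return [s.text for s in segments
            if attendee.lower() in s.text.lower()
            and any(c in s.text.lower() for c in cues)]

segs = [
    Segment("Alice", 0.0, "Welcome everyone."),
    Segment("Bob", 12.5, "Alice will follow up with the vendor."),
]
print(build_transcript(segs))        # chronological, speaker-labelled lines
print(extract_actions(segs, "Alice"))
```

A production system would replace the keyword cues with a trained intent model, but the per-channel structure is what makes speaker attribution trivial compared with transcribing one mixed audio stream.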

Chris Martin says: “NLP technology is getting better; these are learning systems that improve over time. I think NLP is getting to a stage now where it can seriously improve the user experience when conferencing. Voice commands already work so well with smartphones, and I’m sure they will provide equal value for getting people into meetings more quickly. But for voice commands to be a hit, we have to get them to a level of accuracy that we trust; the learning systems need to learn the context of the individual better to work optimally. As for other NLP applications such as real-time translation, it could be great for calls with multilingual participants, and this kind of sophistication opens the door to collaboration with a greater pool of people across the globe for whom English may not be a first language. But this is more complicated: the engine needs to understand the speech and convert its linguistic form, so there will always be delays that can impact the user experience, and it will need to overcome nuances where some languages do not translate directly into English. So there is a lot to improve, but certainly a lot of encouragement.”


AI personal assistants and AI-driven analytics can be integrated into platforms to automate scheduling and handle simple administrative tasks

Put simply, AI personal assistants can analyse, learn and make intelligent decisions for you. When using conferencing platforms, one of the biggest frustrations is the time-consuming administration associated with scheduling and rescheduling meetings. AI assistants within conferencing platforms could track usage data, such as when a person or team tends to meet during the week, at what times, how long those meetings typically last and whether they tend to overrun, and then use those insights to automate the scheduling and rescheduling of regular calls. Similarly, these assistants may recognise when a meeting request is made, or when someone asks to reschedule a call via email, and proceed to schedule or reschedule automatically before sending adjusted invitations to the relevant people.
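As a minimal sketch of the scheduling idea above, assume the assistant has a history of past meeting start times for a team (the function name and the simple frequency heuristic are illustrative assumptions, not a real product API): it learns the team's most common weekday-and-hour slot and proposes the next future occurrence of that slot.

```python
from collections import Counter
from datetime import datetime, timedelta

def suggest_slot(history, now):
    """Learn the most common (weekday, hour) slot from past meeting
    start times and return the next future datetime matching it."""
    (weekday, hour), _ = Counter(
        (m.weekday(), m.hour) for m in history
    ).most_common(1)[0]
    # Start from today at the learned hour, then walk forward day by
    # day until we land on the learned weekday, strictly in the future.
    candidate = now.replace(hour=hour, minute=0, second=0, microsecond=0)
    while candidate.weekday() != weekday or candidate <= now:
        candidate += timedelta(days=1)
    return candidate

# The team usually meets on Tuesdays at 10:00...
history = [datetime(2021, 4, 6, 10, 0), datetime(2021, 4, 13, 10, 5)]
# ...so on Thursday 15 April the assistant proposes Tuesday 20 April, 10:00.
print(suggest_slot(history, datetime(2021, 4, 15, 9, 0)))
```

A real assistant would also check the diaries of all invitees for conflicts; this sketch only shows how tracked usage data can turn into a concrete suggestion.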

In addition, AI assistants can make predictions based on the nature of your meetings, suggesting files and documents you may need before calls or even pulling up relevant web search results for the conversations you’re having in real time. This all helps to facilitate better, more productive interactions.

Chris Martin says: “I see a lot of sense in AI assistants using insights and analytics to complete tedious tasks for conferencing users. We often have a meeting and will then follow up with people to say let’s meet next week. In that instance, because the system knows from the data it tracks that I only meet on certain days at certain times, it can access my diary and proceed to suggest times to schedule that follow-up meeting for me. This kind of application helps you save time and focus your energy on the things that matter rather than on less important elements like meeting administration. In a world where productivity and efficiency are key, this is big. Again, though, a lot will depend on learning and how much the AI engine can gather to make the most intelligent and accurate decisions.”


AI computer vision can be integrated into platforms to improve the quality of video interactions

Computer vision combined with AI can drive massive improvements to video-based meetings. Specifically, it can make real-time corrections to people’s video: automatically detecting and stabilising shaky webcams, recognising when faces are out of the frame or too close to the camera and reframing them, and identifying poor lighting and applying automatic brightening and light-balance optimisation so faces are easier to detect. Another common issue for participants on a video call is poor video resolution caused by low bandwidth. With AI-enabled computer vision, these problems can be masked with auto-correction, presenting a higher-quality video stream to everyone on the call.
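As a toy illustration of the lighting correction described above, here is a deliberately simplified sketch: real pipelines use learned models on colour frames, and `auto_brighten` is a hypothetical name, not a real library call. It gamma-corrects a grayscale frame whose mean luminance falls below a target level, which is the basic idea behind automatic brightening.

```python
import math

def auto_brighten(frame, target=0.5):
    """Gamma-correct a grayscale frame (pixel values in [0, 1]) so that
    its mean luminance approaches `target`. Chosen so that a pixel at
    the current mean maps exactly onto the target: mean ** gamma == target."""
    pixels = [p for row in frame for p in row]
    mean = sum(pixels) / len(pixels)
    if mean <= 0 or mean >= target:
        return frame  # already bright enough (or fully black: nothing to do)
    gamma = math.log(target) / math.log(mean)  # gamma < 1 brightens
    return [[p ** gamma for p in row] for row in frame]

dark = [[0.1, 0.2], [0.1, 0.2]]   # under-lit frame, mean 0.15
print(auto_brighten(dark))        # every pixel lifted towards target
```

The same closed-loop pattern (measure a quality metric, compute a correction, apply it per frame) underlies webcam stabilisation and reframing too; only the metric and the transform change.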

Chris Martin says: “This is very interesting and could be a great advancement for eliminating all the little issues we experience on video calls. There are so many examples where people haven’t quite got their heads around video conferencing best practices, but with computer vision stabilisation people don’t have to be as particular or mindful about their environment, their camera set-up or the strength of their internet connection, because it will be auto-corrected by AI. While I think computer vision has lots of benefits for improving the quality of video interactions, I wonder how far the correction goes. Will it actually get to the point that when a participant’s video fails, the system can produce a ‘fake’ video stream with that person’s face? I look forward to seeing more in this space and expect computer vision will become more prevalent in video conferencing technologies because of the obvious benefits.”


For more information on AI, check out this executive guide.

If you’re interested in learning about the causes of video meeting fatigue and how to overcome it, read our latest blog.