Saturday, February 10, 2018

Don't Build your NLP Bot like a GUI


There has been an ongoing buzz about conversational user interfaces (CUIs) and how they can enhance human-to-computer interaction. Many of us have experienced them to some extent, both at play with voice interfaces such as Siri and Alexa and at work with messaging interfaces in tools like Slack. When CUIs are infused with a good dose of NLP (plus context awareness, machine learning, etc.), they have the potential to become more than a primitive command line or a simple-minded messaging/voice interface. The potential for building NLP-powered virtual assistants, and benefiting from the productivity they can provide, does exist, but you will have to get out of your comfort zone as a developer.

The Mental Leap to CUIs

Building CUIs with NLP and AI can take some getting used to when you have spent your career or education building graphical user interfaces (GUIs). Be wary of getting caught in the trap of building your natural language interactions the way you would build a GUI. A virtual assistant, which is how many NLP bots function, can behave in many of the same ways a human working on your behalf would. A GUI is a very capable and functional interface, but it is not a virtual human. You have to keep reminding yourself that the bot you are building needs to behave like a virtual human, because it is easy to fall back into building your NLP virtual assistant like a traditional GUI. If you do, you end up with the worst of both worlds.

Not that there is Anything Wrong with GUIs

Now, there is nothing wrong with GUIs. For some tasks and functions they will always be superior to natural language. But for many things we do in our daily lives, language is the superior medium, especially if the entity you are engaging with is intelligent.

Fundamentally, how you ask (via text or voice) a virtual assistant to perform an action or make an inquiry on your behalf is different from using a GUI. The way you reach a decision or action point in a CUI can differ from the way you reach it in a GUI. Let's take a very simple example to compare.

Say you are using a business application to manage pending requests that you must act upon on a daily basis, and there are different types or classes of requests to review. Some requests are specific to you, some to your direct team members, and others are company-wide actions, but you are required to review and make a decision on all of them at some point during the day or week.

In a GUI you might navigate to a screen with quick filters that let you select which request types to view, or all requests might be shown in some sorted or grouped order on a scrolling page, and you would navigate through the information to decide what to do first. There are many ways to represent the requests and perform filtering and sorting. It also depends on your preferences for which tasks you like to tackle first and how you manage your day. The ways to make the GUI optimal for your particular usage pattern depend on many factors, many of which are specific to you. This is where GUIs begin to break down. They can overwhelm us with information or fail to adapt to our unique usage patterns - basically, they lack the ability to easily achieve the extreme personalization you can reach with a CUI.
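To make the comparison concrete, here is a rough sketch of the GUI-style approach: a filter-and-sort routine over pending requests. The request types, field names, and dates are made up purely for illustration.

    # A minimal sketch of the GUI approach: the user ticks filter checkboxes,
    # the screen filters and sorts whatever matches. Names here are hypothetical.
    from dataclasses import dataclass
    from datetime import date

    @dataclass
    class Request:
        request_id: int
        request_type: str   # e.g. "personal", "team", "company"
        due: date

    def filter_and_sort(requests, selected_types):
        """Return only the requests whose type the user selected, earliest due date first."""
        visible = [r for r in requests if r.request_type in selected_types]
        return sorted(visible, key=lambda r: r.due)

    requests = [
        Request(1, "personal", date(2018, 2, 12)),
        Request(2, "team", date(2018, 2, 11)),
        Request(3, "company", date(2018, 2, 14)),
    ]

    # The user clicks the "personal" and "team" filter checkboxes in the GUI.
    for r in filter_and_sort(requests, {"personal", "team"}):
        print(r.request_id, r.request_type, r.due)

Notice that all the intelligence here lives in the user: they still have to decide which filters to click and what to do first.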

All Things being Equal CUIs Win

Now, this all depends on your implementation of the GUI and the CUI. You can obviously build a horrible CUI and a wonderful, ultra-personalized GUI, but all things being equal I claim that a CUI will always be superior. The advantage of the CUI is that the user experience does not have to change significantly to improve personalization. Conversational interaction is inherently natural to all humans. So as you improve the conversational flow of your virtual assistant and release new versions of the bot, the user can adapt much more easily, because the flow is fundamentally the same: the interaction is still a conversation, and it is the bot that is getting smarter, assisting you through the same conversational experience to help you accomplish your objectives.

So let's go back to the request/action application we described earlier. In the CUI case, a smart virtual assistant might look at all the pending requests, tell you how many of each type are pending, and suggest you start with the requests from your direct reports first if, say, there is only one of those. The ability of the CUI to branch off in different directions is much more dynamic than what a GUI can do, and this can be done in a way that does not force the user to learn a brand new interface with each new release of the software, since the virtual assistant is your interface and your personal guide at the same time. The CUI has a sort of built-in help capability, since it is a virtual assistant by design.
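Here is a minimal sketch of how that conversational turn might be generated. The summarize-then-suggest rule (start with the smallest queue) and the request types are hypothetical, just one way the assistant could steer the conversation.

    from collections import Counter

    def suggest_next_step(pending_requests):
        """Summarize pending requests by type and suggest where to start."""
        counts = Counter(r["type"] for r in pending_requests)
        if not counts:
            return "You have no pending requests. Nice work!"
        summary = ", ".join(f"{n} {t}" for t, n in counts.items())
        start_with, n = min(counts.items(), key=lambda kv: kv[1])
        if n == 1:
            hint = f"There is only one {start_with} request - want to start with that one?"
        else:
            hint = f"Shall we start with the {start_with} requests?"
        return f"You have {summary} requests pending. {hint}"

    pending = [{"type": "direct report"}, {"type": "company"}, {"type": "company"}]
    print(suggest_next_step(pending))

The point is that the same conversational turn can get smarter over time (better ranking, better context) without the user having to learn a new screen.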

Mixed Mode CUI and GUI

Many of the popular bot platforms, such as Slack and Facebook Messenger, have incorporated convenient visual GUI components into their messaging flows, which allow developers to mix conversational question/answer dialog with shortcut GUI actions and interactions. This can be a great way to meld conversation with GUI, but it can also suck CUI developers back into the GUI world and tempt them to inject too many GUI interactions into the CUI flow. So use these tools wisely and keep the interactions focused on the conversational flow between the user and the human-like virtual assistant.
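As a rough illustration of keeping GUI elements as accelerators rather than the interface itself, the sketch below pairs a conversational question with a couple of tap-able shortcuts. The payload shape is a simplified stand-in, not the exact Slack or Messenger schema; treat the field names as illustrative only.

    def build_reply(question, shortcut_labels):
        """Pair a conversational question with a few tap-able shortcuts,
        keeping the buttons as accelerators for the conversation."""
        return {
            "text": question,
            "quick_replies": [
                {"title": label, "payload": label.upper().replace(" ", "_")}
                for label in shortcut_labels
            ],
        }

    message = build_reply(
        "You have 3 company-wide requests left. Want to review them now?",
        ["Review now", "Remind me later"],
    )
    print(message)

The conversation still leads; the buttons just save a few keystrokes for the most likely answers.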

Grow your Bot Like You are Raising a Child

So be careful: always keep in the forefront of your mind that you are building a smart virtual assistant, not a GUI. It is easy to get caught in the trap of building a messaging-driven GUI. You will not have all the smarts built into the bot from day one, but make it your mission to keep the conversational flow focused on the interactions and dialog between the human user and the virtual assistant, and everything else will follow as your bot gets smarter.

Saturday, February 3, 2018

Can AI Put the Human Back into HR?


Over the past couple of decades, Human Resources (HR) has moved from paper and manual processes to more automation, providing employees with more and more web- and mobile-powered experiences. But has HR lost its human essence and personalization in the process? Nowadays, managing a company's "human" assets seems to involve less and less human interaction with the HR organization, or with an HR person when you need one.

In the Beginning there were Humans

I recall that in my first few jobs, at both major corporations and startups, there was always an HR person I could reach on the phone or simply walk over to in their office to get small and big matters resolved. HR personnel were typically visible and accessible. Human Resources representatives interacted at a personal level with employees and often knew you on a first-name basis.

It seems that with more technology, HR has lost its human-to-human interaction and become more impersonal and mechanical. Today, if you need a payroll or benefits issue resolved, you typically have to submit a "ticket" in some online system and wait for someone (whom you have never met) to contact you back over email or, if you are lucky, over the phone. Or you can click your way through a labyrinth of HR GUI applications to find what you need.

Virtual Assistants are the New Humans

So what is the solution, more or less technology? Maybe the answer is better technology. AI-powered HR virtual assistants have the potential to be your personalized guide for everything from answering general HR questions to requesting time off, helping you quickly find the information you need or the actions you need to take. Machine-learning-powered intelligent assistants could help resolve problems and questions with your payroll, for example. Interactive NLP-powered bots could guide you through an often complicated benefits and open enrollment process, using your preferences and history to match you with the optimal recommendations for your particular situation.
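To make this a bit more concrete, here is a toy sketch of how such an assistant might route an employee's request to the right handler. The intent names, keyword matching, and canned responses are placeholders; a real assistant would use a trained NLP intent classifier and back-end integrations instead.

    # A minimal sketch of intent routing for an HR assistant. Keyword matching
    # stands in for a real intent classifier; intent names are hypothetical.
    def classify_intent(utterance):
        text = utterance.lower()
        if "time off" in text or "vacation" in text:
            return "request_time_off"
        if "payroll" in text or "paycheck" in text:
            return "payroll_issue"
        if "benefits" in text or "enrollment" in text:
            return "benefits_help"
        return "general_question"

    HANDLERS = {
        "request_time_off": lambda: "Sure, how many days would you like, and when?",
        "payroll_issue": lambda: "I can look into that. Which pay period is affected?",
        "benefits_help": lambda: "Last year the mid-tier plan fit you best. Want to compare it with this year's options?",
        "general_question": lambda: "Let me check the HR knowledge base for you.",
    }

    print(HANDLERS[classify_intent("I need to request time off next Friday")]())

The value is not the routing itself, but that the employee gets a human-feeling answer instead of a ticket number.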

The Future is an AI Powered Voice and Messaging-First World

The future of HR is not more technology, but more intelligent technology powered by machine learning, natural language processing, personalized recommendation engines, and other AI-enabled technologies that bring hyper-personalization and a human-to-computer interaction model that goes beyond impersonal graphical user interfaces. Voice and messaging-first interfaces (endowed with NLP) are a step in the right direction and can bring a bit of humanity back to your technology-overloaded workplace.

Sunday, January 7, 2018

Recommending Actions for Your NLP Bot


At first glance, the machine learning methods typically used in NLP applications (such as chatbots) and those used in recommender systems (for recommending products) have little to do with each other, and they are not often leveraged together in the same application.

NLP is the machine learning domain that makes your virtual assistant capable of engaging in human-language conversation, while recommender systems, as the name suggests, recommend products or services you will hopefully like (thus saving you the trouble of discovering them on your own). But it is not often that you see NLP and recommender systems together.

Where Conversational UI Meets Recommendations

But let's think about that for a moment. Is there a solution space where NLP and recommender systems intersect, and why would anyone want to combine them? I will make the case that every so-called AI-powered virtual assistant (aka chatbot and their kin) needs context, and part of that context can be provided by a personalized recommender system that helps guide the conversation and streamline the conversational user experience.

A Messaging First World

We are in the midst of a messaging application revolution. A new generation of users is making messaging-based applications their preferred medium for communicating with the people, places, and things around them, especially in the digital (and contextual) world. And there is no lack of applications, from fintech to social, leveraging and rediscovering the command line as the new mode (or not-so-new mode, for many command-line geeks) of communication between humans and computers.

Your Next Question or Answer is a Recommendation Away

OK, so where do recommender systems fit into the world of NLP and conversationally driven user interfaces? Well, conversational applications are not without their own challenges. Typing and speaking take effort from both the user and the virtual assistant in order to engage in timely and efficient interaction. But what if your virtual assistant knew what you wanted to do next (or what you might or should like to do next)? What if your NLP-powered bot could suggest actions you might want to take and save you the trouble of verbalizing them - perhaps giving you a quick one-click shortcut (which can be voice powered as well) to drive and continue the conversation?

This is where recommender systems can play a vital role in making your virtual assistant not just clever at understanding intent and recognizing named entities from voice or text, but also able to present a sense of intelligence by remembering your past behavior and predicting what you might do next (or should do next) by relating your behavior to what others in similar roles and situations have done. So even with no prior knowledge of "you", the virtual assistant might prescribe next actions based on what others in similar roles and situations have done. Does that not sound like a recommender system?
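A toy sketch of that idea might look like the following: when the assistant knows nothing about you, it falls back on what people in a similar role tend to do next. The roles, actions, and history are invented for illustration; a real system would learn these patterns from interaction logs.

    from collections import Counter

    HISTORY = {
        # role -> actions observed for other users in that role (hypothetical)
        "manager":  ["approve_timesheets", "review_requests", "approve_timesheets"],
        "engineer": ["log_hours", "request_time_off", "log_hours"],
    }

    def recommend_next_action(role, own_history):
        """Prefer the user's own most frequent action; fall back to the role's."""
        pool = own_history if own_history else HISTORY.get(role, [])
        if not pool:
            return None
        return Counter(pool).most_common(1)[0][0]

    # A brand-new manager with no personal history gets the role-level suggestion.
    print(recommend_next_action("manager", own_history=[]))

The bot can then offer that action as a one-tap (or one-word) shortcut instead of waiting for the user to spell it out.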

Prescribing vs Predicting

Recommender systems are inherently about prescribing things (which can include actions, not just items) applicable to your context at a given point in time (time being a critical piece of context as well). I foresee a future where both business- and consumer-oriented virtual assistants and NLP bots will leverage highly personalized recommender systems to take human-to-computer interaction to the next logical evolution (as promised by many sci-fi books and movies :)

Matrix Factorization and LSTMs are Your Friend

So for all you NLP bot developers, make things like matrix factorization and collaborative filtering your friends. Hybrid recommender systems based on collaborative filtering and content-based filtering (product and customer metadata) have been the state of the art for the past few years (since the Netflix contest). However, the future of recommender systems will be powered by deep learning and concepts like LSTMs and user and item embeddings. Research in this space is evolving fast, and a mix of shallow and deep learning techniques is racing to enable this world of intelligent NLP bots and efficient conversational user interfaces.
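For instance, here is a toy matrix-factorization sketch over a user-by-action interaction matrix, trained with plain gradient descent. The data and hyperparameters are invented for illustration; a production system would use a proper ALS/SGD library or the deep learning approaches mentioned above.

    import numpy as np

    rng = np.random.default_rng(0)
    R = np.array([  # rows: users, cols: actions; 1 = user performed the action
        [1, 0, 1, 0],
        [1, 1, 0, 0],
        [0, 0, 1, 1],
    ], dtype=float)

    k, lr, reg = 2, 0.05, 0.01                        # latent dims, learning rate, L2 penalty
    U = rng.normal(scale=0.1, size=(R.shape[0], k))   # user factors
    V = rng.normal(scale=0.1, size=(R.shape[1], k))   # action factors

    for _ in range(500):
        err = R - U @ V.T                 # reconstruction error
        U += lr * (err @ V - reg * U)     # gradient step on user factors
        V += lr * (err.T @ U - reg * V)   # gradient step on action factors

    scores = U @ V.T                      # predicted affinity of each user for each action
    print(np.round(scores, 2))            # higher score = more likely "next action"

The learned factors are exactly the kind of embeddings an NLP bot can consult mid-conversation to decide which action to suggest next.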