Saturday, February 10, 2018

Don't Build your NLP Bot like a GUI

There has been an ongoing buzz about conversational user interfaces (CUI) and how they can enhance human-to-computer interaction. Many of us have experienced them to some extent, both at play with voice interfaces such as Siri and Alexa and at work with messaging interfaces in tools like Slack. When CUIs are infused with a good dose of NLP (and are context aware, machine learning driven, etc.), they have the potential to become more than just a primitive command line or a simple-minded messaging/voice interface. The potential for building NLP-powered virtual assistants, and benefiting from the productivity they can provide, does exist, but you will have to get out of your comfort zone as a developer.

The Mental Leap to CUIs

Building CUIs using NLP and AI can take some getting used to when you have spent your career or education building graphical user interfaces (GUI). Be wary of getting caught in the trap of building your natural language interactions the way you would build a GUI. A virtual assistant, which is how many NLP bots function, can behave in many of the same ways as another human working on your behalf. A GUI is a very capable and functional interface, but it is not a virtual human. You have to keep reminding yourself that the bot you are building needs to behave like a virtual human, because it is easy to fall back into building your NLP virtual assistant like a traditional GUI. If you do, you end up with the worst of both worlds.

Not that there is Anything Wrong with GUIs

Now there is nothing wrong with GUIs. For some tasks and functions they will always be superior to natural language. But for many things we do in our daily lives, language is a superior medium, especially when the entity you are engaging with is intelligent.

Fundamentally, how you would ask (via text or voice) a virtual assistant to perform an action or make an inquiry on your behalf differs from using a GUI. The way you reach a decision or action point in a CUI can be different than in a GUI. Let's take a very simple example to compare.

Say you are using a business application to manage pending requests that you must act upon on a daily basis, and there are different types or classes of requests you must review. Some requests are specific to you, some to your direct team members, and others are company-wide actions, but you are required to review and make a decision on all of them at some point during the day or week.

In a GUI you might navigate to a screen with quick filters that let you select which request types to view, or all requests might be shown in some sorted or grouped order on a scrolling page, and you would navigate through the information to decide what to do first. There are many ways to represent the requests and perform filtering and sorting. It also depends on your preferences for what tasks you like to tackle first and how you manage your day. The ways to make the GUI optimal for your particular usage pattern depend on many factors, many of which are specific to you. This is where GUIs begin to break down. They can sometimes overwhelm us with information or fail to adapt to our unique usage patterns - basically they lack the ability to easily adapt to the extreme personalization you can achieve with a CUI.

All Things being Equal CUIs Win

Now this all depends on your implementation of the GUI and CUI. You can obviously build a horrible CUI and a wonderful, ultra-personalized GUI, but all things being equal I claim that a CUI will always be superior. The advantage the CUI has is that the user experience does not have to change significantly to improve personalization. Conversational interaction is inherently natural to all humans. So as you improve the conversational flow of your virtual assistant and release new versions of the bot, the user can adapt much more easily, because the flow is fundamentally the same: the interaction is still a conversation, and it is the bot that is getting smarter and better while assisting you through the same conversational experience to accomplish your objectives.

So let's go back to the request/action application we described earlier. In the CUI case, a smart virtual assistant might look at all the pending requests, tell you how many pending requests of each type you have, and suggest you start with your direct reports first if, say, there is only one of those. The ability of the CUI to branch off in different directions is much more dynamic than what a GUI can do, and this can be done in a way that does not force the user to learn a brand new interface with each new release of the software, since the virtual assistant is your interface and your personal guide at the same time. The CUI has a sort of built-in help capability, since it is a virtual assistant by design.
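To make that concrete, here is a minimal sketch of the kind of triage suggestion a virtual assistant might open with. Everything here is illustrative: the request types and the "smallest queue first" rule are assumptions for the example, not a real bot framework.

```python
from collections import Counter

def suggest_opening_message(pending_requests):
    """Summarize pending requests and suggest where to start.

    pending_requests: list of dicts like {"type": "direct report"}.
    """
    counts = Counter(r["type"] for r in pending_requests)
    if not counts:
        return "You have no pending requests. Nice work!"
    # Illustrative heuristic: start with the smallest queue for a quick win.
    smallest = min(counts, key=counts.get)
    summary = ", ".join(f"{n} {t}" for t, n in counts.items())
    return (f"You have {sum(counts.values())} pending requests ({summary}). "
            f"Shall we start with the {smallest} queue? It only has "
            f"{counts[smallest]} item(s).")
```

The point is not the heuristic itself but that the bot, not the user, does the filtering and sorting a GUI would have pushed onto the screen.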

Mixed Mode CUI and GUI

Many of the popular bot platforms such as Slack and Facebook Messenger have incorporated some very convenient visual GUI components into their messaging flows, which allow developers to mix conversational question/answer dialog with shortcut GUI actions and interactions. This can be a great way to meld the conversational with the graphical, but at the same time it can suck CUI developers back into the GUI world and get them focusing on injecting too many GUI interactions into the CUI flow. So use these tools wisely and keep the interactions focused on the conversational flow between the user and the human-like virtual assistant.

Grow your Bot Like You are Raising a Child

So be careful: always keep in the forefront of your mind that you are building a smart virtual assistant and not a GUI. It is easy to get caught in the trap of building a messaging-driven GUI. You will not have all the smarts built into the bot from day one, but make it your mission to keep the conversational flow focused on the dialog between the human user and the virtual assistant, and everything else will follow as your bot gets smarter.

Saturday, February 3, 2018

Can AI Put the Human Back into HR?

Over the past couple of decades Human Resources (HR) has moved from paper and manual processes to more automation, providing employees with more and more web- and mobile-powered experiences. But has HR lost its human essence and personalization in the process? Nowadays managing a company's "human" assets seems to involve less and less human interaction with the HR organization, or with an HR person when you need one.

In the Beginning there were Humans

I recall that in my first few jobs, at both major corporations and startups, there was always an HR person I could reach on the phone, or I could simply walk over to their office to get small and big matters resolved. HR personnel were typically visible and accessible. Human Resources representatives interacted at a personal level with employees and often knew you on a first-name basis.

It seems that with more technology, HR has lost its human-to-human interaction and become more impersonal and mechanical. Today, if you need some payroll or benefits issue resolved, you typically need to submit a "ticket" in some online system and wait for someone (you have never met) to contact you back over email or, if you are lucky, over the phone. Otherwise you can click your way through a labyrinth of HR GUI applications to find what you need.

Virtual Assistants are the New Humans

So what is the solution, more or less technology? Maybe the answer is better technology. AI-powered HR virtual assistants have the potential to be your personalized guide, doing everything from answering general HR questions to requesting time off and helping you find the information or actions you need quickly. Machine-learning-powered intelligent assistants could help resolve problems and questions with your payroll, for example. Interactive NLP-powered bots could guide you through an often complicated benefits and open-enrollment process, using your preferences and history to match you with the optimal recommendations for your particular situation.
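As a sketch of how such an assistant might route employee requests, here is a toy intent classifier. In a real system an NLP service would do the classification; the keyword rules and intent names below are placeholders I made up for illustration.

```python
def classify_intent(utterance):
    """Map an employee's message to a (hypothetical) HR intent."""
    u = utterance.lower()
    if "time off" in u or "vacation" in u:
        return "request_time_off"
    if "payroll" in u or "paycheck" in u:
        return "payroll_issue"
    if "benefits" in u or "enrollment" in u:
        return "benefits_question"
    return "general_question"  # fall back to a human or an FAQ search
```

A real HR bot would replace these rules with a trained intent model and add entity extraction (dates, amounts, plan names) on top, but the routing shape stays the same.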

The Future is an AI Powered Voice and Messaging-First World

The future of HR is not more technology, but more intelligent technology powered by machine learning, natural language processing, personalized recommendation engines, and other AI-enabled technologies that bring hyper-personalization and a human-to-computer interaction model that goes beyond impersonal graphical user interfaces. Voice and messaging-first interfaces (endowed with NLP) are a step in the right direction and can bring a bit of humanity back to your technology-overloaded workplace.

Sunday, January 7, 2018

Recommending Actions for Your NLP Bot

At first glance, the machine learning methods typically used in NLP applications (such as chatbots) and those used in recommender systems (for recommending products) seem unrelated, and indeed they are not often leveraged together in the same applications.

NLP is the machine learning domain that makes your virtual assistant capable of engaging in human-language conversation; recommender systems, as the name suggests, recommend products and services you will hopefully like (thus saving you the trouble of discovering them on your own). But it is not often that you see NLP and recommender systems together.

Where Conversational UI Meets Recommendations

But let's think about that for a moment. Is there a solution space where NLP and recommender systems intersect, and why would anyone want to combine them? I will make the case that every so-called AI-powered virtual assistant (aka chatbots and their kin) needs context, and part of that context can be provided by a personalized recommender system that helps guide the conversation and streamline the conversational user experience.

A Messaging First World

We are in the midst of a messaging application revolution. A new generation of users are making messaging-based applications their preferred medium for communicating with the people, places and things around them, especially when it comes to the digital world. And there is no lack of applications, from fintech to social, leveraging and rediscovering the command-line interface as the new mode (or not-so-new mode, for many command-line geeks) of communication between humans and computers.

Your Next Question or Answer is a Recommendation Away

Ok, so where do recommender systems fit into the world of NLP and conversation-driven user interfaces? Well, conversational applications are not without their own challenges. Typing and speaking take effort from both the user and the virtual assistant in order to engage in timely and efficient interaction. But what if your virtual assistant knew what you wanted to do next (or what you might, or should, like to do next)? What if your NLP-powered bot could suggest actions you might want to take and save you the trouble of verbalizing them - maybe give you a quick one-click (or voice-powered) shortcut to drive and continue the conversation?

This is where recommender systems can play a vital role in making your virtual assistant not just clever at intent understanding and named entity recognition from voice or text, but also able to present a sense of intelligence by remembering your past behavior or predicting what you might (or should) do next by relating your behavior to what others in similar roles and situations have done. So even with no prior knowledge of "you", the virtual assistant might prescribe next actions based on what others in similar roles and situations have done. Does that not sound like a recommender system?
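A first cut at this kind of "what next" recommendation does not require deep learning at all. The sketch below pools the action sequences of users in the same role and recommends the most common follow-up action - roughly a role-based, co-occurrence flavor of collaborative filtering. The data shapes and the role/action names are assumptions for the example.

```python
from collections import Counter, defaultdict

def build_next_action_model(logs):
    """logs: list of (role, [ordered action names]) tuples."""
    model = defaultdict(Counter)
    for role, actions in logs:
        # Count which action tends to follow which, per role.
        for current, nxt in zip(actions, actions[1:]):
            model[(role, current)][nxt] += 1
    return model

def recommend_next(model, role, current_action, k=1):
    """Top-k next actions for users in this role, given the current action."""
    ranked = model.get((role, current_action), Counter()).most_common(k)
    return [action for action, _ in ranked]
```

With a model like this, a bot greeting a brand-new manager who just opened their inbox can still suggest a next step, because other managers' histories stand in for the missing personal history.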

Prescribing vs Predicting

Recommender systems are inherently about prescribing things (which can include actions, not just items) applicable to your context at a given point in time (time being a critical piece of context as well). I foresee a future where both business- and consumer-oriented virtual assistants and NLP bots will leverage highly personalized recommender systems to take human-to-computer interaction to the next logical evolution (as promised by many sci-fi books and movies :)

Matrix Factorization and LSTMs are Your Friend

So for all you NLP bot developers, make things like matrix factorization and collaborative filtering your friends. Hybrid recommender systems based on collaborative filtering and content filtering (product and customer metadata) have been the state of the art for the past few years (since the Netflix contest). However, the future of recommender systems will be powered by deep learning and concepts like LSTMs and product/item embeddings. Research in this space is evolving fast. A mix of shallow and deep learning techniques is racing to enable this world of intelligent NLP bots and efficient conversational user interfaces.
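For the curious, here is a minimal, pure-Python sketch of matrix factorization trained with stochastic gradient descent - the core idea behind classic collaborative filtering. The hyperparameters are illustrative, not tuned, and a production system would use a proper library rather than hand-rolled loops.

```python
import random

def factorize(ratings, n_users, n_items, k=2, lr=0.05, reg=0.01, epochs=1000):
    """ratings: list of (user, item, value) triples; returns factor matrices."""
    random.seed(0)
    P = [[random.random() * 0.1 for _ in range(k)] for _ in range(n_users)]
    Q = [[random.random() * 0.1 for _ in range(k)] for _ in range(n_items)]
    for _ in range(epochs):
        for u, i, r in ratings:
            pred = sum(P[u][f] * Q[i][f] for f in range(k))
            err = r - pred
            for f in range(k):
                pu, qi = P[u][f], Q[i][f]
                # Gradient step with L2 regularization on both factors.
                P[u][f] += lr * (err * qi - reg * pu)
                Q[i][f] += lr * (err * pu - reg * qi)
    return P, Q

def predict(P, Q, u, i):
    """Predicted affinity of user u for item i (dot product of factors)."""
    return sum(pf * qf for pf, qf in zip(P[u], Q[i]))
```

Swap "items" for "actions" and the same machinery scores which action a bot should suggest next.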

Monday, August 28, 2017

AI vs Paradox of Choice

The paradox of choice is a problem we see more and more of in our modern world. It goes beyond what products Amazon should recommend or what friends Facebook should suggest. In the business world and in enterprise applications this is also a challenging problem, as our applications and processes grow in complexity. The potential for machine-learning-powered recommender systems to augment human decision making is one of the next frontiers for AI in the enterprise.

Recommender systems can do more than just suggest what articles you should read on LinkedIn or what jobs are most suited for you. In the future, machine learning (and more likely deep learning) powered recommender systems will guide enterprise decision making by helping business process owners take the most effective actions and decisions in a timely manner and with hyper-personalization. And as with all ecosystems, once you introduce a new input variable (in this case the personalization/recommender system itself), it will affect the future human or system behavior that you are personalizing for today - it is a moving target.

Recommender systems will move from solving B2C optimization problems (how they are typically used today in our data-saturated and over-marketed world) to solving problems in B2B and enterprise applications. Ultimately recommender systems are about prescribing (not really predicting) an optimal decision at the right time and place/context, so they can naturally deal with a variety of B2B scenarios, from optimizing workflow paths and streamlining supply chain actions to augmenting human decisions in common day-to-day business operations. Enterprise decision makers are in vital need of these AI superpowers. Stay tuned, they are coming :)

Sunday, August 20, 2017

Messaging-First Applications with Slack or FB Workplace?

Conversational UX design is evolving as more and more apps begin to incorporate conversational UI functionality. While the concept of a messaging-centric UI can seem simple, crafting a messaging-first user experience is nothing to underestimate. Conversational user interfaces can be simple for humans to interact with (you are just chatting back and forth); however, blending in and balancing rich visualization and complex interactions is not simple to get right. Just like any other UX, it is a balance of minimalism while allowing for rich expressiveness in the UI without overwhelming the user.

Slack is one of the leading platforms for building bots, especially for enterprise applications. However, Slack is still missing a number of bot conversational UX features relative to other platforms such as FB Messenger and FB Workplace.

To give some perspective, here is my compiled list of features I would like to see in Slack's bot framework to improve its messaging UX and bring it on par with platforms like FB Messenger:

1) Conversational Streams and UI Alignment

Slack bots (especially in direct-messaging, one-on-one dialog flows) force the bot and the user to both be left-justified in the messaging UI stream. This goes against the UI norms found in the majority of messaging applications and related best practices. Typically in a streaming messaging flow, your conversational stream (you being the person interacting with the bot) is on the right of the screen and the party you are talking to (in this case the bot) is on the left side of the screen (or vice versa).

This is not supported in Slack, and it makes a number of things awkward and cluttered in a bot-to-human dialog, especially when it is one-on-one (as opposed to a Slack group channel). In Slack the entire conversational interaction is left-justified, which can make the UI look cluttered when there are visually rich elements and things like "Quick Replies" in the back-and-forth stream.

I hope that Slack will allow for aligning the bot vs the user on different sides of the messaging stream, something more similar to how FB Messenger works. This will allow for a more natural conversational interaction.

2) Horizontal Scrolling Carousel UI Components

Slack (mobile and desktop/web) does not provide any kind of horizontal card or horizontal scrolling carousel. While some might consider allowing horizontal scrolling of cards bad design, it is often necessary to minimize the vertical area needed to display information in rich messaging interactions. FB Messenger allows for a limited horizontal scrolling carousel that I find very useful when building bots. Hopefully Slack will incorporate this. Slack already supports rich "attachments", so it would be a natural fit to allow some limited level of horizontal scrolling.

3) WebView Integration

Slack does not have explicit support for messaging buttons that open a webview UI. Sometimes a webview is needed to show rich web content (again, this kind of feature should not be abused). FB Messenger has this ability and allows for controlling how the webview window is opened and closed. This can be mimicked in Slack by embedding links in the "field" elements, for example, but that is a bit of a hack.

4) Quick Reply Buttons

One particularly nice feature I got accustomed to in Facebook Messenger is "Quick Reply". This allows the bot to display "Quick Reply" buttons that are shortcuts for commands the user would normally have to type.

There is a way to mimic quick replies in Slack, but again it is a bit of a hack. Check this open source Node/Slack project for an example of how this works with Slack. Quick replies are a real necessity in a rich messaging interaction. Again, I hope that Slack adds this feature natively instead of making bot frameworks jump through hoops to emulate it.
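For reference, the emulation looks roughly like this: you attach interactive buttons to the message using Slack's attachments with an "actions" array. The field names below follow Slack's interactive-message format as I understand it; double-check the current API docs before relying on them.

```python
import json

def quick_replies(prompt, options, callback_id="quick_reply"):
    """Build a Slack message payload offering one button per option."""
    return {
        "text": prompt,
        "attachments": [{
            "fallback": prompt,           # plain-text fallback for old clients
            "callback_id": callback_id,   # routes the button click back to you
            "actions": [
                {"name": "reply", "text": label, "type": "button", "value": label}
                for label in options
            ],
        }],
    }

payload = quick_replies("What would you like to do next?",
                        ["Approve", "Reject", "Skip"])
print(json.dumps(payload, indent=2))
```

It works, but compared to Messenger's one-liner quick replies it is extra plumbing the platform could provide natively.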

Hopefully the Slack product team will address these issues as Slack is by far the best team and enterprise collaboration/messaging platform on the market today.

FB Messenger might have some superior bot-to-human interaction and UX capabilities, but it inherently lacks the team collaboration functionality and the many third-party integrations that Slack has to offer.

I do believe FB Workplace will close the gap over time, and in many ways it has advantages over Slack in terms of out-of-the-box social collaboration functionality. Slack is a bit of a geeky technical tool when it comes to social collaboration and thus not as intuitive to use.

I expect FB Workplace and Slack to evolve as head-to-head competitors and battle for the hearts and minds of developers, much like how Netscape battled Microsoft's Internet Explorer for web domination. For enterprise owners and end users, intelligent AI-endowed virtual assistants and bots will usher in a new era of innovation not seen since the dot-com days. The battle has moved from the mobile app store to the AI app store, where natural language understanding and deep learning are the killer technologies in the arsenal of AI-sophisticated developers.

Monday, December 26, 2016

The Death of Visual Analytics and the Dawn of Conversational BI

In the last several years we have seen the emergence of a new breed of business intelligence products that have made it possible to build highly interactive, visually expressive and rich dashboards and reporting experiences. Products like Tableau, Domo, and Looker, to name a few, are replacing established BI heavyweights with a focus on self-service and rich visualizations.

What is driving this trend? Well, anyone not living under a rock for the last ten years will tell you that the explosion of data on the internet, coupled with advancements in Big Data related technology, has made storing and accessing data easier than ever before. But this alone is not the whole story.

Self Service BI is Good but not Good Enough

Products like Tableau have come onto the scene to lower the barrier for connecting to internet-accessible data sources, as well as to traditional sources locked up in relational databases and in the billions of Excel spreadsheets sitting around the enterprise world. Driven by this, Tableau, for one, has been successful for three primary reasons:
  1. Provides many out of the box data source connectors with an easy to use interface - connect to just about any data source.
  2. Self service analytics without some of the heavy lifting - you don't need an army of data and tech experts to model your data and meta-data.
  3. Highly compelling and visually rich analytics features - the visualizations you can create with Tableau are stunning - not always easy to do, but much more achievable than ever before.
So this is all great, but what does this have to do with the death of visual analytics? I seem to be saying that richer BI visualization is blossoming, inspired by tools like Tableau. Well, I will argue that item number three listed above is an evolutionary dead end and that we are going to see a gradual trend away from visually rich analytics.

There is such a thing as too much of a good thing. More visualization does not mean you are solving business problems more effectively, answering questions faster, finding root causes (answering "why" questions), or getting better predictions and trends. In fact, too much visualization might be overwhelming users.

A Stroll Through BI History

Let's take a quick ride back in time before we look forward. Human civilization has been evolving for thousands of years, and our way out of the stone age was guided by the development of human language and communication. While it is true that a picture can say a thousand words, the spoken or written word, on the other hand, can express all of human existence in a short phrase, e.g. "to be or not to be" or "I love you". Human expression through words is powerful - more powerful than any picture.

My point is that human communication is the most powerful expression and exchange of information. It is a fact that visualization is a powerful tool, but it pales in the presence of the written or spoken word. You can probably guess where I am going now.

Computers and computer to human interfaces have evolved over the past sixty or so years on a twisted evolutionary path. We started with simple command line tools and interfaces (mainframe), where we issued simple grunting commands and got back simple grunted responses from our computers. We then saw this lead to the evolution of rich graphical computer windows, icons and the mouse (point-drag-click). While this helped advance our interface and interaction with the computer and with extracting data from within these artificial devices, this path of human to computer interaction is effectively an evolutionary dead end. It pales in comparison with what is coming next.

Evolving Toward AI Conversations - More Than Pictures

Products like Tableau, Looker and others will need to evolve in the coming years or be left in the dustbin of technology. While we have seen amazing advancement in rich and interactive visualizations of data, I argue this is the wrong path and effectively an evolutionary dead end. How many times have you looked at Tableau dashboards (or other BI visualizations) and seen beautiful and rich colors, shapes and graphics, only to be overwhelmed by the information? What does this information mean, what does it tell me, what questions and answers are buried in this beautiful and rich visualization?
(Image: Tableau dashboard, "Endangered Safari Animals")
What if instead of being bombarded by visualization alone, you could converse with the data - converse with the machine? Having rich visualizations can be fantastic, but I would want to ask the machine to answer questions about the visualization - make predictions, or tell me "why" this occurred - point me at the root cause. We are moving to a new dawn where machine learning and AI will help us make sense of the information around us that is currently locked up and visualized by computers. And this requires a new way (back to the future) for humans to interact with BI.

While computers started out as simple command line beasts, our current evolution toward more and more visualization is an evolutionary dead end. We will soon be moving toward a voice and messaging first world - where visualizations will augment our experience of information and are a tool for us to engage in conversation with our AI powered BI applications and virtual assistants. Chatbot BI virtual assistants are on the horizon.

More Than Just Looking Pretty - Answering Questions

You can see the beginning of this already. Tableau recently announced they will be releasing, in 2017, a new NLP interface to their platform - competitors will follow - and this is only the beginning. We will one day be able to ask questions of our BI in natural human language. The AI-powered analytics revolution is coming. Conversational interfaces are a game changer for BI. Analytics as a conversation will no longer be the stuff of movies and sci-fi.
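To make "analytics as a conversation" concrete, here is a deliberately tiny sketch that maps a natural-language question to an aggregation over tabular data. The keyword matching is a stand-in for a real NLP intent/entity pipeline, and the column names are assumptions for the example.

```python
def answer(question, rows):
    """Answer a (very narrow) BI question over rows of dicts.

    rows: list of dicts like {"region": "West", "sales": 100}.
    """
    q = question.lower()
    if "by region" in q and "sales" in q:
        # Toy "group by region, sum sales" aggregation.
        totals = {}
        for row in rows:
            totals[row["region"]] = totals.get(row["region"], 0) + row["sales"]
        return totals
    return "Sorry, I can't answer that yet."
```

A real conversational BI layer would parse measures, dimensions and filters from the question and generate queries, but the shape of the interaction - question in, answer out - is the same.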

Driven by the advancement in AI and machine learning, and with the massive surge in adoption of virtual assistants, chatbots and messaging/voice applications, the future will be here sooner than we think.

Tuesday, October 25, 2016

Goodbye Apps and Hello Bots

The shift in the market is undeniable. Bots are beginning to challenge the established mobile app store ecosystem. There is plenty of evidence that mobile app adoption has plateaued and that the average user has lost their excitement for downloading and experimenting with new apps. There are more than 2 million apps in the Apple app store now! Ask the average mobile developer - it is almost impossible to get your app noticed or discovered in such a crowded space. Apps will always be with us, much like desktop applications and the company website, but there is a sea change.

Disruption and the New Players
It is becoming clear that jumping from one mobile app to another is not a great experience for most users (especially enterprise users), and this is giving messaging apps like Slack and Facebook Messenger the opportunity to become the new app/bot marketplace. GUI-less bots are easier for users to transition to and from, and they make it seamless to switch between bots and more natural to interact with an application service using human-like conversation (something people are already doing in droves on messaging apps). These bots are basically mini-apps with conversational interfaces. Slack (for the enterprise) and FB Messenger (for consumers) are both becoming the new application playground, and the promise of an AI-enabled world is lurking within them, ready to provide a user experience that traditional GUI apps are not capable of.

Microsoft is chasing Slack (using Skype) to establish itself in the enterprise team messaging market and in this new emerging bot marketplace. For Microsoft, this is obviously an opportunity to disrupt the mobile app market (where they have lost) and establish an early beachhead with bots, AI and enterprise team communication. Microsoft has clearly been leading the charge with products like Cortana, LUIS and their Bot Framework. All the other big players are in the bot and AI game as well, and the race is definitely on for who can deliver the best bot solution for developers. There is a new land grab in the making between the big tech giants, developers and startups.

How Do I Deliver My Bots?
I describe all this because, to deliver conversational applications (aka bots) to end users, developers need a platform and a bot marketplace. Messaging apps will be the vehicle that supplants the traditional app store ecosystem, because building your own custom bot-infused mobile app will not be the way to go for most developers in the future. Building a custom mobile app for your bot might still make sense where an app already has an established user base - like a banking app - but for the average bot developer, messaging apps like Slack will be the delivery platform.

Messaging apps like Slack also offer a lot of out-of-the-box backend integration to help deal with single sign-on, identity management, permissions, roles and executing custom business logic (via webhooks). Apps like Slack provide much of the platform plumbing for this backend integration that your bots will need, and enterprises are already adopting team messaging apps like Slack. This all lowers the barriers for connecting your bot to a company's cloud and back-office systems, in order to get access to the necessary data and enterprise systems.

I think the future model for developers will be to deliver their bots and conversational AI services through tools like Slack and other popular "platform messaging apps" such as Cisco Spark, HipChat, Flowdock, FB Messenger, Skype, Kik, and others. All these messaging-centric platform apps are already spreading fast through the corporate and consumer worlds. Developers will leverage these messaging platforms to deliver their AI services in the form of bots and conversational user interfaces. Mobile app stores will always be with us, but the game is changing. The new AI marketplace is happening now - get your bots ready!

Thursday, October 20, 2016

Bots, AI and the Future of Augmented UX Design

We hear a lot these days about technologies such as futuristic-looking VR goggles and mobile apps with augmented reality that enhance our interactions with the physical world, using a computer-generated overlay that assists in our interpretation of the world around us. As computer users, we have become accustomed to rich visual interfaces as desktop, web and mobile app technologies have matured. However, the next leap forward in human-to-computer interaction will not be more visual effects but in fact less, and we are seeing the beginnings of this shift in what we refer to today as "bots". This is only the beginning of a seismic shift in how we as end users interact with applications.

Now, what if we could have the same augmented-reality type of experience applied to the countless GUI applications we all deal with on a daily basis, both at work and at home, from desktop to mobile? What do I mean? Computer applications are already computer generated, so why would they need an augmented reality? Yes, it is true that computer and mobile applications already live in the virtual real estate of the computer (or mobile device), so why would we need any augmented assistance while using them? If you reflect carefully on what is happening with bots and AI-infused applications in general, we are seeing the creation of a new human-to-computer mode of interaction that can assist us in how we interact with the virtual world of the computer application. The dumb and boring old computer or mobile application screen is about to get a big dose of intelligence! Having an intelligent conversation with your application (not just mouse clicks and keyboard taps) will become the norm and is not just the stuff of science fiction. Note, this bot/AI augmented application does not necessarily have to converse by voice, have a personality, or hold a deep philosophical discussion with us (maybe some day), but it will be able to assist us in our current world of applications beyond just the visual windows, buttons and menu options we have today.

We have become accustomed to interacting with our computers using an already mature human-to-computer model of clicking on buttons and visualizing our experience through drop-down lists, dialog boxes and other widgets. But what if we could augment our interaction with a consumer application (e.g. a banking app) or an enterprise application (e.g. a supply chain application) with an intelligent chatbot that could aid us in working with the many knobs, controls and actions available on the application screen? This bot assistant could remember what we have done in the app in the past and guide us through taking actions using a combination of chat/message exchanges sprinkled with intelligently timed suggested actions. This could in fact lead to a situation where we no longer need the full-blown array of buttons and menus that bloat the apps we have today. We could have a conversation with the application with the help of an intelligent and conversational chatbot assistant.
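To make the idea concrete, here is a minimal sketch of an in-app assistant that remembers past user actions and offers a suggestion based on them. This is a toy illustration only; the `AppAssistant` class and its behavior are hypothetical, and a real assistant would use far richer context and learning than a frequency count.

```python
from collections import Counter

class AppAssistant:
    """Toy sketch of an in-app chat assistant that remembers what the
    user has done before and proactively suggests a likely next action."""

    def __init__(self):
        self.history = Counter()  # past actions and how often they occurred

    def record(self, action):
        """Remember that the user performed an action in the app."""
        self.history[action] += 1

    def suggest(self):
        """Offer the user's most frequent past action as a suggestion."""
        if not self.history:
            return "What would you like to do?"
        action, _ = self.history.most_common(1)[0]
        return f"Would you like to {action} again?"

assistant = AppAssistant()
assistant.record("approve pending invoices")
assistant.record("approve pending invoices")
assistant.record("export monthly report")
print(assistant.suggest())  # suggests the most frequent action
```

The point is not the (trivially simple) suggestion logic, but the shape of the interaction: the assistant sits alongside the GUI, accumulates context, and replaces menu hunting with a timely prompt.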

This is where the world is fast moving. With the ubiquity of mobile communication and advancements in machine learning, AI and big data, the scene is now set for every application we have become accustomed to using to have a chatbot assistant that can aid us in our interaction with the application itself. No more sifting through dark-ages-style help documents. Think of it as online help on steroids, and this is just the beginning.

Any enterprise or consumer application team not starting to think about how they can replace their outdated online help docs and bloated UIs with more efficient, engaging and intelligent interactive chatbot assistance will be left in the dustbin of technology history. Don't worry, you have a few agile sprints before this happens :)

This transformation is not to be taken lightly, though. It will require a significant investment in engineering and technology, and a big leap in how we design user experiences for end users: expanding the visual application metaphors we have grown accustomed to with new intelligent chat assistants that can guide us through navigating information and assist us with the actions that can be taken within an application.

This technology leap will require a big shift in thinking from product owners, UX designers, information designers and engineers. It will require everyone in the product development ecosystem to work together, using enhanced business and engineering processes that put this new augmented UX design philosophy in the forefront while, on the engineering side, leveraging fast-maturing technologies in NLP, AI and machine learning to enable captivating and predictive conversational engagements between humans, their devices and their applications.

The applications of the future, whether on your desktop or on your mobile device, will begin to manifest augmented UX capabilities in the coming years. Be prepared for this new world whether you are a developer, a UX designer or an end user of these applications. Conversational interfaces are coming to an application near you to augment and enrich your application user experience!

Analytics as a Conversation

The pendulum is swinging in the business intelligence and analytics world. The ongoing technology evolution, driven in part by the adoption of Big Data, machine learning and other advancements in cloud computing, has made the storing, modeling and analyzing of data at huge volumes and velocities possible. The tools and IT skills needed to turn this data into rich visual information are more accessible than ever before.

Products like Tableau, Splunk, Qlik and Birst, among others, have brought rich visualization and actionable-minded analytics to the masses (actionable analytics are still not that common :). It is now easier than ever to build rich visualizations, reports and dashboards. Building BI solutions to tackle all the data percolating around us, across social networks, IoT and within the enterprise, and to turn it into compelling visual user experiences, is now within reach of the IT masses.

But there is trouble brewing on the horizon. Is there such a thing as too much data? Too much information? Too much visualization? I have built my share of BI, and I have seen many amazing and compelling visualizations and dashboards built on powerful solutions like Tableau and many homegrown SaaS BI platforms. But I think it is time to step back from the forest of dashboards and look at how humans effectively interact with information.

While we rely heavily on our visual sense, even the most well-intentioned and minimalistic BI dashboard (and its supporting drill-down reports) might not always be the best way to get to the information you want or need. Humans have another ability for consuming information: the conversation (question and answer).

There are many technologies now converging that make it possible for us to evolve our BI stack beyond purely visualization-based analytics. Analytics-as-a-Conversation (A3C) is, in my mind, the next frontier for BI. It does not necessarily replace today's rich visualization-based BI, but augments it.

What is A3C? Well, in movie terms, it is sort of the Matrix. It is about having a conversation with your BI and getting at what you need (the "what") through normal human-like conversation (think texting, hashtags, tweets and even emojis). This conversational form of BI is a much more natural way of interacting with complex information and can more naturally lead to asking not just the "what" questions but the "why" questions of your BI Matrix. This form of information interrogation also lends itself to setting a clearer context for the information exchange as the BI conversation progresses from one question-answer to the next. For example, perhaps you ask your A3C system the value of a particular KPI, or which KPI is furthest off its norm this quarter. That can then naturally lead to questions such as "why is this KPI higher this quarter?"

Obviously we are not Neo and we are not talking to the Matrix, so the system has to be taught (or programmed to learn) how to converse with a human-like grammar, has to be programmed to extract what it needs from the grammar/questions using NLP, and then has to translate that into queries against the target data and metadata systems. There would have to be bounds on the grammar and enough knowledge of the system's metadata to compose proper answers. No small engineering effort, to say the least, but given where we are today with AI, bots, machine learning, NLP and general computing stacks, the technology is there to accomplish this.
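The pipeline just described, extracting intent from a question, carrying context from one question to the next, and translating the intent into a query against metadata, can be sketched in a few lines. Everything here is hypothetical and deliberately naive (a keyword lookup standing in for real NLP, a hardcoded metadata dictionary), but it shows the shape of the translation step:

```python
import re

# Hypothetical metadata describing the KPIs the "BI Matrix" knows about.
KPI_METADATA = {
    "revenue": {"table": "sales", "column": "amount"},
    "churn": {"table": "customers", "column": "churned"},
}

class A3CSession:
    """Toy sketch: extract a KPI and time frame from a question, carry
    conversational context, and translate the intent into a query."""

    def __init__(self):
        self.last_kpi = None  # context carried between questions

    def parse(self, question):
        """Naive NLP stand-in: find a known KPI name and a time frame."""
        q = question.lower()
        kpi = next((name for name in KPI_METADATA if name in q), None)
        if kpi is None:
            kpi = self.last_kpi  # "why is it higher?" reuses the prior KPI
        self.last_kpi = kpi
        match = re.search(r"this (quarter|month|year)", q)
        return kpi, (match.group(1) if match else "quarter")

    def to_query(self, kpi, period):
        """Translate the extracted intent into a SQL-like query string."""
        meta = KPI_METADATA[kpi]
        return (f"SELECT SUM({meta['column']}) FROM {meta['table']} "
                f"WHERE period = '{period}'")

session = A3CSession()
print(session.to_query(*session.parse("What is our revenue this quarter?")))
print(session.to_query(*session.parse("Why is it higher this quarter?")))
```

Note how the second, follow-up question never names the KPI; the session context resolves "it" to revenue. That carried context is exactly what makes the question-answer chain feel like a conversation rather than a series of isolated queries.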

Why now? Because the technologies needed to construct the BI Matrix I am describing are largely here, and the data volumes are now, in my mind, overwhelming even for the best BI visualizations. With a bit of creativity (and sweat), and with current advancements in machine learning, AI and general computing power, it is possible today to begin building such intelligent conversational analytics systems and user experiences. Don't forget, this is about changing how the user "experiences" data.

It is not just about data volumes and technology capabilities; human interaction has itself evolved over the past decade. With the recent explosion of mobile and social communication, humans are using texting and short messages more than ever, with no sign of ebbing. In fact, texting is quickly becoming the dominant form of communication and the main form of information exchange across the globe and across all demographics.

How is this better than the visualization-based BI we have today? I would say it is not necessarily a replacement but a complement, and it can lead to BI answering "what" and "why" questions that the original BI developer/modeler could not anticipate out of the box. And as artificial intelligence and machine learning systems continue to evolve and improve, the potential is virtually limitless, no longer bounded by what can be rendered on a 2D display or reached with a click of the mouse.

The revenge of the CLI (the command line interface) is upon us :) But don't underestimate the conversational CLI; it will prove to be orders of magnitude more powerful than any visualization a human can conjure up. The CLI is coming back, but it will be smarter, more interactive and have a bit of a human personality. It will not be called a CLI anymore (that is for the techies); it will be called a CUI/CUX (Conversational User Interface/Experience), and it will be embedded in the fabric of our mobile and desktop apps of the future.

Stay tuned....Analytics-as-a-Conversation is coming and we will all be talking about it (or talking with it).

What's Next? Conversational Enterprise Applications

There is a lot of chatter these days (excuse the pun) around AI, machine learning and chatbots, and how this technology stack can be used to engage with users at a human-like level to exchange information and automate tasks. The elusive goal of an intelligent AI machine that can be indistinguishable from a human and help us with day-to-day tasks has been with us since the Turing test and has been embedded in our psyche by countless sci-fi movies.

Today, that elusive goal is closer thanks to the many advancements in computing and communication technology. We are starting to see real applications of such technology in tools like Slack, where chatbots sit in the background of channels/rooms, ready to respond to natural-language chat and help automate DevOps tasks or monitor and manage IoT infrastructure, among many other uses. We also see it in more casual consumer applications like Siri, Cortana and Alexa.

Where this is all headed is exciting for both consumer and enterprise applications. However, a lot of the current focus on how and where conversational interfaces can be applied is still stuck in the past. In my opinion, too much attention is given, for example, to a chatbot's personality and whether the chatbot behaves with true human-like mannerisms. I think this distracts from the actual transformation that is happening and the opportunities that lie ahead for where and how conversational interfaces can transform business applications. A conversational interface does not necessarily need a human personality to be effective; keep that in mind when you go down the path of building this new form of user experience into your applications.

If we look back in time, we started with "dumb" computer command line interfaces (i.e. the green-screen CLI), then through the 80s, 90s and the early part of this century we went through a steady evolution toward a more visually rich human-to-computer UX (think desktop apps, web x.0 and mobile). Interestingly enough, this has brought us full circle, back to the command line interface (CLI). But this new CLI is "intelligent" and has the potential to take us to the promised land of conversational human-to-computer interaction. I won't get into the theory of why the intelligent CLI (aka the conversational interface) has reemerged and why it will prove more effective than our current bloated visual UX application world. And remember that the conversational interface can also be voice driven, but voice-to-text is more or less an added bonus and part of the longer technology maturation in this space.
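The difference between the old CLI and the intelligent one can be illustrated with a toy sketch: instead of demanding exact command syntax, the new CLI matches free-form utterances against intents. The command registry and handlers below are entirely hypothetical, and a real system would use an NLP intent model rather than keyword sets.

```python
import re

# Hypothetical intent registry for an "intelligent CLI": trigger keywords
# mapped to handlers, standing in for a trained NLP intent classifier.
COMMANDS = {
    ("restart", "server"): lambda: "Restarting the app server...",
    ("show", "logs"): lambda: "Fetching the latest logs...",
}

def interpret(utterance):
    """Match a free-form utterance against keyword sets, instead of
    requiring the exact syntax a classic CLI would demand."""
    words = set(re.findall(r"\w+", utterance.lower()))
    for keywords, handler in COMMANDS.items():
        if words.issuperset(keywords):
            return handler()
    return "Sorry, I didn't understand that."

print(interpret("please restart the server"))
print(interpret("can you show me the logs?"))
```

The contrast with the green-screen CLI is the point: "please restart the server" and "restart server now" both land on the same intent, where a 1980s shell would reject anything but the one blessed invocation.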

What does the future hold? I propose that this next-generation intelligent CLI should augment every business application in the coming years. Every enterprise should take a hard look at its current UI applications (business- and consumer-facing, at every level) and make it a high priority to embed AI-chatbot-like intelligent CLIs (with or without a personality :) into every user experience, business function and application persona they have. Whether you are building an ERP application for an HR manager or a sales executive, an IoT monitoring platform or an analytics dashboard, every one of these applications should have an intelligent conversational interface to augment the visual interface. By 2020, any enterprise not beginning to bring such investments in conversational interfaces to market will be left in the dark ages of the visual-only UX world.