Rethinking User Interactions: Integrating NLP in UI

Context

Let’s begin with some historical perspective. In 1997, Amazon's One-Click feature revolutionized online shopping by allowing registered users to make purchases with just one click. The feature quickly became a model for checkout flows worldwide.

Fast forward to August 2022, when Perplexity AI emerged. Acting as a powerful AI chat tool, it provided precise answers and kept generating new suggestions based on each prior user interaction.

User Pain Points

For decades, the norm was to fetch information and update content using buttons and drop-downs, resulting in an experience reminiscent of filling out long, tedious tax forms.

Traditional UI
Traditional filtering UI reminiscent of tax forms.

Seeking Inspiration from Gaming

Imagine an approach similar to Pokémon’s onboarding in 1996, where players named their character and their rival before setting out. While these initial choices were purely cosmetic, they added personalization and immersion.

Pokemon Onboarding
Pokemon's immersive onboarding experience.

In 2015, games like "Until Dawn" emerged, emphasizing interactive storytelling driven by player choices. Through its "Butterfly Effect" system, those choices had lasting ramifications, determining the fate of the characters.

Until Dawn Gameplay
Until Dawn's Butterfly Effect System.
GTA 4 Story Choice Action
GTA 4 Story Choice Action.

Movies and TV shows also explored this interactivity. Netflix's 2018 interactive Black Mirror episode "Bandersnatch" was a prime example, allowing viewers to choose the narrative's direction.

Bandersnatch Interactive Episode
Netflix's interactive Black Mirror episode, "Bandersnatch".

A Shift Towards NLP in UI

New technologies have recently emerged, redefining how we communicate with machines. Advances in Natural Language Processing (NLP) promise human-like interaction, enhancing the entire user experience.

Future of NLP in UI
The exciting future of NLP in user interfaces.

Hypothesis

We’ve seen games and television attempt to bring a new level of interaction and immersion to their end users, but just like Satoshi Tajiri, we’ve been limited by the technology of our time. Until now. My hypothesis is that the global landscape of how users interact with computers and interfaces, how they create, consume, and request content, is changing drastically. If we want to be at the forefront of how users will interact with interfaces, we need to have the balls to completely rethink how these evolving expectations will change the requirements on our interfaces to stay relevant.

Application to Stay22

Alright, I’ve teased you enough. In layman’s terms: I propose adding a single floating button that, from the moment the product loads, prompts the user to interact with an NLP backend to help with their search. The button would act as a rapid “dynamic” filter: each interaction would change its content, populate new suggestions, and apply the changes to the map based on the user’s accumulated series of combined selections. The way Satoshi Tajiri intended. The purpose of this button is to bring the hyper-realistic human interaction and immersion that games and television have been trying to create for the last 30 years. Imagine taking the intelligent recommendations of Perplexity.ai, with its dynamically changing suggestions based on previous inputs, and applying that to our value delivery in a new interface.
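To make the idea a little more concrete, here is a minimal sketch of that interaction loop in TypeScript. Everything in it is hypothetical: the /api/nlp-filter endpoint, the Suggestion and MapFilter types, and the applyFiltersToMap callback are placeholder names, not an existing Stay22 API. The only point it illustrates is that every new selection is sent along with the full selection history, so the backend can refine both the map filters and the next round of suggestions.

```typescript
// Hypothetical sketch of the "dynamic filter" loop described above.
// The endpoint, types, and callback names are placeholders, not a real API.

interface Suggestion {
  id: string;
  label: string; // e.g. "Near the stadium", "Under $150/night"
}

interface MapFilter {
  key: string;   // e.g. "maxPrice"
  value: string; // e.g. "150"
}

interface NlpFilterResponse {
  suggestions: Suggestion[]; // next round of suggestions for the floating button
  filters: MapFilter[];      // filters to apply to the map
}

// Accumulated selections from every prior interaction, so the backend can
// refine its suggestions the way Perplexity refines follow-up questions.
const selectionHistory: Suggestion[] = [];

async function onSuggestionSelected(
  selected: Suggestion,
  applyFiltersToMap: (filters: MapFilter[]) => void,
): Promise<Suggestion[]> {
  selectionHistory.push(selected);

  // Send the full selection history so the NLP backend can interpret the
  // combined intent of the session rather than each click in isolation.
  const response = await fetch("/api/nlp-filter", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ selections: selectionHistory }),
  });
  const result: NlpFilterResponse = await response.json();

  // Update the map immediately, then hand back the next suggestions
  // for the floating button to display.
  applyFiltersToMap(result.filters);
  return result.suggestions;
}
```

Sending the accumulated history rather than just the latest click is the design choice that mirrors Perplexity’s behavior: the backend responds to the combined intent of the whole session, which is what lets each new set of suggestions feel like a continuation of the conversation.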

To be continued...