Perspectives

From GUI to NUI/TUI — The Next Step

2017-10-17


Human interaction with machines is still largely mediated by graphical user interfaces (GUIs): humans supply input through physical devices such as keyboards, mice and touchscreens, and the machine responds with output on a screen. Interfaces are now moving away from these physical input devices toward more intuitive ways of interacting with machines, known as natural user interfaces (NUIs). An NUI is gesture-based and can be configured to respond to eye movement, hand gestures, facial expressions and virtually any other input a user naturally produces. Think of fingerprint sensors on smartphones and voice recognition systems such as Apple’s Siri.

Battling for Customer Attention

Every business is racing to become digital and competing for customer attention on mobile devices and other media. As a result, customer attention is becoming more expensive by the day.

In a Google survey, 38% of the respondents said they're likely to download an app when it's required to complete a purchase. Once they've completed that purchase, however, half will uninstall the app and move on. Also, as many as 25% of app users open an app once and never return, the Google survey says.

This growing disenchantment with apps could also be the result of users’ increasingly limited attention span and GUI fatigue.

The Appeal of NUI

Enter the NUI, which isn’t visible but is embedded into everyday tasks, and thus does not depend on a device screen, an input device, or the user's ability to learn it and pay attention to it.

Such an interface is attuned to the user’s intent by design: the interface is built only after first understanding why a user would interact with the system in the first place. For example, a user walking into a branch of a traditional bank could be there for any of a number of transactions, such as a cash withdrawal, loan servicing or a cash deposit. If the bank’s user interface isn’t optimized for these preferences, the user is far less likely to get what they expect in terms of both service and experience.
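
As a rough, hypothetical sketch of this intent-first approach (the intent labels, confidence threshold and flows below are invented for illustration, not drawn from any actual bank system), the interface predicts why the customer is there and routes directly to the matching flow:

```python
# Minimal sketch of an intent-first interface: the system predicts why the
# user is here (from voice, gestures, context, etc.) and routes straight to
# the matching flow instead of presenting one generic menu.
# Intent names, flows and the threshold are illustrative assumptions.

BANK_FLOWS = {
    "cash_withdrawal": "Start the withdrawal flow at the nearest teller or ATM.",
    "cash_deposit":    "Open the deposit flow and pre-fill the account number.",
    "loan_servicing":  "Route to a loan officer with the customer's file loaded.",
}

def route(intent: str, confidence: float, threshold: float = 0.7) -> str:
    """Dispatch to a flow when the predicted intent is confident enough;
    otherwise fall back to a human greeter."""
    if confidence >= threshold and intent in BANK_FLOWS:
        return BANK_FLOWS[intent]
    return "Hand off to a staff member to clarify what the customer needs."

print(route("cash_withdrawal", 0.92))  # -> withdrawal flow
print(route("mortgage_query", 0.40))   # -> human fallback
```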

One of the best examples of such an interface is Google's Project Soli. It uses a new sensing technology that deploys miniature radar to detect touchless gesture interactions. Users need only wave their hands in front of the device, using gestures drawn from everyday life.
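
Google has not published Soli's pipeline, but the general pattern of gesture recognition can be sketched as template matching over feature vectors that summarize the radar return. Everything below (the feature names, template values and gesture set) is an illustrative assumption, not Soli's actual method:

```python
import math

# Hypothetical gesture templates: each gesture is summarized by a small
# feature vector (mean range, mean radial velocity, spread). Real radar
# pipelines are far richer; this only shows the shape of a simple
# template-matching classifier.
TEMPLATES = {
    "swipe":  (0.30, 0.80, 0.10),
    "tap":    (0.10, 0.20, 0.05),
    "circle": (0.25, 0.50, 0.30),
}

def classify(features):
    """Return the gesture whose template is nearest in Euclidean distance."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(TEMPLATES, key=lambda g: dist(features, TEMPLATES[g]))

print(classify((0.28, 0.75, 0.12)))  # -> "swipe"
```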

Other NUI examples include:

  • EMOTIV Epoc+ is a wearable device built on a brain-computer interface (BCI), and thus a step toward the next generation of user interfaces. Its users no longer depend on hand-eye coordination. Companies such as National Geographic, Disney and Australia's Sunrise TV program are already investing in brain-computer interfaces.

  • Microsoft PixelSense is another example of technology turning a standard two-dimensional user interface into an advanced platform. It allows more than one person to simultaneously interact with a vision-based, multiple-touchpoint interface, combined with real-world objects such as the Surface Dial, to interact with machines. Business uses of this interface range from education to surgery. For example, the University of Limerick’s “Everyday Technology” project uses augmented everyday objects to assist learning: as users turn a page to reveal a marker, they see an image overlaid onto the page of the book. Exhibitions like this can create a unique museum environment in which a child’s experience is tailored to her interests, with exhibits and learning that deliver an enhanced and quite unforgettable experience.

  • Sensor network user interface (SNUI) devices such as Siftables, Google Glass and VR headsets are among the more popular interfaces used in business applications. Augmedix, a platform powered by human experts and software, uses Google Glass to free physicians from computer work and let them focus on what matters most: patient care. The company reports it has increased average physician productivity by 30%. Ubimax, a market leader in enterprise wearables and assisted-reality solutions, offers Ubimax Frontline, a fully integrated productivity suite that provides hands-free solutions for logistics, manufacturing, field service and remote support. Smart glasses guide the worker through a process with an intuitive interface, and seamlessly integrated assembly-step confirmation allows hands-free working. Workers who encounter a problem on the production line do not need to leave their workstation to consult the shift leader; they simply call from their smart glasses using the xAssist remote-support solution and get real-time help solving the issue.


Path to NUI

For businesses that want to transform how they interact with potential customers, the first step is to identify the key areas of the business where traditional interfaces are a bottleneck to growth. Typical sticking points include:

  • Services to existing clients: Scenarios in which delays caused by user-interface response time and users' limited attention span create dissatisfaction are best suited for new interface technology. For example, medical professionals typically view CT scans and MRIs on a flat-screen display or a sheet of paper. The Body VR anatomy viewer, by contrast, enables real-time, anatomically accurate VR simulations that visualize medical diagnoses, illustrate the impact of procedures and treatments, and support more educated decision-making. Patients can then use the application to view the labeled sections of their scan through a VR headset.

  • Areas where a better user interface could drive engagement, loyalty or revenue: Existing interface models, whether voice-based, GUI or in-person, can limit the product or service experience. In the automotive sector, for example, next-generation interface technology yields results such as Eyesight Technologies' solution, which tracks fingers and recognizes the driver’s gestures. Once installed, the solution understands the driver’s intent from those gestures and lets the driver control the automobile’s infotainment system. It can also gauge the driver’s alertness and thus help avoid accidents: the system tracks the driver’s eyelid movement, gaze, blink rate and head position using deep learning and other machine-learning tools (a simplified sketch of this kind of eyelid tracking appears after this list).

  • Opportunities to engage potential new clients: Businesses can wow users with an interface that does not have to beg for attention but instead commands it. For example, the MIT Media Lab has created a tangible user interface called SandScape that lets users project computer simulations onto sand to model landscapes. Users can model digital elevation, slope, curvature, shadows, water flow and views from particular angles.

    Operations where limited attention span and hand-eye coordination curb potential output are also ripe for next-generation UI. For example, Leap Motion’s technology lets users employ their bare hands as an input device. The Leap Motion controller maps hand movements onto a 3-D character or any other object in the virtual world; animators can then use relation constraints to connect the output values of these properties to the input plugs of other animation nodes in a scene.
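
The eyelid tracking described in the second bullet above can be approximated, in spirit, with the well-known eye-aspect-ratio (EAR) technique: the ratio of an eye's height to its width drops toward zero as the eye closes. The landmark source, threshold and frame count below are assumptions for illustration, not Eyesight Technologies' actual method:

```python
import math

# Simplified eye-aspect-ratio (EAR) drowsiness check. p1..p6 are six eye
# landmarks, typically produced by a facial-landmark detector (assumed
# here, not implemented). The 0.21 threshold and 15-frame run are
# illustrative values only.

def eye_aspect_ratio(p1, p2, p3, p4, p5, p6):
    """EAR = (|p2-p6| + |p3-p5|) / (2 * |p1-p4|); it falls toward 0 as the
    eye closes, so a sustained low EAR suggests drowsiness."""
    return (math.dist(p2, p6) + math.dist(p3, p5)) / (2.0 * math.dist(p1, p4))

def is_drowsy(ear_values, threshold=0.21, min_consecutive=15):
    """Flag drowsiness when EAR stays below threshold for enough frames."""
    run = 0
    for ear in ear_values:
        run = run + 1 if ear < threshold else 0
        if run >= min_consecutive:
            return True
    return False

print(is_drowsy([0.30] * 10 + [0.15] * 20))  # -> True (long closed-eye run)
print(is_drowsy([0.30] * 30))                # -> False (eyes stayed open)
```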

Afterword: EMOTIV Epoc+: EMOTIV uses mobile EEG technology for advanced brain monitoring and cognitive assessment. It offers three kinds of detection algorithms, all built on extensive scientific studies, to develop accurate machine-learning models that classify and grade the intensity of different mental states. With proper coverage and electrode configuration, it is possible to reconstruct a source model of all important brain regions and to observe their interplay.
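
EMOTIV's detection algorithms are proprietary, but a common building block for this kind of EEG classification is band power: the signal energy within a frequency band such as alpha (roughly 8 to 12 Hz). The sketch below estimates band power with an FFT on synthetic data; the sampling rate and band edges are illustrative assumptions:

```python
import numpy as np

# Hedged sketch: band-power features are a typical input to EEG classifiers
# like those described above. fs and the band edges are assumptions
# (EMOTIV headsets sample at around 128 Hz).

def band_power(signal, fs, low, high):
    """Estimate power in the [low, high] Hz band via a simple periodogram."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal)) ** 2 / len(signal)
    mask = (freqs >= low) & (freqs <= high)
    return psd[mask].sum()

fs = 128                       # assumed sampling rate, in Hz
t = np.arange(0, 4, 1.0 / fs)  # 4 seconds of synthetic data
signal = np.sin(2 * np.pi * 10 * t) + 0.5 * np.random.randn(t.size)

alpha = band_power(signal, fs, 8, 12)   # band often linked to relaxation
beta = band_power(signal, fs, 13, 30)   # band often linked to engagement
print(f"alpha/beta ratio: {alpha / beta:.2f}")
```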

To learn more about the digital workforce of the future, please read our white paper “People – Not Just Machines – Will Power Digital Innovation.”
