December 11, 2015


Service Design for User Experience: An Illustration

In my previous blog, I highlighted some of the key challenges of a 21st Century Enterprise. Among them, the one that clearly stands out is service design for user experience. In this blog, I wish to elaborate on end-user-experience-based service design, using the example of Apple’s 3D Touch.

Today’s most-discussed device, the iPhone 6s, has introduced an entirely new way for users to interact with their phones. For the first time, the iPhone can sense how much pressure we apply to the screen. In addition to the already familiar multi-touch gestures such as tap, swipe, and pinch, 3D Touch introduces ‘peek’ and ‘pop’. This adds a new dimension to the functional experience of using an iPhone, and it also makes the iPhone respond with subtle taps. This means that not only will we be able to see what a press can do, we will also be able to feel it. 3D Touch is one of the most innovative features to appear on a mobile touchscreen. It is not just a new feature; it is a fundamental change in the way we physically interact with our devices.
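For developers, this pressure sensing surfaces through UIKit: every UITouch carries a force value along with a maximumPossibleForce upper bound. The snippet below is a minimal sketch of reading that value in a custom view; the class name and the fading effect are illustrative choices of mine, not Apple’s implementation.

```swift
import UIKit

// A minimal sketch of reading 3D Touch pressure via UIKit's UITouch API.
// The class name and the fading behaviour are illustrative, not Apple's code.
class PressureAwareView: UIView {

    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let touch = touches.first else { return }

        // Only read force on hardware that actually supports 3D Touch.
        if traitCollection.forceTouchCapability == .available {
            // `force` is the current pressure; `maximumPossibleForce` is the
            // hardware maximum, so the ratio gives a normalized 0...1 value.
            let normalizedForce = touch.force / touch.maximumPossibleForce

            // Example response: fade the view as the press deepens.
            alpha = 1.0 - 0.5 * normalizedForce
        }
    }
}
```

Checking forceTouchCapability first keeps the same code graceful on devices without the pressure-sensitive screen.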

3D Touch starts with the idea of estimating the force applied to a thin device. From a traditional designer’s perspective, it would just be a matter of designing something that detects any force applied to the device. Had that been the case, the designer would have failed to deliver the overall experience, because user experience goes beyond the force a user applies to the device. To deliver a positive user experience, the designer needs to design a device that senses the intent of the user rather than one that merely detects or estimates the force being applied. That essentially requires the ability to read the user’s mind. It also means considering the user’s environment and situation. For instance, the user might be using a thumb or a finger to access the device’s features, might be emotional at the moment, might be walking, or might just be relaxing on a couch. None of these scenarios affects the intent; however, they all affect what a sensor inside the thin iPhone 6 sees.

Undeniably, there are a number of design-only challenges to deal with. That said, it wouldn’t be appropriate to sideline the importance of capturing user intent. This adds complexity to the component design, which in turn leads to further intent-related challenges. Here, a sensor cannot be a simple transducer. It has to be combined with an accelerometer to nullify the effect of gravity; nullify here means adding or subtracting force, depending on the direction of motion. Similarly, detecting whether the user is touching the screen with a thumb or with another finger is important for interpreting the force. This led Apple to fuse the accelerometer with the force sensor and to add algorithms that detect the nature of the interaction. This was one of those things that, if not gotten right, would make nothing built upon it work. Apple did get it right, and hence stands out among its competitors and continues to maintain its lead in innovation.
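To make the fusion idea concrete, here is a purely illustrative Swift sketch. It assumes a raw force reading is available alongside CoreMotion’s accelerometer data; the ForceCompensator class, the rawForce parameter, and the calibration constant are my own hypothetical names, since Apple’s actual sensor fusion happens in hardware and firmware and is not exposed to developers.

```swift
import CoreMotion

// Illustrative sketch only: correcting a hypothetical raw force estimate with
// accelerometer data so device motion and gravity are not mistaken for pressure.
final class ForceCompensator {
    private let motionManager = CMMotionManager()

    // Hypothetical calibration gain; the real calibration is Apple's, not public.
    private let assumedCalibrationGain = 0.05

    /// Acceleration along the screen's normal (z) axis either adds to or
    /// subtracts from what the transducer reports, depending on the direction
    /// of motion, so we remove that component before interpreting the value
    /// as finger pressure.
    func compensated(rawForce: Double, motion: CMDeviceMotion) -> Double {
        let screenNormalAcceleration = motion.userAcceleration.z + motion.gravity.z
        let motionInducedForce = screenNormalAcceleration * assumedCalibrationGain
        return rawForce - motionInducedForce
    }

    func startUpdates() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 100.0
        motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
            guard let motion = motion else { return }
            // In a real pipeline, the raw force would come from the pressure
            // sensor at the same timestamp; 0.3 here is just a stand-in value.
            let corrected = self.compensated(rawForce: 0.3, motion: motion)
            print("Corrected force estimate: \(corrected)")
        }
    }
}
```

The point of the sketch is simply the design choice the paragraph describes: the force reading only becomes meaningful once the accelerometer’s view of the device’s motion is folded into it.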

Even though I took the example of a ‘product’, it sets the context for user experience with services. When we think of user experience with a service, we realize that it gets even more complex. Service designers must defy a fundamental rule of psychology, that we see things not as they are but as we are, and learn to see things as the user sees them. Although not easy, design principles do exist for designing a service for a consistent user experience. These principles are non-technical. Unfortunately, enterprise IT is largely infected by the tendency to solve every problem with technology. If this tendency continues, it will multiply the challenge. I see the external business world now forcing enterprise IT to think differently, and that is a positive sign.