So you were smart enough to take the agile way of implementing software. You took your time with proper market research and found a market fit for your service: you experimented with a variety of ideas by prototyping them rigorously and synthesised the findings into a glorious product vision. You aligned your organisation around common goals and navigated the development roadmap cautiously by building MVPs (Minimum Viable Products) that passed all the quality gates increment by increment.
You were mindful of quality for a good reason: users have zero tolerance for poor performance - even if the app solves a real problem. A 2013 study shows that only 16% of users will give an app more than two attempts if it fails to work the first time. Conduct a few more studies and the reality might be even more ruthless.
But you went the extra mile and got things right. Congratulations, you are ready to Ship!
Before we proceed from development to production along the application lifecycle, let’s take a few steps back. First of all, let’s discuss market research and finding a market fit.
Assumption is the mother of all problems
There is a clever method for customer-centric product design called Lean UX (User eXperience). Lean UX follows the principles of the scientific method: create a hypothesis, test it and analyse the results. The lessons learned are used to form a new and improved hypothesis.
Lean UX design cycles are short, perhaps week-long iterations of rapid experimentation during which prototypes varying from low to high fidelity are built and their usage observed in the hands of real users from the target audience. Any design issue that does not add value to the experimentation is abandoned or postponed as out of scope. The prototypes are often seen only as a means of gaining valuable insight and are discarded at the end of the iteration.
Innovating something and proving that it appeals to a customer is not an exact science. To conduct a UX experiment you have to mix both qualitative and quantitative research methods with multiple simulations in order to reach valid conclusions. Thou shalt not rush to conclusions, even if you’re in a rush - you’ll only be losing time and money and, worst of all, learning nothing (also see Dilbert on Friday 9th of May, 1997: http://dilbert.com/strip/1997-05-09).
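To make the quantitative half of this concrete, here is a minimal sketch in Python - with entirely hypothetical numbers - of how a prototype experiment's conversion data could be checked with a standard two-proportion z-test before drawing any conclusions:

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """Two-proportion z-test: did prototype B convert better than A?"""
    p_a, p_b = success_a / n_a, success_b / n_b
    p_pool = (success_a + success_b) / (n_a + n_b)          # pooled rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical results: 48 of 200 users completed the task with
# prototype A, 74 of 200 with prototype B.
z = two_proportion_z(48, 200, 74, 200)
print(round(z, 2))  # → 2.82, above ~1.96, i.e. significant at the 5% level
```

A z-score below the significance threshold would mean the difference could easily be noise - exactly the case where rushing to a conclusion teaches you nothing.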
As the saying goes, assumption is the mother of all problems. Since hypotheses are made of assumptions, the saying has to be clarified a bit. If you have no means to conclude whether your assumptions were right or wrong, you are only guessing. And everything starts to go bad when we treat guesses as facts. One does not experiment without the data to validate the outcome.
So as a rule of thumb, when you make an assumption, figure out a way to test it.
Enter the world of data-driven design.
CRISP-DM (Cross-Industry Standard Process for Data Mining) is seen as the de facto industry standard process model for analytics. CRISP-DM has six major phases that are iterated over, ranging from understanding the business domain and its various data sources, through data preparation, to the deployment of the analytical model(s) created to make predictions about the unknown.
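The iterative shape of the process can be sketched as a simple loop - this is only an illustration of the phase ordering, not anything prescribed by the CRISP-DM specification itself:

```python
# The six CRISP-DM phases, walked through repeatedly until the model
# is judged good enough to deploy.
CRISP_DM_PHASES = [
    "business understanding",
    "data understanding",
    "data preparation",
    "modelling",
    "evaluation",
    "deployment",
]

def run_iteration(evaluation_passes):
    """One pass through the phases; `evaluation_passes` is a callback
    deciding whether the model may proceed from evaluation to deployment."""
    for phase in CRISP_DM_PHASES:
        if phase == "evaluation" and not evaluation_passes():
            # Findings feed a new iteration starting from the business domain.
            return "back to business understanding"
    return "deployed"

print(run_iteration(lambda: False))  # → back to business understanding
print(run_iteration(lambda: True))   # → deployed
```

The point of the sketch is the loop itself: deployment is reached only through repeated evaluation, never by skipping ahead.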
Advanced analytics and UX design make a perfect couple. Since customer experience is not solely a data-oriented mix of problems and solutions, the intuition of an exceptional designer should get in sync with the data. Just supplement the UX design cycles with continuously improving analytical models and you just might find a perfect market fit for your service.
Let’s get the Designer and Data scientist hitched. Why not?
Easier said than done, right? Right.
The reality is that designers rarely have enough data science know-how to properly prepare or evaluate any research data. Data scientists, on the other hand, do not speak design well enough to make sense of what the design is; besides, the designers keep changing stuff on a whim, so there’s no point in trying to map or model the underlying data anyway. Tech people are the worst: they don’t care whether the UX design is engaging or what the KPI metrics are - they just want to start coding and be done with it. Not to mention the corporate IT people, who think any and all change is bad.
To make matters even worse, nobody on the R&D shop floor understands what the business really wants, so nothing innovative or useful gets delivered together. This in turn feeds the chasm between tech and business people, leaving the business people with nothing to do but complain about the tech people’s way of overpromising and under-delivering.
People hang on to their silos and traditional ways of working because organisations let them. And it’s crippling for the outcome - every design decision gets lost in translation.
A cross-functional team to the rescue
The best products are never built in silos. There should never be a separate business development team, no separate analytics or design team, no separate dev and ops but a single dynamic, collaborative, cross-functional and self-organising product team.
A product team is a band of professionals that shares a vision driving everybody towards common goals. Insightful information needs to flow unhindered from business to design to development to production and back. The product team needs to have all the essential expertise and techniques for designing a product that delivers value beyond users’ basic expectations.
The product team alternates its focus between problem definition and solution discovery in a PDCA (Plan-Do-Check-Act) manner. Neither the design cycle’s length nor its sequence of phases is strict; the team continuously determines when it is time to re-evaluate the reasoning behind the designs and when it’s time to deploy a working prototype to the target market and start analysing the user data.
Embracing the customer journey
There are some proven methods, tools and techniques to help us get started. The one I find especially interesting, and which works both as a strategy tool and as a team collaboration tool, is the customer journey map.
A customer journey map tells a visualised story of the customer experience. It complements the Lean UX methodology by providing context for customer interaction observations. When everybody contributes to sketching the customer journey map, it serves as common ground for communication, providing qualitative and quantitative vantage points for the different experts in their group effort.
By analysing the customer journey map, designers are able to find the greater motivations behind the various customer interactions and touchpoints, helping them better discover the underlying pains and opportunities. The pains and opportunities serve as the moments of truth and the basis for the UX design experimentation and its test criteria.
Data scientists are able to use the customer journey map to clarify business needs by dismantling the journey into parts that are understandable from the point of view of analytics. An interesting approach, reflecting both the product development process described earlier and the CRISP-DM process, is to divide the hypothesis formulation into a problem context and a solution context, as Tuulikki Markkula and Antti Syväniemi from Houston Analytics suggest in their book Analytics Journey (T. Markkula, A. Syväniemi, 2017). First, the decision-making points and questions relevant to a product experiment are determined. Then the data sources and data content corresponding to the decision points are mapped. When the problem is understood at a sufficient level, it’s time to iterate over possible solutions.
Plan, Do, Check, Act
As we can see, analytics can easily be applied to follow the plan-do-check-act cycle, which is at the core of agile. The synchronisation of UX design and analytics happens when UX prototypes get launched to target users and the experiment data and its analytical models are linked back to the UX designs. The synchronisation results in an incremental and iterative product development process based on creativity, analytics and optimisation.
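The feedback loop just described can be sketched in a few lines of Python; the function names here (plan, do, check, act) are illustrative placeholders for real team activities, not an actual framework API:

```python
def pdca_cycle(plan, do, check, act, iterations=3):
    """Run plan-do-check-act: each iteration plans an experiment,
    launches the prototype, analyses the data and adjusts the design."""
    design = None
    for _ in range(iterations):
        hypothesis = plan(design)   # plan: form a testable hypothesis
        data = do(hypothesis)       # do: launch the prototype to target users
        findings = check(data)      # check: link analytics back to the design
        design = act(findings)      # act: improved design for the next cycle
    return design
```

Each pass produces a design informed by the previous pass's data, which is exactly the incremental, iterative shape the text describes.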
After successful synchronisation of the different crafts an enterprise can proceed to the actual utilisation of analytics. T. Markkula and A. Syväniemi distinguish traditional reporting as a tool to look back to the past, whereas advanced analytics gives you the power to both explain the present phenomena and to predict the future ones.
Wouldn’t it be nice if, instead of a generic market analysis report - or, as Markkula and Syväniemi depict it, a rear-view-mirror image - you could actually gain user insight from prototype experiments in real time? With UX design excellence accelerated by rapid prototyping techniques and advanced analytics, you can pivot your business ideas as many times as you like at a fraction of the cost of failing in the markets with a bad product.
If you go Live with a bad product you will end up with something terrible - an unhappy customer. The good news is that she has probably already turned to your competitor. The bad news is that you have probably also lost your business, since word gets around: lost sales, negative social media shares, low net promoter scores, accelerating churn rates, you name it.
So check your data before you go Live. And if you have no data, you have no business going out there.
Vesa Teikari is a Member of the Board, Partner, Business director and Agile Application Lifecycle Management Enthusiast in Houston Inc and Member of the Board in Houston Analytics.
Vesa has over fifteen years of experience in the software industry, working as a member of various software development teams in roles ranging from hands-on software developer and technical lead to project management. He has trained, coached and consulted multiple organisations in agile practices. Vesa’s expertise so far includes providing software consulting services for the media, telecom, payment and IoT industries.
Vesa’s professional interest lies in the success factors of customer-supplier collaborations in software-centric service development. To shed light on the subject, at the end of 2012 he published a book called “Ketterän kehityksen ostajan opas / The Buyer’s Guide for Agile” (http://opas.houston-inc.com), which depicts key findings on how to purchase complex tailored software effectively.