How it started

After learning and practicing UX Design on my own, I was finally able to secure my first internship at Rogers Communications for the summer of 2017. I joined the Digital Optimization team, which was in charge of gathering insights, running A/B tests, and recommending design changes for the Rogers and Fido online stores.

Unfortunately, my project is protected by a non-disclosure agreement, which means I cannot show any of my work.

Combining analytics and user research

The Digital Optimization team at Rogers had a unique mix: it was made up entirely of Analytics specialists and UX Researchers. On the one hand, the Analytics specialists gathered quantitative data and insights about different metrics, such as where users were dropping off when making an online purchase. They also ran A/B tests to evaluate how different design variations affected conversion rates. On the other hand, the UX Researchers conducted studies and interviews with users to gather qualitative data and gain a deeper understanding of user needs and expectations when purchasing a new phone. The Digital Optimization team was a perfect example of combining qualitative and quantitative data to improve the user experience.
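To make the A/B testing part concrete: a test on conversion rate is typically evaluated with a statistical significance test on the two variants. Here's a minimal sketch using a two-proportion z-test; the numbers are made up for illustration, and this isn't the team's actual tooling or data.

```python
# Hypothetical sketch of evaluating an A/B test on conversion rate
# with a two-proportion z-test. All numbers below are invented.
from statistics import NormalDist

def ab_test(conv_a, n_a, conv_b, n_b):
    """Return the z-statistic and two-sided p-value for the difference
    in conversion rates between variant A and variant B."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# e.g. 480 conversions out of 20,000 visitors vs. 560 out of 20,000
z, p = ab_test(480, 20_000, 560, 20_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 -> difference is significant
```

With 20,000 visitors per variant, even a 0.4-point lift in conversion clears p < 0.05; with smaller samples it wouldn't, which is why A/B tests generally need to run until enough traffic has accumulated.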

Data-driven + user-centred design

When I joined the team, I was the only UX Designer among the Analytics specialists and UX Researchers. My project for the summer was to redesign the end-to-end purchasing experience (from the homepage through checkout) for new customers who wanted to purchase a phone on the Fido online store. My approach to the redesign had to be both data-driven and user-centred: I had to connect the dots between the wealth of quantitative data we had from our analytics tools and the insights we had gained from our research with users.

Understanding user behaviour with analytics

I had multiple analytics tools at my disposal, which allowed me to dive deep into how users were behaving on the Fido online store. I could see what people were clicking on, scrolling to, or spending more time on. These tools were valuable because they helped me discover problems that weren't obvious. The value of analytics for designers is being able to observe what people do, as opposed to what they say. As humans, we often act in ways we're unaware of, so looking at data can reveal behaviours that we might not uncover through interviewing users or conducting a survey.

Making sense of analytics with user research

I was fortunate to be working alongside some very talented UX Researchers who had completed extensive studies and interviews with Canadians across all age groups. They uncovered the goals, expectations, and frustrations people had when purchasing a phone online instead of visiting a store. In addition to these findings, I had access to customer feedback submitted on the Fido online store. Reading this feedback helped me better understand some of the behavioural patterns I found using the analytics tools. Here's an example to illustrate what I mean:

→ Analytics Insight: The most clicked area on a page was three tabs that allowed visitors to discover different types of plans.

→ User Research Insight: Many users expressed frustration that the plans were too wordy and that they could not easily compare them to find the best one for their needs.

→ Interpretation: Customers were switching back and forth between different tabs to compare different plans and evaluate each plan's offerings and price.

⇒ Design Outcome: I redesigned the page and cut down the copy so that users could view all plans side-by-side and easily compare them without going back and forth between tabs.

Testing my new design

By the end of my internship, I had completed three iterations of the new design for the Fido online store. For the last iteration, I created a high-fidelity prototype to test with users. With the help of one of the UX Researchers on my team, we created a plan to test with 10 users and see how my redesign compared to the current version of the online store. During each session, we asked the user to complete the same set of tasks on both the current version of the site and my redesigned version. At the end of each task, the user completed a task evaluation to rate the overall experience, ease of use, and their confidence. Finally, we recorded the Net Promoter Score (NPS), since that was the metric used to evaluate the existing version of the online store.

The testing results were positive: my redesign increased the site's NPS by 14.3 points and scored higher on all usability measures compared to the current version of the store.
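For readers unfamiliar with the metric: NPS comes from the standard 0-10 "how likely are you to recommend us?" question, and equals the percentage of promoters (scores 9-10) minus the percentage of detractors (scores 0-6). Here's a minimal sketch of the calculation, using made-up ratings rather than our actual session data.

```python
# Hypothetical illustration of how an NPS score is calculated.
# The ratings below are invented, not the actual test results.

def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

current_site = [9, 7, 6, 8, 10, 5, 9, 8, 7, 6]   # hypothetical ratings
redesign = [10, 9, 8, 9, 10, 7, 9, 9, 8, 6]      # hypothetical ratings

print(f"Current:  {nps(current_site):+.1f}")
print(f"Redesign: {nps(redesign):+.1f}")
print(f"Change:   {nps(redesign) - nps(current_site):+.1f} points")
```

Because only promoters and detractors count toward the score, a handful of shifted ratings can move NPS substantially, which is part of why it's a sensitive before/after measure for a redesign.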

What I learned

Analytics are extremely valuable to designers

There's a lot of value in user research and talking to users to understand their frustrations. However, analytics tools can reveal behaviours and problems that users aren't conscious of, showing us how people actually use products, which may not always align with what they say.

Small changes can make a big difference

We know that being detail-oriented is a must for designers, but this was my first time seeing the impact of small design details on a product that serves millions of people. From the analytics insights, I was surprised to see how changing a button's location, colour, or copy could significantly increase or decrease conversion rates.

Conducting user testing

Since this was my first internship, it was also my first time working with UX Researchers. From this experience, I learned how to write a testing script, ask users non-leading questions, and choose which metrics to use to quantify the results. I also got the chance to shadow UX Researchers conducting usability testing with customers in a usability lab!
