Trading system for financial company

Guerrilla UX in a pandemic: encouraging best practices in adverse situations

Summary

Working as UX Designers, we can often be put into situations where clients are unaware of the value of UX practices, and we have to not only explain that value, but to also demonstrate it. In this portfolio piece, I describe how I worked remotely on a technically complex project during a pandemic, with no affordance made for discovery and research, how I helped the client understand the value of what I was doing, and worked tactically to demonstrate value beyond their expectations to make an outcome that everyone was proud of.

Introduction

Just before lockdown was announced in the United Kingdom in March 2020, I had started work with a software consultancy whose clients included some of the better-known financial institutions in the City of London. One such company wanted us to help them update their fixed-income bond trading software, as it ran on the soon-to-be-retired Adobe Flash platform, meaning that their software, through which millions of pounds flowed every day, would stop working. Alongside this mainly engineering-led objective, I was included in the team with the somewhat abstract goal of “improve the design while we fix the platform”.

One of the major challenges I faced was the complexity of the system. Fixed-income bond trading can take up to six months just to learn the basics, and I was working with a client who felt that I should be producing designs as soon as possible; they had only previously worked with user interface designers, and did not appreciate the value of a proper UX process.

Demonstrating value, while conducting discovery

I therefore needed to be able to demonstrate value quickly, and the easiest way to do this is to ensure that the client is kept regularly apprised of progress, and sees a constant series of artefacts that demonstrate the progression in thinking. I started by persuading the client to grant me access to stakeholders and users, so that I could begin to get a better appreciation of the project context. I started with the stakeholders, who consisted of project members and subject matter experts on the client side. I was able to set up a number of calls with them, in which they explained their perspectives and opinions on the project. This provided a firm foundation from which I could then begin to assemble my own picture of the situation, which I coalesced into a user journey, expressed on a whiteboard. On this whiteboard, I was able to set out the definite known parts, the assumptions, and the questions I had yet to answer.

Whiteboard with post-it notes and written text, describing simple discoveries in the user journey
Mapping out the overall stages and observations around the trading software user journey

This whiteboard became my first “information radiator” – an asset where I could share my thoughts with others on the team, and get their inputs and insights. Usually, a whiteboard like this would be placed somewhere prominent in the workspace, ensuring that people saw it regularly, and inviting people to leave comments and ideas. However, as we were all stuck at home, I had to ensure that it was freely available to everyone on the project in a digital form that could be easily updated, so I carved out my own section of the project Confluence wiki for my UX work, where I began to write up my thoughts, post pictures, and encourage people to check in regularly and leave comments and feedback.

A typical Confluence page showing how information can be shared in wiki format with everyone working on a project
Sharing information in a project using Confluence (note: this is a generic screenshot from Atlassian, and not a page from our project Confluence)

Conducting dynamic interviews

Using my explorations on the whiteboard, I could then collect the “unknowns” – the questions, assumptions and ideas I had written down – and use them to make an interview guide, which was also shared with my colleagues for feedback. I could then begin to arrange sessions with actual users, with a list of ready questions to ensure that we filled in as many gaps in my understanding as possible.

For user interviews, I would usually try to meet with the user face to face, discuss their working practices, and examine the ways in which the product would fit their requirements. However, as this was during the pandemic, and we were all stuck at home, I had to arrange calls with them instead. This situation was useful in some ways, as their isolation meant that they weren’t as distracted as they would have been on a busy trading floor. However, they were still working, and our conversations were punctuated by a steady stream of notification sounds coming from their trading software, with them often having to pause mid-answer to deal with a new trading ticket that required their urgent attention, which broke the flow of the conversation.

Image of obfuscated text, showing grouping of questions on a page
Mock-up of my questions sheet, showing how I grouped the questions to anticipate regular interruptions during interview

To deal with this, I worked around the distractions by arranging my questions into small groups, helping to ensure that I could get the answers I required before the next interruption arrived. It didn’t always work, but it did help me steer my questioning towards a more successful outcome.

Demonstrating value through discovery

Prior to the interviews, the client had voiced scepticism about the value of interviewing users, in the light of their own decades of technical expertise and my lack of previous knowledge of the subject matter. However, upon finishing each interview, I would collect my notes and define clear observations from the discussion. I would then compare and contrast those with what others had said, and was able to produce a series of discoveries, which I was then able to share with the client, demonstrating the value of my efforts by providing ideas for improving the product. These included:

A sliding scale of requirements

One key discovery was the relationship between the different types of trader: a sliding scale between the number of tickets they dealt with each day and the amount of supporting analytics information they needed to deal with those tickets. The current solution had been a “one size fits all” approach, which was then tweaked as users complained that it lacked the functionality they required for their specific working style. This led to a product full of “experience rot”, which ran slower and slower as more features were demanded and added. This increasing slowness led to traders actually missing important trades, and a loss of potential revenue. By understanding the basic needs (in Kano-model terms) of the different types of user before the system was rebuilt, we were able to factor in requirements along the sliding scale without impairing the smoothness and speed of the final product.

Graph showing the relationships between different types of users along a sliding scale of numbers of tickets versus complexity of tickets
Graph showing the sliding scale of different user types, based upon complexity of tickets against “traffic” (number of tickets dealt with per day).

A noisy user interface

Speaking to the users, I also discovered that the current user interface was very noisy – lots of bright colours and sounds vying for the user’s attention. This was again the product of the “bolted on” process, where functionality had been added on demand, with no consideration for the relationship between different features on screen, leading to a screen full of notifications and a visual and auditory cacophony. What’s more, tickets would “pop up” on the screen, often obscuring other vital information, or other tickets. In busy periods, the combination of tickets rapidly popping up, along with all the colours and noises that came with them, made for a very frustrating experience.

Implementation

Having presented these and other observations to the client, I could then proceed with designing solutions to address them:

Number inputs

For traders who needed to deal with large numbers of tickets in a short time, being able to accurately adjust values on the ticket was key. I therefore explored ways in which traders could adjust numbers incrementally for small adjustments, as well as type new numbers in for larger changes.
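As a sketch of the idea (the names, field shapes and tick size here are illustrative, not taken from the actual product), the two entry modes can be modelled as operations over a price held in whole ticks, which also sidesteps floating-point drift from repeated small adjustments:

```typescript
// Hypothetical model of the dual-mode number input:
// the price is stored as an integer number of ticks, and converted
// to a decimal only for display.
type PriceInput = {
  ticks: number;     // price expressed in whole ticks
  tickSize: number;  // e.g. 0.01 for a price quoted in hundredths
};

// Small adjustment: arrow keys or scroll nudge the value by one tick.
function nudge(p: PriceInput, direction: 1 | -1): PriceInput {
  return { ...p, ticks: p.ticks + direction };
}

// Large change: a typed-in number replaces the value wholesale,
// snapped to the nearest valid tick.
function enter(p: PriceInput, typed: number): PriceInput {
  return { ...p, ticks: Math.round(typed / p.tickSize) };
}
```

Storing ticks rather than decimals means a long run of nudges can never accumulate rounding error, which matters when a mis-keyed price is a real trade.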

Ticket prioritisation

I designed an interface that allowed users to triage tickets automatically, setting up rules that would divert tickets meeting certain criteria away from the main interface, so that they could focus on those they found more important. The diverted tickets would be relocated to “drawers”, allowing the user to still check in on them when they needed to.
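The triage behaviour described above can be sketched as a small rule engine (a minimal illustration with made-up field names – real tickets carried far more data, and the real criteria were richer):

```typescript
// Hypothetical rule-based ticket triage: tickets matching a user-defined
// rule are diverted to a named "drawer" instead of the main view.
type Ticket = { id: string; counterparty: string; notional: number };

type TriageRule = {
  drawer: string;                   // where diverted tickets go
  matches: (t: Ticket) => boolean;  // user-defined criterion
};

// Route an incoming ticket to the first matching drawer,
// or to the main view ("main") if no rule applies.
function triage(ticket: Ticket, rules: TriageRule[]): string {
  const rule = rules.find((r) => r.matches(ticket));
  return rule ? rule.drawer : "main";
}
```

Because rules are ordered and first-match-wins, users could reason about where any given ticket would land, which kept the triage predictable rather than adding yet another source of surprise.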

A modular interface

Diagram showing spaces on a screen to indicate areas for users to customise their screen
A customisable screen layout, ensuring that users can tailor the screen to their needs, and be able to see all the information they need at once.

In combination with the previous two ideas, I provided a customisable, modular interface, allowing users to tailor their screen with the information that they needed, and leave out anything that they didn’t. This also removed the need for pop-up tickets, as tickets could be shown on screen via the triage system, ensuring that they didn’t cover any other important information.

Atomic design system

To complement the modular layout, and to help the development team, I created an atomic design system that regulated how each component looked, worked and fitted together. The design system was built using symbols in Sketch, meaning that any changes would propagate throughout the whole system and, most importantly, it provided an easy overview to ensure that colours and layout did not override the overall information architecture of the screen.
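The Sketch symbols themselves can’t be reproduced here, but the “change once, propagate everywhere” idea is analogous to a design-token layer, which is roughly how the system was handed over to developers (the values and names below are placeholders, not the real palette):

```typescript
// Hypothetical design tokens: "atoms" (base values) feed "molecules"
// (components), so changing a token once propagates everywhere.
const tokens = {
  colour: { alert: "#c0392b", accent: "#2980b9", surface: "#ffffff" },
  spacing: { xs: 4, sm: 8, md: 16 },
};

// A component derives its styles from the atoms rather than
// hard-coding values, mirroring how symbols nest in Sketch.
function ticketCardStyle(t: typeof tokens) {
  return {
    background: t.colour.surface,
    borderColour: t.colour.accent,
    padding: t.spacing.sm,
  };
}
```

Editing `tokens.spacing.sm` in one place restyles every component built on it, which is what kept the colour and layout decisions from drifting away from the information architecture.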

Conclusion

The client was pleased with the outcome of my work, and the new system has now been implemented. Due to the nature of the engagement, I was moved on to another project; I would have liked the opportunity to stay on and test my work further with users, so that I could have had a more defined metric of success. I also hope that my work demonstrated the value of UX research to the client, and that in future they will consider taking a more user-centric approach, instead of assuming that they can dictate product success from their own experience alone. As a result of this experience, I wrote an in-depth analysis of how and why you should conduct better user research in your projects, using my experience on this and other projects to share the discoveries I have made.
