Overview
ADP is a global provider of cloud-based Human Capital Management (HCM) solutions, including Human Resources, payroll, and benefits.
I conducted a usability analysis of a proprietary application called the MDF Admin Console, built in-house at ADP. This web-based app is used by developers to translate UX designs into web user interfaces, using customized versions of Bootstrap and React and following ADP’s user interface standards. From the usability analysis, I identified and designed enhancements to improve usability and developer productivity. I also researched how designers communicate designs to developers, and made suggestions to enhance the design-to-development workflow.
Usability Evaluation
Before analyzing the usability of the tool, I wanted to understand the context in which it was used. After initial meetings with stakeholders to understand how UX and development worked together, I drew this conceptual workflow. To determine how to sample the user population for interviews, I annotated the key processes with approximate numbers of users and the locations represented.
To provide some initial results quickly, and to prepare to discuss the tool with users, I completed an initial usability evaluation using two sets of commonly used criteria: Jakob Nielsen's "Ten Usability Heuristics for User Interface Design" and the six quality attributes described in Nielsen's "Usability 101: Introduction to Usability". Click on the thumbnail below to download a PDF extracted from the original document.
Research
To fully understand the context in which the tool was used, and the needs of the various user communities, I teamed up with two other interns to interview seven UX designers, six developers, and others involved in the design-to-development workflow. I created the interview guide for developers. Click on the thumbnail below to download a PDF of the entire document.
To complement the qualitative, open-ended information I expected to gather during interviews, I wanted to collect some quantitative data without biasing interview responses (or giving the impression that recommendations had already been determined). To address these concerns, I created the tentatively worded survey shown below, given to developers at the end of each interview.
Design
I analyzed the qualitative and quantitative information from interviews, then drafted the recommendations shown below to discuss with my team members and management. 
Based on direction from management, I designed, prototyped, and evaluated approaches to improve communication and developer productivity through easier selection and entry of text styling for color and size.
Prototyping and User Evaluation
To create enhanced screen designs that appeared as authentic MDF Admin Console screens, I used a test version of the MDF Admin Console to create modified screens, then captured screen images. When necessary, I used Visio to create images of additional features, which I overlaid onto the captured screens.
To create a prototype for evaluation, I imported the screen images into PowerPoint and added evaluation script prompts in the slide notes. I chose developers with a range of experience, both in programming and in using the MDF Admin Console, to evaluate the prototypes. The evaluators seemed equally engaged whether the prototype was presented on paper or as a PowerPoint show. Click on the thumbnail below to download a PDF of the entire document.
Tool Evaluation
During my review of the design-to-development workflow, developers expressed a belief that specifications containing explicit styling information (spacing, font colors, font sizes) would save effort and improve accurate delivery of designs. I researched available software that could extract this detail for developers without adding to designer workload. (In summer 2016, few design or prototyping products offered these features, which became more common by late 2017.) I reviewed Zeplin, a product that can provide details about designs created in Sketch or Photoshop.
To perform this evaluation, I built a realistic example design in Sketch, replicating an existing production screen using ADP’s visual UX standards. I then used Zeplin to review the design’s spacing, font colors, and font sizes as a programmer would. I concluded that Zeplin appeared to deliver the information developers needed without requiring extra effort from designers, and recommended that management review license pricing and the security of cloud-stored designs. Click on the thumbnail below to download a PDF extracted from the original document.
Results and Retrospective
What was delivered: I conducted and delivered a usability evaluation. I researched, designed, and evaluated two enhancements targeting improved developer productivity. I evaluated and recommended a tool to productively share design information with developers.
What I learned as a designer: This was my first experience with UX standards and design systems, and it was extremely valuable. I spent a fair amount of time studying ADP’s visual design standards and how they were communicated to designers and developers, which helped me apply those standards when I built the example design to evaluate Zeplin.

I also gained experience making design changes to a “real live” UI using two different approaches (authentically recreating an existing screen in Sketch and then changing it, or capturing an existing screen and then editing the image), and became comfortable judging which approach might be more productive in a given situation.

I also developed a good understanding of how to consider workflow (process) as part of UX research and design. I believe the usability evaluation provided more value because I understood the context in which the tool was used. Investigating the process led to informal discussions about how designs were communicated to developers, and I think those discussions helped both groups recognize opportunities to improve communication.
