Home Depot Task Management

Project Overview

My team and I worked with Home Depot as our industry partner to help their associates with task management and efficiency. Associates faced too many competing tasks, and the company wanted a way to balance customer needs with store needs. Since we were given a broad scope, we chose to narrow our user group to a specific group of Home Depot workers who are part of the merchandising team (known as MET associates). They needed a better system for bay checking (their main task) and project management.

Research Problem

My Team

Yannu Li, Xi Chen, Yizhou (Fred) Liu

My Role

In the research phase, I conducted 1 of our 5 contextual inquiries and took notes during our stakeholder's tour of the store and introduction to the problem. I also completed two observation sessions and contributed to our market research.

In the analysis phase, I contributed to building our affinity diagram, created the task analysis for both the associates and supervisors, and helped identify the design implications.

In the design phase, I participated in our multiple brainstorming sessions and contributed to our mind map. I worked with Yannu to make paper prototypes for one of our design alternatives and created the InVision prototypes for our final design. I also helped conduct user feedback sessions.

In the evaluation phase, I created drafts of the evaluation plans and led or took notes for several of our evaluations. I also did the quantitative data analysis.


Research Methods

Competitive Analysis

Observation 

Expert Interviews

During our competitive analysis, we found several generalized products that handle project management effectively, but few retail-specific project management systems on the market. Home Depot associates move around a lot on the job and require a more retail-specific approach to help them manage tasks. Our team took these market findings into consideration as we moved forward in the research process.

We decided to start our Home Depot specific research by conducting observations at the Home Depot stores. Our goal was to familiarize ourselves with our target users and their contexts and get a better understanding of the workflow of associates. We did two visits, one to get a better sense of the Home Depot store, and the other to focus specifically on MET associates.

The first trip showed us what kind of environment associates work in. The Home Depot store is noisy, maze-like, and huge, and the internet coverage in the store is spotty. We kept these considerations in mind as potential limitations going into our design phase.

The second round of observations was focused on the roles of MET associates specifically. When we observed them, they were very busy working on a Christmas project, which involved several tasks related to straightening out shelves with Christmas products, cleaning up around the Christmas trees, etc. We realized that projects were a big part of a MET associate's daily work in addition to bay tasking and we were able to compile a list of the sorts of tasks an associate may complete on a project.

We interviewed 2 experts who work on software development for the First Phone (the handheld device used by Home Depot associates) and who are familiar with the MET team. We wanted to better understand the goals of the company, customer needs, and technological limitations. The word cloud below gives an overview of the topics discussed; we also pulled key points from our notes into our design considerations.

Key Points

  1. Both MET supervisors and MET associates have too many tasks and projects to keep track of, which creates a stressful work environment.

  2. Communication and delegation of tasks could be improved between supervisors and associates. Additionally, associates do not have many ways to give feedback.

  3. An important challenge they face is how to measure quality. Currently, supervisors do quality walks, but the experts were interested in exploring other ways this could be done.

  4. Future technologies that they are looking to incorporate include machine learning, AR, and voice interaction to allow for automation of key processes and better accessibility.

Contextual Inquiry

To get an in-depth perspective from the associates and supervisors themselves, we conducted 5 contextual inquiries. We asked the users to walk through one of their daily tasks and asked questions where clarification was needed. To analyze the results, we used affinity mapping to identify common pain points.

For supervisors, we found the main challenge areas to be:

  1. Managing talent: need to choose the right associates for the right projects

  2. Quality management: need efficient ways to evaluate quality

  3. Task management: need flexible, efficient ways to manage projects

For associates, we found the main challenge areas to be:

  1. Accessibility: need increased mobility and visual accessibility

  2. Task management: need better methods for bay checking

  3. Communication: need better channels to convey information


Challenges

The main challenge in this project was setting up face time with our users. We chose a very specific user group within Home Depot, MET associates and supervisors, who run on very busy schedules. Our industry contact also told us not to offer compensation for participating in interviews or user testing sessions. These restrictions made it difficult to set up meetings because our sessions would take valuable time out of our users' workday.


These limitations forced us to be flexible and change our plans, sometimes on the spot. We also learned how to maintain the balance between persistence (being assertive and asking supervisors for their time) and professionalism (being understanding of time limitations and limited availability). Although this meant we didn't always end up with the ideal number of users for some of the research methods and evaluation steps, we still managed to go through the process and gain valuable insight from some users. 


Synthesis of Findings

To better understand how our users felt about the tasks they had to do, we decided to use the Jobs-To-Be-Done (JTBD) framework. This method takes the place of personas and makes our analysis more task-centric.

JTBD

Empathy Maps

We also wanted to understand our users' emotions better, so we created empathy maps to step into their shoes and grasp their experience through the different senses.

MET Associate Empathy Map

MET Supervisor Empathy Map

Task Analysis

To ensure that our system would account for all the tasks our users must complete, we broke down the tasks for both associates and supervisors through task analysis. This helped us get a sense of the workflow and all the main components we would need to consider in our design.

Design Implications

Taking all of the above findings into consideration, my group and I held a brainstorming session to identify design implications. We then grouped the implications by common themes and prioritized which were most important to address in our design. The tables below show the results.

Our priority system can be read the following way:

P0 = Very Important

P1 = Somewhat Important

P2 = Not Very Important

Design Implications for MET Associates

Design Implications for MET Supervisors


Design

As we considered our design implications, we each wrote down our ideas individually on sticky notes. We then categorized the sticky notes and created a mind map to better organize all of our ideas.

Divergent Brainstorming

Convergent Brainstorming

To converge on the best ideas, we first plotted all the possibilities on a feasibility versus creativity chart to help us focus on the ideas with the most potential.

Idea #1: Standard Tasking App + Supervisor Dashboard

This solution would be a redesign of the current tasking app. It would cluster tasks into categories, allow for novice vs. expert workflows (a free-style option), and include reporting. The supervisor's dashboard would give a calendar view showing the deadlines for all bays and projects.

Idea #2: AR Tasking

Associates can hold up their phone to a bay and take a photo. Machine learning can then be used on the photo to identify shelf-outs and price changes. Using AR would help associates complete their tasks faster. When finished with a tasklist, they could upload a photo for quality checking.

Idea #3: Voice for Tasking

With this solution, associates would put on headphones to listen as the system guides them through the bay checking process. They could ask for assistance or clarification at any point and would simply say "Done" to mark a task completed. They would have a visual app for support if the voice system failed.

Feedback Sessions

We conducted two rounds of feedback sessions. We were only able to recruit 1 participant for the first round (one of our industry partners), but the session was detailed enough that we could consolidate the best features of the 3 concepts into one system.

We tested the wireframes of the combined system (which included the free-style, AR, and voice features) in our second round of feedback. We were able to recruit 2 MET associates and 1 MET supervisor. We asked them to think aloud as we went through the prototypes and then asked questions to get more specific feedback during the sessions. The table below shows the synthesis of our findings from the second round of feedback. We were also able to identify some key issues that needed to be fixed.

Final Design

We corrected all the issues we had identified in the feedback sessions. Below is a breakdown of the final design.

Associate's App

(1) Bay Overview: The home screen displays bays that associates are required to check daily. Potential shelf-outs and the deadlines for bays will be marked. Tasks are categorized into several task lists for better management.

(2) Expert Flow: Once an associate has gained enough experience, they can use the free-style method for checking tasks. This means that associates do not have to check tasks one by one. Instead, they will only be reminded of a few key tasks.

(3) AR Planogram: Instead of using the paper planogram to check layout and prices, our system will scan all the product barcodes and provide this digital AR planogram to assist comparison and completion of bay tasks.

(4) Project Overview: This screen displays the projects that associates are required to complete daily and weekly. Users can set timers for project deadlines and upload a photo to their supervisor for quality checking.

Feel free to explore the full InVision prototype below.

Supervisor's Dashboard

(1) Dashboard: This tablet application gives supervisors a calendar overview to better manage deadlines and associates. They can view detailed information, associate profiles (which show their skills), and goals for the week. It also recommends which associates would be best suited for a given task.

Feel free to explore the full InVision prototype by following this link:

https://invis.io/FBPCHTVYHXS 


We used this chart to focus on ideas in the top right corner to narrow down to three design alternatives.


Evaluation

We conducted two different evaluation sessions: one with experts (Home Depot UX designers) and one with our users (MET associates & supervisors).

For our expert evaluation, we used the heuristic evaluation method. We gave them the following heuristics to consider: 

  1. Match between system and real world

  2. Flexibility

  3. Efficiency of use

  4. Accessibility

  5. Error prevention

  6. Consistency and standards

  7. Aesthetic and minimalist design

When they found an issue, we asked them which heuristic they thought it violated and took notes. We tested with 6 experts.

For our user evaluation, we used moderated user testing. We used the following procedure:

  1. Give introduction to project and design solution.

  2. Ask user to complete tasks.

    1. For associates:

      1. Complete regular bay checking and use AR to complete a tasklist.

      2. Use free-style to complete a tasklist.

      3. Complete a project.

    2. For supervisors:

      1. Get an overview of the week.

      2. Assign a project to an associate.

      3. Check the status of bays 15-20.

  3. After each task, give the After-Scenario Questionnaire (ASQ) to measure the user's satisfaction with the task.

  4. After all the tasks have been completed, administer the SUS form to measure overall usability of the system.

  5. Thank the user for their time.

We asked users to think aloud as they completed the tasks and we took notes. We tested 3 MET associates and 1 MET supervisor.

Our overall SUS score was 85.83, well above our target of 68 (the commonly cited benchmark for a passing system). We used the rest of our qualitative and quantitative results to inform the changes we would make to our design in the future. Some of the major changes we would make include the following:

  • Add descriptions for key features.

  • Make key features more discoverable.

  • Improve the flexibility of the AR planogram.

  • Fix wording problems to match users' mental models.
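For context on the score above, SUS is computed from ten 1-5 Likert items per participant using a standard formula, then averaged across participants. The sketch below shows that scoring procedure; the response data is purely hypothetical and is not our study data.

```python
def sus_score(responses):
    """Standard SUS scoring for ten 1-5 Likert responses.

    Odd-numbered items contribute (response - 1), even-numbered items
    contribute (5 - response); the sum is scaled by 2.5 to give 0-100.
    """
    assert len(responses) == 10
    total = sum((r - 1) if i % 2 == 1 else (5 - r)
                for i, r in enumerate(responses, start=1))
    return total * 2.5

# Hypothetical responses from three participants (illustrative only):
participants = [
    [5, 2, 4, 1, 5, 1, 4, 2, 5, 1],
    [4, 2, 5, 2, 4, 1, 5, 1, 4, 2],
    [5, 1, 5, 1, 4, 2, 4, 2, 5, 2],
]
mean_sus = sum(sus_score(p) for p in participants) / len(participants)
```

Averaging the per-participant scores this way is what yields an overall score like the 85.83 we reported.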


©2018 by Taylor Stillman. Proudly created with Wix.com