Carputty Vehicle Valuation Tool

INDUSTRY CLIENT • ITERATIVE DESIGN • CONTEXTUAL INQUIRY • HEURISTIC EVALUATION

An overhaul of Carputty’s Vehicle Valuation tool, enhancing how their customers input vehicle data, make vehicle-related decisions, and view a vehicle’s value over time.

Project Context

• Industry Client (Master's course)
• Fall 2023, 15 weeks
• Team: Ellie Park, Jisu Kim, Aby Kottoor, Ziyi Shao

Roles and Responsibilities

Helping customers make informed financing decisions.

• Figma/FigJam
• Adobe Illustrator
• Qualtrics
• MS Office (Excel, Teams)

Background

In my Psychology Research Methods course last semester, we partnered with industry clients in teams of 5 to work in a problem space with a given problem statement. Throughout the semester, our team used iterative research and design methods to arrive at a solution that addressed both user and business problems we identified.

Our group chose to work with Carputty, an auto-financing startup based in Atlanta. Carputty is rewriting the rules of traditional auto-financing by encouraging individuals to view their vehicles as investments. The company offers a Vehicle Valuation Tool that uses predictive AI and industry data to show customers the future value of their vehicles over time.

At the time we started working with Carputty, customer interaction with the existing Valuation tool had stagnated; the product and design teams for the tool looked to us to reimagine it.


In a Snapshot…


INITIAL PROBLEM STATEMENT

When we partnered with Carputty, their team provided us with the following problem statement. Throughout our design process, our team refined and restructured it based on the insights we generated.

Primary Methods

• UX Research: Literature Review, Contextual Inquiry, Interviews, Competitive Analysis, Heuristic Evaluation
• UX Design: Sketching, Wireframing, Prototyping, Usability Testing, Iterative Design

The existing valuation tool is constricting and easy to miss.

First, our team needed to develop a solid grasp of the current state of the Valuation tool and any roadblocks Carputty customers could be facing in using the tool and effectively analyzing the data it produces. We started with secondary research, beginning with a website audit of the Vehicle Valuation tool experience. I spearheaded this audit by guiding our team in evaluating the process of finding the tool, learning about the tool, entering information into the tool, and analyzing data using the tool.

Insights

1) The tool is easy to miss:

The Valuation tool is located at the bottom of the home page, so customers are likely to miss it if they don't scroll all the way down (Figure 1).

Figure 1

2) Constricting input options:

The tool restricts customers by only allowing them to run it by inputting their VIN and mileage. What about customers who are looking to buy a vehicle and don't have a VIN? (Figure 2)

Figure 2

3) The tool only runs one valuation at a time:

Customers can only run one valuation at a time and aren't able to compare valuations of different vehicles. This may result in them navigating away from the tool or discontinuing use over time. (Figure 3)

Figure 3

WEBSITE AUDIT

Process

[Project timeline: Weeks 1 and 2, Weeks 3-6, Weeks 7-12, Weeks 13-15]


The importance of trust in predictive models.

Next, our team wanted to delve into the role data visualization plays in developing trust in predictive and AI models. We conducted a thorough literature review of 9 academic papers found through the ACM Digital Library and Google Scholar. Below are our primary takeaways from the review:

“Black Box” Scrutiny leads to Loss of Trust

Users of AI systems rely heavily on trust in the system’s output and functionality, but sometimes trust is lost because users cannot understand the system’s inner workings.

Design Decisions can Improve Trust

• Explain the importance of individual attributes to the outcome
• Show the smallest change needed to flip the machine prediction
• Show the outcome of similar data points

With insights from secondary research, our team now had a baseline understanding of the problem space and potential factors that could be contributing to low customer interaction with the Valuation tool.

At this point, we decided to move forward with our primary research.

LITERATURE REVIEW

Understanding what customers value in their vehicles.

Next, our team deployed a Qualtrics survey aimed at potential Carputty customers. Our 22 respondents varied in their prior experience with Carputty, though most were unfamiliar with the company. We asked participants about their familiarity with valuation tools, how important they deem different aspects of a vehicle, and what information they would readily have available to provide as input to the tool.

Our team distributed surveys in relevant Slack, Discord, and Reddit forums.

Q: Select all the information you would have readily available to provide a Valuation tool:

The VIN isn’t the most readily available vehicle identifier.

The existing tool only allows customers to enter their VIN, but there are other vehicle identifiers that customers may prefer inputting instead, such as make, model, and mileage.

Car condition, mileage, and price are indicators of a vehicle’s value

• 100% of respondents indicated car condition is either very or extremely important
• 72% of respondents indicated mileage is either very or extremely important

Q: Which of the following vehicle-related tools have you used before?

Customers may not have prior experience with these tools.

10 respondents indicated that they had never used a Valuation tool before. This suggested that we would need to focus on the explainability of the data we present.

SURVEY

Observing customers using the tool in action.

After deploying the survey, our team set out to recruit contextual inquiry participants. We decided to conduct a contextual inquiry because we wanted to assess how customers were using the Valuation tool in real time, rather than only through a website audit. This gave us insight into where exactly they were getting confused or losing interest in the tool.

We conducted 45-minute inquiry sessions with 3 participants who were familiar with Carputty and the vehicle purchasing/selling process. These participants were recruited from our pool of survey respondents.

We observed our participants as they located the Valuation tool on Carputty's website, learned about and understood what the tool does, input information into the tool, and interpreted the results. Below are our findings:

Insights

Difficulty Locating the Tool

Participants expressed difficulty in locating the Valuation tool. More specifically, all 3 participants needed guidance from a facilitator to find it.

Unsure about Reliability

All participants expressed hesitancy in trusting the Valuation result, despite the information provided about where its data comes from.

Participants mentioned that more granularity in the results would help increase reliability.

Trouble Interpreting Results

2 out of 3 participants had difficulty interpreting the results (data visualization). This stemmed from not understanding symbols on the graph (such as the triangles) and not knowing what steps, if any, to take to buy or sell.

CONTEXTUAL INQUIRY

A deep-dive into the customer’s vehicle exploration process.

With insights from our contextual inquiry, our team moved forward with conducting semi-structured interviews with potential customers. We conducted 4 online interviews with participants selected from survey respondents. These interviews focused specifically on the significance of Valuation tools in the car purchasing and selling process, tools or methods used while making vehicle decisions, and a further deep dive into the perceived reliability of the tool.

Sample Questions

Our team analyzed the qualitative interview data in pairs using Excel to ensure inter-rater reliability. Some of the codes we used included: “more data necessary”, “what is a VIN?”, and “family purchasing vehicle together”.
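One common way to quantify agreement between paired coders is Cohen's kappa, which corrects raw agreement for chance. A minimal sketch of the computation (the coder labels below are hypothetical, not our actual data):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Chance-corrected agreement between two coders' labels for the same excerpts."""
    assert len(coder_a) == len(coder_b) and coder_a
    n = len(coder_a)
    # Observed agreement: fraction of excerpts given the same code by both coders
    p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # Expected chance agreement, from each coder's label frequencies
    ca, cb = Counter(coder_a), Counter(coder_b)
    p_e = sum((ca[label] / n) * (cb[label] / n) for label in ca.keys() | cb.keys())
    if p_e == 1:  # both coders used one identical label throughout
        return 1.0
    return (p_o - p_e) / (1 - p_e)

# Hypothetical codes applied by two team members to the same 4 excerpts
a = ["more data necessary", "more data necessary", "what is a VIN?", "what is a VIN?"]
b = ["more data necessary", "more data necessary", "what is a VIN?", "more data necessary"]
print(round(cohens_kappa(a, b), 2))  # → 0.5
```

Values near 1 indicate strong agreement; values near 0 indicate agreement no better than chance.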

Data Analysis

Findings

We synthesized our codes from the interviews into the following 6 key insights, a few of which map back to our findings from the UX audit and contextual inquiry:

🌟 Discovering a New Customer Segment!

Two of our interview participants were actually Turo hosts. Turo is a company that connects a so-called “Turo host” with an individual looking to temporarily rent a car. As such, Turo hosts typically have a fleet of cars that they manage and rent out.

Our Turo host participants were intrigued by the Valuation tool and potential use cases for their business.

SEMI-STRUCTURED INTERVIEWS

Mapping out key themes…patterns emerge!

Our team synthesized our qualitative data from the UX audit, contextual inquiry, and interviews using an affinity map. This mapping technique allowed patterns to emerge naturally as we created themes.

We created a total of…

[Affinity map: low-level, mid-level, and high-level themes]

Later in our design process, we used our low and mid-level affinity map themes to develop design requirements.

DATA SYNTHESIS

Identifying opportunities in the customer journey.

Persona Mapping

In parallel with data synthesis, our team created 2 personas to encapsulate our customers' needs, frustrations, and goals while using the Valuation tool, and to identify differences between the needs of an individual customer and a Turo host:

Meet Jordan, an individual Carputty customer who is in the market for a second car

Meet Will, a Turo host who wishes to effectively manage his fleet of cars and use the Valuation tool to set rental prices

Journey Mapping

I led the effort to put our personas on a journey map, tracking their current experiences with the Valuation tool, from finding it all the way to interpreting its output.

Key Points in the Journey

Finding the Valuation Tool:
• Extra effort spent finding the tool on the website induces negative emotions at the beginning of the journey
• Confusion about what the tool is meant to accomplish deters customers from using it

Using the Valuation Tool:
• The Turo host must navigate to their Dashboard to grab their VIN and mileage to put into the tool, which is frustrating
• The individual customer must go out to their vehicle and check their registration papers if they wish to input their VIN

Interpreting Valuation Results:
• The individual customer likely ends the journey more frustrated than the Turo host, due to difficulties in interpreting valuation results that may be explainable to the Turo host, but not to them

PERSONA AND JOURNEY MAPPING

A more granular set of design opportunities.


PROBLEM REFRAMING

At this point in our process, having completed the majority of our research, we sat down as a team to reframe our problem before entering the design stage. The new problem statement reflects themes found in our data synthesis and helped inform our design requirements.

A focus on tool navigation, clarity, data points, and improving reliability.


DESIGN REQUIREMENTS

Prior to entering our design phase, we formulated the following design requirements to help guide our ideation and iterative design process moving forward. The starred requirements (chosen based on a combination of research insights and feasibility) are ones we prioritized throughout our design stage.

5 distinct design concepts.


CONCEPT IDEATION

Our team developed 5 design concepts through sketches. Each concept addresses some of our design requirements.

Concept 1: New Navigation and Onboarding

Concept 2: First Valuation Tool Concept

Concept 3: Second Valuation Tool Concept

Concept 4: Interactive Car Visualization

Concept 5: Turo Assist

• Highlights Valuation tool by giving it its own tab in the navigation bar
• Upon being clicked, the page overview of the tool would include both visual and textual details about what the tool can achieve with specific use cases

• Enhanced user input options for Make/Model and License Plate
• Chance to edit info including car condition and accident history
• Tooltips to indicate meaning of terminology and provide clarification
• Time range options for valuation graph
• Improvement in reliability by allowing user to change desired input parameters and compare new valuation with existing one

• Comparing two vehicle valuations
• Improvement in reliability by showing user a) how each parameter is affecting their valuation and b) how much weight the model places on each parameter
• Saving and sharing features

• A second iteration of showing how each parameter affects the valuation, this time using a bi-directional bar chart

• A 360-degree interactive vehicle assessment generated from photos and videos the user provides when entering vehicle information
• Touch-points for exterior and interior damage and their corresponding effect on valuation

• Provide investment recommendations for Turo hosts based on rental trends
• Allow for comparing vehicles based on renter demographic, profit over time, etc.

Putting our concepts in front of users.


CONCEPT VALIDATION

After creating our concepts, we set out to validate them with 2 potential Carputty customers who both had prior knowledge about the company and were interested in buying or selling a vehicle.

Validation Sessions

We held the validation as a single focus-group style session. We chose this format because our initial concepts were still relatively “fuzzy” and extremely lo-fi; a focus group would allow participants to build off each other's ideas and give helpful feedback.

🌟 Something interesting we did during testing for Concept 1 was showing participants a side-by-side, lo-fi comparison of the existing navigation and our new concept for it.

From our team, we had 2 facilitators and 1 notetaker and followed this testing protocol.

From Concepts to Wireframes.


ITERATION 1

Concept Feedback Analysis

Based on feedback from the concept validation session, our team began creating wireframes out of our initial concepts and incorporated some changes based on insights we gathered.

Design Changes Made

1. Users preferred our navigation concept as opposed to the existing navigation.

2. Users found it overwhelming when presented with several input options from the get-go.

3. Users were perplexed by the bi-directional bar chart and found it unnecessary.

4. Users had questions about the Customize Timeframe feature and how it would look when used.

Wireframes

Below are the wireframes we designed at the end of our first iteration, incorporating feedback from concept testing. The wireframes are higher fidelity and show fleshed-out versions of features included in our previous concepts. The blue bubbles indicate significant changes we made from our concept iteration to our wireframe iteration.

An interesting turn of events…


A SLIGHT PIVOT

Within a few days of finishing our wireframes, which incorporated a combination of a few of our initial concepts, including the Turo Assist concept, we discovered that Turo already has something quite similar called the Turo Carculator. This tool is intended to help hosts build their fleets and maximize ROI by providing information about renter trends and estimated earnings per year.

When we shared this finding with the design team at Carputty, they suggested that we pivot away from providing fleet assistance to Turo hosts beyond valuation outputs and instead focus on iterating on the other features.

Putting our wireframes in front of users.


WIREFRAME VALIDATION

After designing our wireframes and incorporating concept feedback, we put them in front of 2 additional participants who were familiar with data visualization and Valuation tools.

Validation Sessions

The validation sessions were held separately with each participant, given that our wireframes were at a more fleshed-out stage than our initial concepts and were actually clickable and interactive.

🌟 Unlike the concept testing sessions, we adopted an exploratory approach here, first allowing the participant to click around and provide insights, with our facilitators guiding them through the experience where necessary.

We had a set of "guiding" tasks to help the participant along if necessary.

From Wireframes to Hi-fi Prototype.


ITERATION 2

Wireframe Feedback Analysis

Based on feedback from our wireframe testing sessions, our team began creating hi-fidelity designs and incorporated some changes based on insights from the sessions:

Design Changes Made

1. Both users were unsure what the purpose of the “Follow-Up Mail” button was.

2. Both users expected there to be a disclaimer of sorts right before showing Valuation results.

No disclaimer present right before Valuation output loads.

3. Both users expressed uncertainty in identifying which input fields were mandatory vs. optional.

A Glimpse at the Design System

Below is the design system we used to create our hi-fidelity designs and prototype. Many of our components come directly from Carputty's existing design system, but since we were adding several new features, we created a few of our own components, including the tables, triangle symbols, and the vehicle cards in the Garage.

Conducting expert testing on our designs.


HEURISTIC EVALUATION

With our hi-fidelity prototype in hand, we had 3 UX experts evaluate it against Nielsen's 10 usability heuristics. Evaluators completed a set of pre-determined tasks in the prototype and were asked to fill out a heuristic table using the scale shown below (Figure 3).

Sample Tasks

Problems Identified

We had our evaluators rate two things on this scale: a) overall usability for a specific heuristic, and b) the severity of the particular usability problem.

Putting the final prototype in front of users.


USER-BASED TESTING

Finally, our team put our heavily iterated and tested prototype in front of users one last time. We tested the final prototype with 4 participants diverse in age, demographics, and tech literacy. Our team had a set of 10 task scenarios and 3 sub-tasks for participants to complete. We kept track of quantitative data including task completion rate, time to complete each task, and number of clicks.

At the end of the session, participants completed a System Usability Scale (SUS) questionnaire using Google Forms.
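SUS responses follow a standard scoring formula: odd-numbered items contribute (response − 1), even-numbered items contribute (5 − response), and the sum is multiplied by 2.5 to yield a 0-100 score. A minimal sketch of that computation (the example responses are hypothetical):

```python
def sus_score(responses):
    """Score one participant's 10-item SUS questionnaire (1-5 Likert responses)."""
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    total = 0
    for item, r in enumerate(responses, start=1):
        # Odd items are positively worded, even items negatively worded
        total += (r - 1) if item % 2 == 1 else (5 - r)
    return total * 2.5  # scale the 0-40 sum to 0-100

print(sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1]))  # best possible → 100.0
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # → 75.0
```

Scores above roughly 68 are conventionally read as above-average usability.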

Sample Tasks

Problems Identified

# of clicks (attempts) to complete task:

A reimagining of Carputty’s vehicle valuation tool.


FINAL SOLUTION

Below is our final solution: a reimagined and enhanced valuation tool that improves customer interaction, reliability, and data explainability.

Finding the Tool and Inputting Information

• Optimized Navigation Experience
• Wide Range of Input Fields into Valuation tool
• Personalizing the customer’s experience through a new car, used car, or selling a car flow
• Enhanced side-by-side view of Valuation information and the tool itself to increase efficiency

Interacting with the Data Visualizations

• Enhanced transparency with the disclaimer
• Time-based filters for data
• Ability to save a valuation and be notified if it changes later through email
• Increased tool reliability by showing a textual description of how each parameter affects the output

Comparing Vehicle Valuations

• New data table showing each model added to graph and corresponding parameters
• Ability to change singular parameters and observe changes to the valuation output

360° Vehicle Inspection

• Explore a comprehensive exterior and interior view of a vehicle through an assessment of its condition
• Identify specific touchpoints that have influenced changes in the car's valuation

Post-project reflections.


TAKEAWAYS

Working on such a delightful, diverse team taught me many new things about design, collaboration, and stakeholder management. This was my first industry client outside of working as an intern, so I was constantly learning from those around me!

Collaborative features and applications in the gig-work space.


NEXT STEPS