Product Comparison
How we reinvented search to redefine the shopping experience.
Customers loved our marketplace for its vast selection—over 12 million SKUs covering almost anything they could want. But with such an overwhelming variety of options, many shoppers kept asking the same question:
"Where's your product comparison tool?"
The truth was, we didn’t have one. And for a long time, we thought we couldn’t.
The Final Results
Start at the End
We launched the Comparison feature strategically, starting with categories where we expected the highest impact, such as Power Tools, Fasteners, and Electrical Supplies. While user testing provided valuable insights, nothing beats real-world data. By focusing on these spec-heavy categories, we tracked key metrics: Add-to-Cart Rate (ATC), Average Order Value (AOV), and Abandonment Rate.
We quickly saw clear results.
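To make the metric definitions concrete, here is a minimal sketch of how ATC rate and AOV can be computed from session and order records. The field names and data shapes are illustrative assumptions, not our actual analytics pipeline:

```python
# Hedged sketch: computing add-to-cart rate and average order value
# from simple session/order records (field names are illustrative).

def add_to_cart_rate(sessions):
    """Share of sessions with at least one add-to-cart event."""
    if not sessions:
        return 0.0
    carted = sum(1 for s in sessions if s["added_to_cart"])
    return carted / len(sessions)

def average_order_value(orders):
    """Total revenue divided by the number of orders."""
    if not orders:
        return 0.0
    return sum(o["total"] for o in orders) / len(orders)

sessions = [
    {"added_to_cart": True},
    {"added_to_cart": False},
    {"added_to_cart": True},
    {"added_to_cart": False},
]
orders = [{"total": 120.0}, {"total": 80.0}]

print(add_to_cart_rate(sessions))   # 0.5
print(average_order_value(orders))  # 100.0
```

In practice these would be computed per experiment arm (pilot categories vs. control) so lifts like the ones below can be attributed to the feature.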
Notable Outcomes for the Business
42% of users added one of the compared items to their cart
17% overall increase in add-to-cart rate
10% increase in AOV compared to the control group
These early wins validated the feature, leading to a broader rollout across the site. The clean, table-based design allowed users to compare up to four items, with a responsive mobile layout ensuring a seamless experience on any device.
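The four-item cap is a small piece of state worth illustrating. Below is a minimal sketch of how a capped compare list might behave; the class and SKU names are hypothetical, not the production code:

```python
# Minimal sketch of a compare list capped at four items
# (illustrative only; not the production implementation).

class CompareList:
    MAX_ITEMS = 4

    def __init__(self):
        self.skus = []

    def add(self, sku):
        """Add a SKU; return False if it is already listed or the list is full."""
        if sku in self.skus or len(self.skus) >= self.MAX_ITEMS:
            return False
        self.skus.append(sku)
        return True

    def remove(self, sku):
        """Drop a SKU from the comparison if present."""
        if sku in self.skus:
            self.skus.remove(sku)

cl = CompareList()
for sku in ["DRL-100", "DRL-200", "DRL-300", "DRL-400", "DRL-500"]:
    cl.add(sku)
print(cl.skus)  # the fifth add is rejected: four items max
```

Rejecting the fifth add (rather than silently evicting an item) keeps the UI decision explicit: the shopper chooses what to remove before adding more.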
Introduction
How We Got Here
The feedback was impossible to overlook. Shoppers consistently told us that comparing products side by side was essential. Competitors had this functionality; they expected us to have it, too. Internally, we knew this feature was missing. During user testing, it came up so often that it became a running joke.
But this wasn’t just a nice-to-have. It was a clear gap in our shopping experience.
Our first attempt to address it had been frustrating. The third-party tools we used to display search results weren’t designed to pull detailed product data into a table. The workaround? API calls so expensive they would have cost us hundreds of thousands of dollars annually.
That roadblock left the feature on the shelf, collecting dust—until an unexpected breakthrough gave us hope.
The Path Forward
New Opportunities
One day, while another team demoed our new in-house search engine, we saw a glimmer of possibility. This tool had capabilities far beyond our existing solutions. It could handle dynamic data retrieval seamlessly and without the astronomical costs of third-party APIs.
Suddenly, building a product comparison tool felt possible.
We knew this wasn’t a small project, but we also didn’t want to disrupt the broader work being done on our search engine. So, we scaled our ambitions. This would be a proof of concept—a limited rollout to a small category of SKUs where we could test and refine our ideas.
With input from the Merchandising team, we selected our pilot category and began mapping out a roadmap to align with ongoing search-related projects.
Final Product
Getting On With It
Before designing anything, we turned to the real experts—our users. We conducted unmoderated tests, watching customers navigate competitor sites to compare products. These observations taught us a lot about what worked—and what didn’t.
The Merchandising team helped us prioritize which product details mattered most for comparison, and we organized them into a hierarchy. This led to a clean, table-based layout that worked beautifully on desktop.
But mobile posed a challenge. It’s easy to see all four items side by side on my oversized desktop monitor, but what about on my iPhone 12 Mini? How could we create a responsive design that kept the table intuitive without feeling cramped? The answer wasn’t obvious, so we sketched ideas and built two competing prototypes.
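That attribute hierarchy can be sketched as a priority-ordered list that drives the table rows. The attribute names and priorities below are hypothetical stand-ins for the Merchandising team’s actual ranking:

```python
# Sketch: order comparison-table rows by a merchandising-defined priority.
# Attribute names and priority weights are made up for illustration.

SPEC_PRIORITY = {"price": 0, "brand": 1, "voltage": 2, "weight": 3, "warranty": 4}

def table_rows(products):
    """Return (attribute, [value per product]) pairs in priority order,
    including only attributes at least one product actually has."""
    attrs = sorted(
        {a for p in products for a in p["specs"]},
        key=lambda a: SPEC_PRIORITY.get(a, len(SPEC_PRIORITY)),
    )
    return [(a, [p["specs"].get(a, "n/a") for p in products]) for a in attrs]

drills = [
    {"name": "Drill A", "specs": {"price": "$99", "voltage": "18V"}},
    {"name": "Drill B", "specs": {"price": "$129", "voltage": "20V", "warranty": "3 yr"}},
]
for attr, values in table_rows(drills):
    print(attr, values)  # price first, then voltage, then warranty
```

Keeping the hierarchy in one place means the desktop table and any mobile layout render the same rows in the same order, whatever the screen width.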
Our two options were a side-scroll table that matched the desktop layout 1:1 and a paginated view that let users flip through the compared items.
We tested both mobile prototypes with real users, and the feedback was nearly unanimous: testers preferred the simpler side-scroll option, consistently describing it as “intuitive” and “easy to use.”
You can try the two prototypes by clicking on the images if you want to see for yourself.
Ready, Set, Launch!
We launched the product comparison tool as a proof of concept, limiting it to a small subset of SKUs in two categories. This allowed us to gather data, observe real-world user interactions, and identify any unexpected challenges.
The initial rollout lasted six weeks, during which we carefully monitored user behavior and feedback. We were able to collect data on its effectiveness and on usage rates. Some of us were surprised that only a small percentage of users ever interacted with it, but one user in our test summed this up nicely: “I won’t always need it, but when I do, I’ll be very glad it’s there.”
Lessons Learned
The Conclusion
The product comparison feature was a resounding success. After refining it based on the POC, we rolled it out across the entire site, where it became a standard part of the shopping experience.
Beyond its functionality, the project taught us invaluable lessons about listening to users, balancing ambition with feasibility, and embracing flexibility in the face of new challenges.
Today, the comparison tool stands as a testament to what happens when cross-functional teams, thoughtful design, and user feedback align to solve a real problem. It’s not just a feature—it’s a symbol of how we’ve evolved to meet customer needs and expectations for online shopping experiences.