
Warwick Gavaghan

Product Designer

Improving Google review conversion rate from 8% to ~50%

Jan 25, 2026

Outcome

Transformed Google Review cross-posting from a struggling feature with an 8% baseline conversion rate into one of RateMyAgent's most effective tools, reaching 55% conversion through iterative development and behavioural design. Over 65,000 Google reviews were converted, directly enabling agents to build a competitive online presence while improving platform engagement.

Challenge

RateMyAgent (RMA) faced a critical problem: while 65% of clients left a review on RateMyAgent, only a small fraction posted it to an agent's or agency's Google profile. Studies have shown that 40% of people research their agent online, and 61% look up a business online before engaging with it. By not adding their reviews to their Google profiles, agents were putting themselves at a disadvantage, making their otherwise stellar reputations invisible where they mattered most.

The challenge was two-fold:

  1. How might we build trust with agents to connect their Google account with our platform?

  2. How might we encourage their clients (sellers & buyers) to leave a Google review after completing an RMA review?

Solution

Rather than shipping a single solution, we built incrementally. Each phase addressed a specific barrier to adoption before optimising further.

Phase 1 - Google Account Connections

We started by addressing the fundamental prerequisite: getting agents to connect their Google profiles to RateMyAgent. Research showed agents knew they should be on Google but didn't understand the benefits, and found the setup process confusing.

We streamlined the Google connect flow and communicated clear value with messaging like "Grow your Business with Google Reviews." This removed friction at the connection point and helped agents understand why this integration mattered.

Result: With more Google accounts connected, we saw a natural influx of RMA-to-Google review conversion opportunities. Conversion jumped to 20.5%.

Phase 2 - Review Flow & Reminders

Even with accounts connected, consumer adoption remained low. Clients were skipping the Google Review step in our flow, partly due to review fatigue and partly because they didn't understand what they were supposed to do.

We made three changes designed to reduce friction and encourage Google review completion:

  • Review Flow Optimisation: We simplified the flow to 3 steps and optimised the Google step with clearer instructional language. The key innovation was a single CTA that copied the client's review text and opened Google Reviews directly, eliminating the need for consumers to manually retype or search (a minimal sketch of this interaction follows this list).

  • Email Reminders: Reviewers who skipped the Google step received a variant of our review completion email, encouraging them to share their review on Google.

  • SMS Reminders: A text message sent 10 minutes after a client completed an RMA review, perfectly timed to coincide with peak engagement, nudged them to post to Google.
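To make the copy-and-open CTA from the first item concrete, here is a minimal TypeScript sketch of that interaction. It assumes a review object holding the client's RMA review text and the agent's Google Place ID; the data shapes, function name, and use of Google's public write-a-review link are illustrative assumptions rather than RateMyAgent's actual implementation.

```typescript
// Hypothetical sketch of the "copy review & open Google" CTA.
// Assumes the agent's Google Place ID is known from their connected profile.
interface CompletedReview {
  text: string;         // the review the client just wrote on RMA
  agentPlaceId: string; // Google Place ID of the agent's business profile
}

async function copyAndOpenGoogleReview(review: CompletedReview): Promise<void> {
  // Copy the client's existing review text so they can paste rather than retype.
  await navigator.clipboard.writeText(review.text);

  // Google's public "write a review" deep link for a given Place ID.
  const url = `https://search.google.com/local/writereview?placeid=${encodeURIComponent(
    review.agentPlaceId
  )}`;

  // Open the Google review form in a new tab.
  window.open(url, "_blank", "noopener");
}
```

The design intent is simply to collapse the Google step into a paste-and-post action, removing the retyping and searching the flow previously required.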

These features shipped sequentially. After the review flow optimisations, conversion jumped to 33.2%, with 1 in 3 reviewers leaving a Google review. When we subsequently added email and SMS reminders, conversion climbed further to 48%, or almost 1 in 2 reviews converting.

Result: 48% conversion through staggered feature releases that each showed measurable impact.

Phase 3 - Historical Reviews

The final optimisation addressed scale. Agents could request Google reviews from past clients one-by-one, but this was repetitive, manual work. We introduced a batch process allowing agents to send targeted Google review invites to past clients in bulk.

By default, the process was limited to clients from the past 12 months; earlier testing had shown that this audience converted at a higher rate than clients with older reviews.
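As a rough illustration of that default, the sketch below filters past clients to those whose RMA review was completed within the last 12 months before queuing bulk invites. The data shapes, the 365-day cut-off arithmetic, and the sendGoogleReviewInvite helper are hypothetical, standing in for whichever delivery channel the batch process actually used.

```typescript
// Hypothetical sketch of the bulk-invite default: target only clients
// whose RMA review was completed within the last 12 months.
interface PastClient {
  email: string;
  reviewCompletedAt: Date; // when the client finished their RMA review
}

// Placeholder for the actual invite delivery (email/SMS) used by the batch process.
declare function sendGoogleReviewInvite(client: PastClient): Promise<void>;

const TWELVE_MONTHS_MS = 365 * 24 * 60 * 60 * 1000;

function eligibleForBulkInvite(
  clients: PastClient[],
  now: Date = new Date()
): PastClient[] {
  return clients.filter(
    (client) => now.getTime() - client.reviewCompletedAt.getTime() <= TWELVE_MONTHS_MS
  );
}

async function sendBulkInvites(clients: PastClient[]): Promise<void> {
  // Agents trigger this once; each eligible past client gets a targeted invite.
  for (const client of eligibleForBulkInvite(clients)) {
    await sendGoogleReviewInvite(client);
  }
}
```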

Result: Conversion peaked at 55% in August 2023 and eventually settled between 40% and 50% month-over-month, indicating sustainable adoption rather than a temporary spike.

Reflections

This project taught me that sustainable adoption comes from understanding why users resist a feature, not just optimising the feature itself.

A poor user experience created friction, but that wasn't the root problem. Agents didn't understand the value of connecting Google, and they didn't trust our system with their clients' data. This insight reframed how we approached the project from the beginning: we focused on communicating value and building trust before removing friction.

By August, we'd exceeded what we initially thought was possible. The business wanted to keep optimising through better copy and A/B testing, but we had hit a ceiling that couldn't be solved through behaviour change alone: full automation of RMA-to-Google conversion isn't possible within Google's API constraints. Recognising this technical limitation prevented us from investing in optimisations with diminishing returns.

The staggered release approach, shipping one improvement at a time over several months, gave us clear attribution for each feature's impact. This is a discipline I've carried into subsequent projects. One meaningful change, measured rigorously, beats multiple changes shipped together, because you actually understand what drove results.

© 2026 All rights reserved