In late 2025, a well-established Sydney restaurant contacted us in crisis. Over the span of 10 days, their Google Business Profile had been flooded with 47 one-star reviews. Their rating had plummeted from 4.6 stars to 3.6 stars. Bookings had dropped by an estimated 35%. The owner was devastated and on the verge of responding publicly to each review — which, as we'll explain, would have been the worst possible move. Here's the complete story of how we detected the attack, built the evidence, disputed all 47 reviews, and saw 41 of them taken down by Google within 8 days.
The Client: Background
Our client — we'll call them "The Harbour Kitchen" for this case study — is a family-owned restaurant in Sydney's Inner West. They'd been operating for 12 years with a loyal customer base, consistent 4.5—4.7 star rating on Google, and approximately 380 genuine reviews. The restaurant specialised in modern Australian cuisine with a focus on locally sourced ingredients. They had a strong reputation in the local dining scene and regularly appeared in "best of" lists for the Inner West.
The owner contacted us on a Tuesday morning, voice shaking, explaining that their rating had dropped nearly a full star in under two weeks. They'd noticed the first cluster of negative reviews over the weekend and initially assumed they'd had a bad service night they weren't aware of. But as the reviews kept coming — five, ten, twenty — it became clear this wasn't normal customer feedback.
The Problem: Anatomy of the Attack
The Trigger
Three weeks before the attack began, a competing restaurant had opened directly across the street. The new competitor was a well-funded establishment that had invested heavily in fitout and marketing. Within the first week of their opening, The Harbour Kitchen noticed a slight dip in foot traffic — normal competitive dynamics. But then the reviews started.
The Attack Pattern
Over 10 days, 47 one-star reviews appeared on The Harbour Kitchen's Google Business Profile. The reviews arrived in waves:
- Days 1—3: 8 reviews — the "testing" phase, with moderately detailed complaints
- Days 4—6: 19 reviews — the heaviest wave, with shorter, more generic complaints
- Days 7—10: 20 reviews — a second heavy wave, some with slightly more effort put into the complaint details
The rating impact was devastating. With 380 existing reviews at an average of 4.6 stars, the influx of 47 one-star reviews dragged the overall rating down to 3.6 stars. Based on the Harvard Business School research on the cost of negative reviews, this one-star drop likely represented $40,000—$72,000 in annualised revenue loss for a restaurant of their size — not counting the immediate booking cancellations and walk-in losses.
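As a back-of-envelope check, the naive arithmetic mean after the attack would sit higher than the score Google actually displayed. Google does not publish its rating formula, and the displayed score is not a simple mean (it appears to weight recent activity, which is an assumption on our part); this sketch only shows the raw arithmetic:

```python
# Naive post-attack mean (illustrative only). Google's displayed rating
# is not a simple average and appears to weight recent reviews, which is
# why the observed drop (4.6 -> 3.6) was sharper than this figure.
existing_reviews = 380
existing_avg = 4.6
attack_reviews = 47
attack_avg = 1.0

naive_avg = (existing_reviews * existing_avg + attack_reviews * attack_avg) / (
    existing_reviews + attack_reviews
)
print(f"Naive post-attack mean: {naive_avg:.2f}")  # 4.20
```

The gap between the naive 4.2 and the displayed 3.6 is itself a useful signal: a burst of recent one-star reviews hits the visible rating harder than the raw average suggests.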
Detection: How We Identified the Fake Reviews
When The Harbour Kitchen engaged us, we immediately launched our detection protocol. Within the first four hours, we had conclusive evidence that the vast majority of the 47 reviews were fraudulent. Here's what our analysis uncovered:
Finding 1: Reviewer Profile Analysis
We examined every reviewer profile in detail. Of the 47 suspicious reviews, 39 came from accounts that were created within 30 days of the review being posted. 34 accounts had only one review — the one on The Harbour Kitchen. 28 accounts had no profile photo. 6 accounts had reviewed a cluster of the same 3—4 other businesses (a pattern consistent with review rings that review multiple businesses to appear more legitimate before targeting the actual victim).
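The profile signals above lend themselves to simple scoring heuristics. Here is a minimal sketch of the idea; the field names, thresholds, and scoring are our own illustration, not a Google API (Google exposes no programmatic access to reviewer profiles, so in practice this data is gathered manually):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class ReviewerProfile:
    account_created: date   # gathered manually from the public profile
    review_posted: date
    total_reviews: int
    has_photo: bool

def suspicion_score(p: ReviewerProfile) -> int:
    """Count red flags matching the patterns found in this case."""
    flags = 0
    if (p.review_posted - p.account_created).days <= 30:
        flags += 1  # account created shortly before the review
    if p.total_reviews <= 1:
        flags += 1  # single-review account
    if not p.has_photo:
        flags += 1  # no profile photo
    return flags

# Example: the most common attacker profile in this case
attacker = ReviewerProfile(date(2025, 9, 1), date(2025, 9, 10), 1, False)
print(suspicion_score(attacker))  # 3
```

No single flag proves fraud; it is the concentration of flags across dozens of accounts that makes the pattern dispositive.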
Finding 2: Timing Analysis
We mapped the exact posting times of all 47 reviews. A striking pattern emerged: 31 of the 47 reviews were posted between 2:00 AM and 5:00 AM AEST. Legitimate customer reviews for restaurants overwhelmingly cluster around mealtimes and evening hours, so the 2 AM posting pattern was a clear anomaly. Furthermore, 14 reviews were posted within the same 90-minute window on a single night, a pattern vanishingly unlikely for organic reviews.
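Both timing anomalies, the overnight cluster and the 90-minute burst, are straightforward to detect once the timestamps are collected. A sketch of the two checks (the timestamps below are illustrative, not the actual case data):

```python
from datetime import datetime, timedelta

def overnight_count(timestamps, start_hour=2, end_hour=5):
    """Reviews posted in the 2:00-5:00 AM window (local time)."""
    return sum(1 for t in timestamps if start_hour <= t.hour < end_hour)

def largest_burst(timestamps, window=timedelta(minutes=90)):
    """Max number of reviews falling inside any sliding time window."""
    ts = sorted(timestamps)
    best, start = 0, 0
    for end in range(len(ts)):
        while ts[end] - ts[start] > window:
            start += 1
        best = max(best, end - start + 1)
    return best

# Illustrative data: five reviews within ~80 minutes on one night
ts = [datetime(2025, 9, 12, 2, 10) + timedelta(minutes=20 * i) for i in range(5)]
print(overnight_count(ts), largest_burst(ts))  # 5 5
```

The burst check is the stronger signal: genuine diners simply do not post in tight overnight clusters.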
Finding 3: No Customer Records
We worked with The Harbour Kitchen to cross-reference every reviewer name against their reservation system, POS records, and email marketing list. Zero matches. Not a single one of the 47 reviewers appeared in any of the restaurant's customer records. For a restaurant that had been operating for 12 years and had a comprehensive booking system, the probability of 47 consecutive genuine customers leaving no trace in their records was effectively zero.
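Mechanically, the cross-reference reduces to a normalised set intersection across the business's data sources. A minimal sketch, with made-up names and simple whitespace/case normalisation (real matching also has to handle nicknames and partial names, which we do by hand):

```python
def normalise(name: str) -> str:
    """Lowercase and collapse whitespace for basic name matching."""
    return " ".join(name.lower().split())

def cross_reference(reviewers, *customer_sources):
    """Return reviewer names found in any customer record source."""
    known = {normalise(n) for source in customer_sources for n in source}
    return [r for r in reviewers if normalise(r) in known]

# Illustrative data only
reservations = ["Alice Wong", "Ben Carter"]
pos_records = ["Chloe Singh"]
email_list = ["Ben  Carter", "Dana Lee"]

reviewers = ["Fake Name", "Another Account"]
print(cross_reference(reviewers, reservations, pos_records, email_list))  # []
```

An empty result across 47 reviewers and three independent record systems is the "zero matches" finding described above.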
Finding 4: Content Analysis
We performed detailed content analysis on the review text. Multiple reviews used nearly identical phrasing: variations of "worst experience ever," "wouldn't come back if you paid me," and "food was disgusting" appeared across reviews from supposedly different, unrelated customers. Seven reviews mentioned a "rude blonde hostess" — The Harbour Kitchen doesn't have a blonde hostess and hasn't for the entire 12-year history of the restaurant. Three reviews complained about "the parking lot" — the restaurant doesn't have a parking lot; it's a street-front establishment. Two reviews referenced menu items that The Harbour Kitchen has never served.
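Near-identical phrasing across supposedly unrelated reviewers can be surfaced with n-gram overlap (Jaccard similarity). A sketch of the technique; the review texts below are paraphrased illustrations, and the similarity threshold you act on is a judgment call:

```python
def ngrams(text: str, n: int = 2) -> set:
    """Set of word n-grams (bigrams by default) in a text."""
    words = text.lower().split()
    return {tuple(words[i:i + n]) for i in range(len(words) - n + 1)}

def jaccard(a: str, b: str) -> float:
    """Overlap of two texts' n-gram sets: 0.0 (disjoint) to 1.0 (identical)."""
    ga, gb = ngrams(a), ngrams(b)
    if not ga or not gb:
        return 0.0
    return len(ga & gb) / len(ga | gb)

r1 = "worst experience ever the food was disgusting"
r2 = "the worst experience ever and the food was disgusting"
print(f"{jaccard(r1, r2):.2f}")  # 0.56, strong overlap suggesting a shared template
```

Pairwise scores this high between "independent" reviewers are what turn a pile of complaints into evidence of coordination.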
Finding 5: Geographic Inconsistencies
Among the 13 reviewer accounts that had additional review activity, we found geographic patterns that made no sense for genuine Sydney diners. Several accounts had reviewed businesses in Melbourne, Perth, and Brisbane within the same 7-day window as their Sydney review — a travel pattern that would be unusual for any individual, let alone 13 of them. Two accounts had reviewed businesses in Southeast Asian cities within days of reviewing a Sydney restaurant.
Strategy: Building the Dispute Case
With our detection analysis complete, we moved to the dispute strategy phase. Our approach was methodical and designed to give Google's moderation team everything they needed to act quickly.
Evidence Package Construction
For each of the 47 reviews, we created an individual evidence file containing a timestamped screenshot of the review, a screenshot of the reviewer's profile and review history, the specific Google content policy violation the review transgressed, the evidence supporting that violation (customer record search results, timing data, content analysis), and a one-paragraph summary of why this review should be assessed for policy compliance.
We also created a master summary document that mapped the entire attack pattern — the timeline, the reviewer profile overlaps, the content similarities, and the competitive context. This document was critical for the escalation phase because it demonstrated the coordinated nature of the attack in a way that individual review disputes couldn't.
Policy Violation Classification
We classified each review under the most appropriate Google content policy violation. For this case, the primary violations were:
- Fake engagement (42 reviews): Reviews from non-customers with no evidence of any interaction with the business
- Conflict of interest (3 reviews): Reviews where the reviewer's profile activity suggested connection to the competing business
- Misrepresentation (2 reviews): Reviews containing provably false factual claims (the blonde hostess, the parking lot, menu items that don't exist)
Batch Dispute Submission
Rather than flagging each review individually through the standard one-click flag on Google Maps, we used a dual-track approach. We filed individual disputes for each of the 47 reviews through the Google Business Profile dashboard, selecting the most appropriate violation category for each. Simultaneously, we opened an escalation case through GBP support, submitted our master evidence package, and framed the situation as a coordinated abuse case requiring priority assessment. Following the process we detail in our guide to reporting reviews for policy violations, each dispute was concise, policy-specific, and evidence-backed.
Results: The 8-Day Resolution
Day 1: Disputes Filed
All 47 individual disputes submitted through GBP dashboard. Escalation case opened with Google Business Profile support via phone. Master evidence package submitted via the Google review management tool. Support agent confirmed the case was being flagged as a coordinated abuse investigation.
Day 2—3: Initial Action
Google's automated systems processed the initial flags. 12 reviews — those from the most obviously fake profiles (single-review accounts, no profile photos, created within days of the review) — were taken down within 48 hours. The restaurant's rating moved from 3.6 to 3.9 stars.
Day 4—5: Human Review
The escalation to the content moderation team triggered a more thorough human review. An additional 18 reviews were taken down during this period. These were reviews where the evidence of fabrication required more analysis — the timing patterns, geographic inconsistencies, and content similarities we'd documented. The rating climbed to 4.2 stars.
Day 6—7: Second Escalation
For the remaining 17 reviews still live, we filed a follow-up escalation providing supplementary evidence — specifically, the customer record search results showing zero matches for any of the 17 remaining reviewers. We also highlighted the three reviews with conflict of interest evidence linked to the competitor.
Day 8: Final Resolution
Google actioned 11 more reviews, bringing the total to 41 of 47 disputed reviews taken down. The restaurant's rating was restored to 4.5 stars. The remaining 6 reviews, while we believed them to be fraudulent, were not actioned — Google's moderation determined they didn't meet the threshold for policy violation based on the available evidence. We advised the client on professional responses for these 6 remaining reviews.
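Tallying the three removal waves confirms the totals, and is where the 87% success rate cited in the lessons below comes from:

```python
# Removals by wave, as documented in the day-by-day timeline above
waves = {
    "days 2-3 (automated flags)": 12,
    "days 4-5 (human review)": 18,
    "day 8 (second escalation)": 11,
}
removed = sum(waves.values())
disputed = 47
print(removed, f"{removed / disputed:.0%}")  # 41 87%
```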
The Outcome: By the Numbers
The financial impact of the recovery was substantial. In 8 days, 41 of 47 disputed reviews were removed (an 87% success rate), the rating was restored from 3.6 to 4.5 stars, and the booking decline that had reached an estimated 35% during the attack began to reverse. The restaurant estimated that the 10-day attack period itself cost them significant revenue in cancelled bookings and lost walk-ins before the recovery took hold.
Lessons Learned
Every case teaches us something. Here are the key takeaways from The Harbour Kitchen case that apply to any business facing a similar attack:
1. Speed Matters More Than Anything
The Harbour Kitchen contacted us within 10 days of the attack beginning. That was fast enough. Businesses that wait 30, 60, or 90 days before taking action face significantly harder dispute paths — evidence degrades, reviewer accounts build up activity that makes them look more legitimate, and the compound damage to ratings and search visibility deepens. If you suspect a coordinated attack, act within the first 48 hours if possible.
2. Don't Respond Publicly to Suspected Fake Reviews
The owner's initial instinct was to reply to each fake review, publicly calling them out as fraudulent. We advised strongly against this. Public responses to fake reviews alert the attacker that you're aware and investigating — which can prompt them to edit reviews to remove detectable patterns, delete accounts and create new ones to continue the attack, or escalate the volume of fake reviews. Instead, we crafted a single, professional response that the owner posted on two of the most visible fake reviews: a brief acknowledgement that they didn't recognise the reviewer and an invitation to contact the restaurant directly. This signalled professionalism to genuine readers without tipping off the attacker.
3. Evidence Quality Determines Outcomes
The 87% success rate in this case wasn't luck — it was evidence. The detailed reviewer profile analysis, timing data, customer record cross-referencing, and content analysis gave Google's moderation team clear, actionable information. Disputes that simply say "this is a fake review" fail far more often than disputes that say "this review violates Google's fake engagement policy — here's the evidence."
4. The Dual-Track Approach Works
Filing individual disputes AND opening an escalation case is more effective than either approach alone. Individual disputes ensure each review is assessed on its own merits. The escalation case provides the coordinated attack context that individual disputes can't convey. Together, they give Google both the granular evidence and the big picture.
5. Some Reviews Will Survive — and That's OK
We disputed 47 reviews and 41 were taken down. The 6 that survived weren't necessarily legitimate — but Google's moderation team didn't find sufficient evidence to action them under their policies. An 87% success rate is an excellent outcome. The remaining 6 reviews, now surrounded by hundreds of genuine positive reviews at a 4.5-star average, have minimal impact on the restaurant's overall rating and perception.
6. Prevention Is Part of the Solution
After the crisis was resolved, we implemented ongoing monitoring for The Harbour Kitchen through our reputation management service. Real-time review alerts mean that any future attack would be detected within hours, not days. We also helped them reinvigorate their review generation strategy, building their review count from 380 to over 450 in the months following the incident — making their profile even more resilient to future manipulation.
Is Your Business Under Review Attack?
The Harbour Kitchen's story is one of dozens of competitor attack cases we've handled. Every case is different, but the fundamentals are the same: rapid detection, thorough evidence, strategic disputes, and effective escalation. Our free review audit is the first step — we'll analyse your profile, identify any reviews that may violate Google's policies, and give you a clear action plan.
Get Your Free Review Audit

Compliance Note: Review dispute outcomes vary based on the evidence available and Google's assessment of policy compliance. Past results, including those described in this case study, do not guarantee future outcomes. Every case is assessed individually by Google's content moderation team. Review Dispute Pro facilitates the dispute process but does not control Google's moderation decisions.