
Is DealerRater breaking Schema rules for leverage against dealers?

George (joined Apr 13, 2012)
When you Google your dealership name, what 3rd parties appear in the search results? On your Google My Business page, what "Reviews from the web" appear? For many dealers, DealerRater appears alongside their dealer listing. DealerRater uses schema markup, along with its large number of reviews, to rank on page 1 for many dealers and to appear as a 3rd-party review source on their GMB pages. However, the review average is not based on the stated number of reviews, which could be misleading to consumers.

In the attached screenshot, Win Chevrolet in Carson, California has a 2.0-star rating with DR, but this is based only on the 18 reviews from the last 2 years. To be accurate, I feel the average should be based on all 88 reviews, since that is the larger number they are using to score well with schema (image attached also).

The challenge is that a dealer works hard to build up a 4-5 star rating, but if they don't keep driving new DR reviews, their average can fall dramatically. I think dealers should be sheltered from a few recent low reviews by having their large lifetime review counts help them. Shouldn't review sites either calculate the average on the aggregate, or tell schema that the average is based on a lower number of reviews? What do you think?
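To make the mismatch concrete, here's a minimal Python sketch of the pattern being described. The review data is made up (only the 88/18/2.0 figures come from the screenshots), and the markup dict is just an illustration of a schema.org AggregateRating payload, not DealerRater's actual code:

```python
from datetime import datetime, timedelta

def windowed_average(reviews, now, months=24):
    """Average only the reviews inside the trailing window."""
    cutoff = now - timedelta(days=months * 30)
    recent = [stars for date, stars in reviews if date >= cutoff]
    return round(sum(recent) / len(recent), 1) if recent else None

def aggregate_rating_markup(reviews, now):
    """Build an AggregateRating payload the way the post describes it:
    ratingValue from the 24-month window, reviewCount from the
    lifetime total."""
    return {
        "@type": "AggregateRating",
        "ratingValue": windowed_average(reviews, now),
        "reviewCount": len(reviews),  # lifetime count, not window count
    }

now = datetime(2020, 1, 1)
old = [(now - timedelta(days=1000), 5)] * 70     # 70 older, higher reviews
recent = [(now - timedelta(days=100), 2)] * 18   # 18 recent, lower reviews
markup = aggregate_rating_markup(old + recent, now)
# ratingValue reflects only the 18 recent reviews (2.0), while
# reviewCount advertises all 88: the mismatch George describes.
```

The sketch shows why the two numbers can drift apart: the denominator behind `ratingValue` and the published `reviewCount` are computed over different sets of reviews.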

win_chevrolet_carson_california_-_Google_Search.jpg

Win_Chevrolet_-_Chevrolet__Used_Car_Dealer__Service_Center_-_Dealership_Ratings.jpg
 
I completely agree. It is so frustrating to see this happen again and again in my stores. I think Google needs to do something about this. It reminds me of the days before geolocation, when people would stuff meta data to game Google results. Google needs to keep cracking down now like they did then.
 
The methodology behind the 24-month rating was really a result of the high percentage of staff turnover within dealerships. We felt it was important for a consumer to view a relatively timely representation of the reputation and type of consumer experiences taking place at the dealer over a 2-year period. We all know dealers struggle to retain talent at the sales and management level, and because of that a dealer's culture can significantly improve or decline depending on leadership and staff.

Now, that being said, it wasn't lost on us that the 24-month rule ensured that dealers stay current on DealerRater... but that was not the motivation for the rule. When the 24-month rating was put in place, DealerRater was really the only game in town. We were not concerned with Google reviews, or other review sites for that matter. That nail-biting concern came a few years later ;)

I'm not sure if the motivation behind the 24-month rule still stands true today with Cars.com at the helm, but certainly staff turnover at the dealership, particularly sales staff, is still a huge issue, if not an even greater issue than 10-12 years ago. Personally, when I read reviews about any product, service, or business, I tend to only pay attention to the reviews posted within the last year. I consider reviews older than that almost irrelevant, but that is just me :)
 
I'm with Heather on this. Many businesses, including dealers, can go through personnel changes that drastically change how they operate. I often ignore reviews that are older than a year and really focus on the last 6-12 months, and I'm hesitant with a business that doesn't have ANY, or very few, reviews in the last 3-6 months.
 
George, appreciate the conversation starter here as it's important we're transparent.

I can assure you and others who read this thread that there are no nefarious or subversive intentions at DealerRater behind the markup - we're simply telling Google that we have review content for a particular dealership and then ensuring that Google pulls the average rating correctly. We've worked with them in the past on this markup and they are intentionally not prescriptive with how the rating must be calculated. All ratings sites calculate their average rating a bit differently, and we tell consumers up front on our dealer profile pages exactly how we calculate this rating.

As for the logic behind the 24-month rating, Heather nailed it in her post above... reviews have the shelf life of a carton of milk, and it's been our best practice from day 1, now nearly 20 years ago, to have the rating be primarily reflective of "recent" experiences. Dealers shouldn't be penalized today for poor service three years ago that has since been cleared up through process improvements and staff turnover. We do make those older reviews available on our site for consumers who choose to dig deep in their research; however, they don't factor into the current overall rating. Google takes a very similar approach with respect to ratings; they heavily weight recent reviews in their scoring and also factor review recency into their larger algos.

Having said all that, we're always looking for ways to improve how our site (and display of our pages in search) serves consumers in their dealer research process, and in a fair and balanced way, particularly when it comes to a rating calculation - so George I appreciate you sharing that important observation and the chance to contribute to this discussion.
 
I think George's point is that DR shouldn't be able to have the best of both worlds.

If DR wants to use the total aggregate review number to increase the chance they are listed on a dealer's GMB (which is then probably leveraged to sell their products to dealerships), they should use all of those reviews when calculating the score they display. I can see the perspective that using all reviews to get listed on a GMB, but then showing only a subset's average as the rating, can be viewed as misleading.

I hopped over to the DR page for Win Chevrolet and found it amusing how DR displays this data as "2.0 / 88 lifetime reviews", but when you hover over it, it clarifies that "a dealership's score is calculated by averaging scores received in the last 24 months"... so it isn't really 2/5 based on the 88 reviews.
 
@Josh S you nailed it, that is exactly my point. If I'm a dealer, and a 3rd-party review site has found its way onto my GMB page, I need to make sure they are legitimately there.

@oldershawj02 Jamie, I appreciate you sharing in the discussion. Using the full count of reviews to land on dealer pages, but only factoring in recent reviews, feels disingenuous. Here's an example: a dealer has a 4-star rating with DR, but doesn't have any reviews for 1.5 years. They then receive 1-2 negative reviews and are sitting at a 1.5-star rating (when they really are a 3.9 or so). What is the prescribed advice? Drive new customers to DR, of course, to leave more positive reviews, thus increasing the relevancy connection between DR and my dealership and starting the entire cycle over. Plus, if those new reviews need a response, the dealer must pay DR to respond to them. Your system forces dealers to keep you relevant on their page, and ultimately to subscribe.

Schema.org says to list the total number of reviews and the aggregate average. I don't feel you are doing that. You are listing the total number of reviews, and then a false average that forces the dealer to keep DR relevant to their name. Sorry, but that feels good for DR, not for the dealer, and not like playing fair. That's how I see it. I was shocked to hear that is how you all are operating; it feels too much like Yelp for my taste.
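George's scenario can be checked with quick arithmetic. The star values below are hypothetical, chosen to match the numbers in his example:

```python
# A dealer's 40 lifetime reviews average 4.0 stars, then the only two
# reviews in the last 24 months are 1 and 2 stars (all numbers are
# illustrative, matching the scenario described above).
older = [4.0] * 40       # stand-in for a 4.0-star history
recent = [1.0, 2.0]      # the two new negative reviews

window_avg = sum(recent) / len(recent)                    # 1.5
lifetime_avg = sum(older + recent) / len(older + recent)  # ~3.9

print(round(window_avg, 1))    # 1.5 -- what a 24-month rule displays
print(round(lifetime_avg, 1))  # 3.9 -- what the full history says
```

Two low reviews move the windowed rating by 2.5 stars but the lifetime rating by only about 0.1, which is the "sheltering" effect George is asking for.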
 
@georgenenni Yes, dealers should absolutely be actively soliciting fresh review content on all platforms given its many health benefits. Certainly we try to encourage this through the average rating calculation, similar to how Google and many other review sites calculate ratings. And your observation about lifetime vs. 24-month review count in the markup is important and useful feedback.

However, dealers do not need to pay us a dime to respond to reviews on our platform. We have thousands of non-paying dealers that have basic admin access on DealerRater to respond publicly to every review, positive or negative, that is left on our platform. Dealers can contact our support team at [email protected] to get set up for free access.
 
George- is Win Chevrolet a client of yours? Based on their reviews for the past year or so on all the major review sites (Google, Yelp, etc.), it seems like they could use some help with customer experience. I'm not sure why DealerRater was singled out here, but the fact is the 24-month limit allows dealers who have improved their processes to not be weighed down by old bad reviews. If Win is able to get good reviews, it absolutely should be encouraging a portion of them to be left on DR.

Great to see Jamie was able to clarify for you that dealers don't need to pay DealerRater anything to respond to reviews. Also- I'm curious how you feel about Google reviews, since a dealer is forced to drive reviews there or risk losing core visibility for his or her store. It's literally an existential necessity. And Google has no mediation capability to allow a dealer to try to fix a customer problem before the review is posted (a capability that is part of the DR subscription). DealerRater is demonstrably more pro-dealer than any other review site, and is the leader in showcasing the performance of individual salespeople dedicated to providing good experiences for their customers.

No review site does more for its dealers.

And for transparency- former DealerRater-er here. I'm in the auto industry but don't sell to dealers, so I have no skin in this. I haven't worked for DR in nearly three years and still love the company. Great people, great product. Every dealer should be leveraging DealerRater's passion and expertise, whether they choose to pay or not.

Brian Epro
 
Glad to hear dealers can reply to reviews without subscribing; I stand corrected. However, I still feel DealerRater is not being upfront with consumers and is penalizing dealers. DR should either restate the number of reviews used in the average, or follow schema and calculate a true aggregate average. To me, this math is intended to force dealers to keep DR relevant and on page 1.
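For illustration, the two fixes being proposed would look something like this in the markup, sketched here as Python dicts. The 2.0/18/88 figures come from the Win Chevrolet screenshots; the 3.6 lifetime average is a placeholder, since only the windowed average is published:

```python
# Option 1: keep the 24-month rating, but report the window's count.
windowed_markup = {
    "@type": "AggregateRating",
    "ratingValue": 2.0,   # 24-month average, as published
    "reviewCount": 18,    # the reviews actually behind that average
}

# Option 2: keep the lifetime count, but average all of those reviews.
# (3.6 is a placeholder; the true lifetime average isn't published.)
lifetime_markup = {
    "@type": "AggregateRating",
    "ratingValue": 3.6,
    "reviewCount": 88,
}

# Either way, ratingValue and reviewCount describe the SAME set of
# reviews, which is the consistency being asked for in this thread.
```

In both options the count and the average are drawn from the same review set, so a consumer (or Google) reading the markup sees an internally consistent rating.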