Strategies, research, industry trends — your pulse on the marketplace
The Bazaar Voice

Picking a trustworthy ratings and reviews provider: Moderation

October 6, 2020

By Brittany Shulman

An essential part of our business at Bazaarvoice is review moderation. In no way does this mean moderating out negative reviews. In fact, we encourage our clients to find value in negative reviews. With moderation, our goal is to ensure that the user-generated content that appears on our clients' websites is not only pertinent to consumers, but also safe for general consumption. Throughout the moderation process, we look for content that is inoffensive, posted on the correct product, and genuinely relevant, so that it helps consumers decide whether a product is a good fit for them.

In a world of fake news and misleading social media posts, it’s easy to look at online content and assume nefarious origins. It’s up to our moderation team to make sure that we’re transparent with consumers about what we’re doing, so that they can feel like they can trust the content that they see on brands’ and retailers’ product pages. At the same time, the reality is there are nefarious players out there, so it’s crucial to also build processes that weed out any fraudulent content or posters.

As part of our trust and privacy blog series, we interviewed Abi Schuman, the Senior Director of Content Management Services. She oversees our moderation team and helps to regularly innovate on our moderation processes and standards. Here’s what she had to say: 

Can you walk us through Bazaarvoice’s moderation processes?

“At Bazaarvoice, content moderation can be done by a machine, a human, or a combination of both. Even our machines are constantly learning and being reviewed by humans to make them better. Our process is ever-evolving and has pretty high quality standards. A statistically significant portion of every client’s content gets moderated a second time for quality purposes.

When a consumer is submitting content, like a product review, they’re typically going to see some terms and conditions that give them some guidance around what’s important to talk about and maybe some topics they shouldn’t talk about.

In the moderation process, there are a few things we look for: does it appear to be a real review? Is it on the right product? Is the reviewer staying on topic? A common reason a review may be rejected, for example, is if someone buys a product, but the entire review is about the person at the store who was rude to them. That's a perfectly valid complaint, but it doesn't help another consumer decide whether or not that specific product is useful to them.

It's important to note that we're sentiment neutral. The decision about what content does or doesn't appear has nothing to do with whether the content is positive or negative. So in that example I just used about customer service, even if the review said the person at the store was the nicest person they've ever interacted with, that review would still get rejected, because if the product is a blow dryer, that does not tell me anything about whether or not that blow dryer is going to be useful to me."

Our authenticity process is separate from moderation, and you can learn more about that here.

How does Bazaarvoice handle moderation and authenticity processes?

“The processes happen in tandem. When content comes in, it goes into a stream to be moderated, because we want to be really quick about getting it to display. Authenticity can sometimes take a little bit longer, and it’s cumulative.

Moderation and authenticity are two technically separate processes, but they’re typically happening in the same timeframe.”

What is the split of computer moderation versus human moderation at Bazaarvoice?

"About 70% of our content gets auto-moderated by machines, up from about 20% two years ago. The remaining 30% is moderated by humans. That shift is because the science is getting better. We have a team that's been working very, very hard to get there. Like I said, even when humans are moderating, we're approving nearly everything. Most reviews really are, "This lamp is great. It said that it was great for desks and movable, and it fulfilled all those purposes."

The machine looks for key terminology and key themes: customer service or shipping complaints, liability issues, defamation, or offensive terms. The great thing about having machine learning up front is that it's faster than humans. If you're somebody trying to make a decision, we're getting reviews out there quickly by making that quick assessment with the machines.

The type of content that will always go to a human for moderation is anything in a highly regulated industry, such as pharmaceuticals, or anything that requires Adverse Events reporting. All of that goes to a human by default, along with a statistically significant portion of each individual client's data set. We also moderate in close to 40 languages, so we work to cover those with humans and unique models as well."

How does Bazaarvoice train moderators to make sure they meet our standards?

"That was actually my first job at Bazaarvoice 12 years ago: building the training curriculum for new moderators. We've come a long way. Moderators go through a series of web-based trainings and take four initial exams.

When a moderator starts moderating, 100% of their content is reviewed by a second, tenured moderator. When the second moderator reviews it, they leave notes with feedback for the first moderator. The first moderator, the new one, can see all of those notes, which allows them to adapt how they apply the training logic.

Over time, we gradually reduce the quantity of work being reviewed by a second moderator as we see new moderators build up their skills. There’s a combination of upfront training, and there are also weekly individual coaching sessions based on specific work and what we’re seeing that specific moderator having a harder time with.”

What makes Bazaarvoice’s moderation process stand out among ratings and review providers?

“First, I think languages are a big differentiator for us – we moderate in a lot of languages. This allows a lot of our global brands to offer consistent experiences across their sites.

Second is experience. In a fairly new industry, no one has been doing this longer than we have. I was able to sit on the group that wrote the international standard on online review moderation and authenticity with the ISO. I've been at Bazaarvoice for 12 years, and Liz Jury, our manager of client services operations, has been here for 13 years. That level of expertise helps us define best practices for our clients and their industries, and react quickly to the day-to-day issues that can surface in reviews, for example when a product or brand is in the media.

Third is our moderators. A lot of them are folks who, quite honestly, have had our jobs before; they've just gotten to a point in their lives where they need more flexibility in their careers. Some of them are retired or raising families, and this work gives them a way to balance their lives, which allows us to get a really educated, dedicated workforce.

Finally, we’re able to build thorough learning algorithms because of the quantity of data we have access to from our clients.”

Where do you see Bazaarvoice’s moderation process and policies heading in the future?

“I think that the big future for us is going to be around visual [content]. While we definitely moderate visual content today, we don’t do a whole lot of syndicating for visual content. I think with our acquisition of Curalate, you’re going to see a lot more brands sharing their visual content with retailers.

I think the big future thing for Bazaarvoice will really be around duplicating and building these processes for visual content.

Additionally, I think there will be an interesting path around moderation of social content. We do a little bit of that today, but with the acquisition of Influenster, oftentimes when people are sampling products, they're not only leaving reviews but also social content. In all of these circumstances, we evaluate content as is. It's important to us that in the future, as with all types of content, we moderate but never edit content, to make sure we're putting out the most on-topic, helpful, appropriate content to help drive consumers' purchasing decisions."

––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––––

Want to learn more about how Bazaarvoice helps our clients enhance their user-generated content programs? Connect with us here.

Brittany Shulman

Senior Content Strategist

Read more from Brittany

