Harnessing the Power of Feedback — Whiteboard Friday

In this week’s episode of Whiteboard Friday, Mozzer Meghan Pahinui takes you through the process we use to implement customer feedback, in the hopes that you can take it and apply it to your own content creation and maintenance strategies.

infographic outlining how Moz collects and implements customer feedback


Video Transcription

Hey, Moz fans. Welcome back to another edition of Whiteboard Friday. My name is Meghan, and I'm on the Learning team here at Moz. Today, I'm going to talk to you about harnessing the power of feedback when it comes to content iteration.

So one of the projects that I contribute to in my position here is taking care of our customer help center, which we call the Help Hub. If you're not familiar with the Help Hub, this is where we house all of our how-to guides, tips and tricks, workflows, and troubleshooting guides for the Moz tools. I do encourage you to check it out if you have some time later or if you have questions about the tools.

A key part of maintaining the Help Hub includes gathering, monitoring, and implementing customer feedback, and this is a crucial component for us. Why is that? Well, because we want to be sure that we're providing quality, helpful content to our customers. In addition, this process allows our customers to find answers to their questions quickly and easily at any time. It also takes some of the lift off our Help team by reducing the number of tickets they receive asking these very questions. So I'm going to go ahead and take you through the process that we use to implement customer feedback, in hopes that you can take it and apply it to your own content creation and maintenance strategies.

Gather data

So what is the first step? Well, first, we gather data, because we don't know what feedback people have if we don't ask for it, right? So if you are familiar with the Moz Help Hub, you may have seen our surveys that are at the bottom of all of our articles. Here, we ask if the article was helpful, and we do this with a series of emojis that indicate if it met their needs or not. If the customer indicates that it did not, they do have the option to enter a comment letting us know why.

When it comes to gathering data on your own content, you may opt to add a survey like this, or there are plenty of other ways that you can start to gather data to work with. So if you have a social media presence, you can start keeping track of feedback there or ask your followers directly for insight into what they find most helpful and least helpful about your content. Or you can send a survey out via email, ask your customer service team for feedback, or look at customer emails and tickets to see what questions customers are asking that you're not currently answering on your site. We do that as well. We work closely with our Help team to ensure that everything that we can answer is answered in the Help Hub.
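If you do roll your own lightweight survey, the shape of the data you collect matters more than the tooling. Here's a minimal sketch in Python of what a single feedback record might look like. The schema and the record_vote helper are hypothetical illustrations, not the actual Moz implementation.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import List, Optional

@dataclass
class FeedbackRecord:
    """One survey response from the bottom of an article (hypothetical schema)."""
    article_url: str
    helpful: bool                  # emoji vote mapped to helpful / not helpful
    comment: Optional[str] = None  # optional comment, collected when the vote is "not helpful"
    submitted_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

def record_vote(store: List[FeedbackRecord], article_url: str,
                helpful: bool, comment: Optional[str] = None) -> None:
    """Append a vote (and optional comment) to whatever store you use."""
    store.append(FeedbackRecord(article_url, helpful, comment))

# Example usage: two votes on the same guide, one with a comment.
responses: List[FeedbackRecord] = []
record_vote(responses, "/help/keyword-explorer", helpful=True)
record_vote(responses, "/help/keyword-explorer", helpful=False,
            comment="Couldn't find how to export my list.")
```

However you store it, keeping the vote, the optional comment, and a timestamp together is what makes the later analysis and tracking steps possible.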

Analyze

So once you have that data to work with, it's time to analyze it. So we review both vote counts, meaning helpful versus not helpful, as well as comments here at Moz. When comments are left, we look to see if there's an opportunity to implement a change in that guide based on the feedback. Additionally, if we're seeing a trend in votes for a particular page or section, we will take a look at how we can improve, reword, or update the content to better serve the customer.

One thing to keep in mind during this step, however, is that not all feedback you receive will be actionable, and that's okay. A few questions that I like to ask myself when looking at feedback for an article are: Was the customer on the right page to find their answer? If not, how did they get to this page? Is there an opportunity to help them find their way to the correct page, whether that's through links or additional resources, etc.? Is there a question that I can answer on this page, or should this question have its own dedicated page? Sometimes we end up writing whole new guides based on feedback that we get from customers. What was the customer trying to achieve? How did this guide fall short in helping them achieve that goal?
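To make the vote-trend spotting described above concrete, here's a rough sketch of how you might aggregate helpful versus not-helpful votes per page and flag guides that cross a threshold for review. The data shape, minimum sample size, and threshold are all assumptions for illustration, not what we actually use at Moz.

```python
from collections import defaultdict
from typing import Dict, List, Tuple

# (article_url, helpful) pairs, e.g. exported from your survey tool.
votes: List[Tuple[str, bool]] = [
    ("/help/keyword-explorer", True),
    ("/help/keyword-explorer", False),
    ("/help/link-explorer", False),
    ("/help/link-explorer", False),
    ("/help/link-explorer", False),
]

def flag_for_review(votes: List[Tuple[str, bool]],
                    min_votes: int = 3,
                    max_not_helpful_rate: float = 0.5) -> Dict[str, float]:
    """Return pages whose not-helpful rate exceeds the threshold."""
    totals: Dict[str, List[int]] = defaultdict(lambda: [0, 0])  # [helpful, not helpful]
    for url, helpful in votes:
        totals[url][0 if helpful else 1] += 1

    flagged = {}
    for url, (helpful_count, not_helpful_count) in totals.items():
        total = helpful_count + not_helpful_count
        rate = not_helpful_count / total
        if total >= min_votes and rate > max_not_helpful_rate:
            flagged[url] = rate
    return flagged

print(flag_for_review(votes))  # {'/help/link-explorer': 1.0}
```

A minimum vote count keeps one or two stray votes from sending you off to rewrite a page that most readers actually find helpful; the comments are where you decide what, if anything, to change.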

Implement change

So now that we've identified areas for improvement, it's time to implement changes to that content. This may be as simple as adding an FAQ to answer a specific question or as involved as writing a new workflow or troubleshooting guide, as I mentioned previously. Just as an example, some specific changes that we've implemented based on customer feedback include adding quick links to all of our pages for easier navigation, creating separate pages for each of our keyword metrics, and building out multiple workflows based on questions that customers have asked.

This step in the process may look different depending on the type of content that you create and the type of feedback that you receive. For example, if you primarily work in creating video content and you receive feedback that customers wish the videos had subtitles, you may opt to implement those on past videos as well as any that you release moving forward. However, if you have a blog or a newsletter or some other type of long-form content, it may not make sense to use resources to update older pieces of content. Instead, you may opt to start implementing those changes in your content moving forward. Or it may be a combination of the two: maybe you have some really popular older articles that are worth updating, and you also start implementing those changes in your content moving forward.

Track results

So after implementing your changes, you want to be sure to track your results. We track our votes and survey responses regularly to help monitor for update opportunities and to see if the responses have changed for that particular piece of content.
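One simple way to check whether a change helped is to compare the helpful-vote rate on a page before and after the date you shipped the update. This is a hypothetical sketch, not our actual reporting; the data shape and helper names are assumptions for illustration.

```python
from datetime import date
from typing import List, Tuple

# (vote_date, helpful) pairs for a single article.
article_votes: List[Tuple[date, bool]] = [
    (date(2021, 5, 2), False),
    (date(2021, 5, 20), False),
    (date(2021, 6, 10), True),
    (date(2021, 6, 28), True),
    (date(2021, 7, 3), True),
]

def helpful_rate(votes: List[Tuple[date, bool]]) -> float:
    """Share of votes that were 'helpful'."""
    return sum(helpful for _, helpful in votes) / len(votes) if votes else 0.0

def before_after(votes: List[Tuple[date, bool]],
                 updated_on: date) -> Tuple[float, float]:
    """Compare helpful-vote rates before and after a content update."""
    before = [v for v in votes if v[0] < updated_on]
    after = [v for v in votes if v[0] >= updated_on]
    return helpful_rate(before), helpful_rate(after)

print(before_after(article_votes, updated_on=date(2021, 6, 1)))  # (0.0, 1.0)
```

Keep in mind that vote volume on any single help article can be small, so look for sustained movement over time rather than reacting to a handful of responses.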

Finally, we start the cycle all over again, gathering more data, analyzing it, implementing changes, and then tracking the results.

Implementing this process here at Moz has allowed us to see a correlation between changes that we've made to the Help Hub content and the number of helpful votes that we receive. We treat this part of our content library as a living document that is always evolving to not only account for tool changes but also to take into account customer feedback. Gathering feedback on your content can help to identify trends in what your customers are engaging with and how you can further improve your offerings moving forward. That's key. You want to always be improving.

It can also help to identify resources that may need updating or ideas for future content. For example, if you have a blog post about how to bake a cake and your readers are commenting that they don't know how to pick the right kind of pan for the recipe, there may be an opportunity to publish a new blog post about the best types of cake pans. Or if you publish help guides, like I do, and a customer says they couldn't find the answer to their question in an article, there may be an opportunity to look into questions like the ones we outlined earlier in the analyze step. What are they trying to achieve? How did they end up on this page? How can I help them reach their goal?

I hope that you found this helpful and that you're ready to get out there and start harnessing the power of feedback. Thank you so much, Moz fans. We'll see you next time.

Video transcription by Speechpad.com


by Meghan Pahinui via Moz