WebProNews

Google: You Don’t Have To Dumb Your Content Down ‘That Much’

Google’s Matt Cutts answers an interesting question in a new “Webmaster Help” video: “Should I write content that is easier to read or more scientific? Will I rank better if I write for 6th graders?”
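The “6th grader” framing maps onto readability metrics like the Flesch-Kincaid grade level, which estimates the U.S. school grade needed to understand a passage from sentence length and syllable counts. As a rough illustration only (the formula is public, but the syllable heuristic and the sample sentences below are mine, and nothing here is something Google has said it uses for ranking):

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count runs of consecutive vowels,
    # treating a trailing silent 'e' as non-syllabic.
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1 and not word.endswith(("le", "ee")):
        count -= 1
    return max(count, 1)

def flesch_kincaid_grade(text: str) -> float:
    # Flesch-Kincaid grade level:
    #   0.39 * (words/sentences) + 11.8 * (syllables/words) - 15.59
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    n_sent = max(len(sentences), 1)
    n_words = max(len(words), 1)
    return 0.39 * (n_words / n_sent) + 11.8 * (syllables / n_words) - 15.59

# Hypothetical sample sentences for comparison:
plain = "Take ibuprofen to ease the pain from a pinched nerve."
jargon = "Nonsteroidal anti-inflammatory drugs attenuate radiculopathic nociception."
```

Running both samples through the function, the jargon-laden sentence scores a far higher grade level than the plain one, which is the gap the whole debate below is about.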

Do you think Google should give higher rankings to content that is as well-researched as possible, or content that is easier for the layman to understand? Share your thoughts in the comments.

This is a great question as we begin year three of the post-Panda Google.

“This is a really interesting question,” says Cutts. “I spent a lot more time thinking about it than I did a lot of other questions today. I really feel like the clarity of what you write matters a lot.”

He says, “I don’t know if you guys have ever had this happen, but you land on Wikipedia, and you’re searching for information – background information – about a topic, and it’s way too technical. It uses all the scientific terms or it’s talking about a muscle or whatever, and it’s really hyper-scientific, but it’s not all that understandable, and so you see this sort of revival of people who are interested in things like ‘Explain it to me like I’m a five-year-old,’ right? And you don’t have to dumb it down that much, but if you are erring on the side of clarity, and on the side of something that’s going to be understandable, you’ll be in much better shape because regular people can get it, and then if you want to, feel free to include the scientific terms or the industry jargon, the lingo…whatever it is, but if somebody lands on your page, and it’s just an opaque wall of scientific stuff, you need to find some way to pull people in to get them interested, to get them enticed in trying to pick up whatever concept it is you want to explain.”

Okay, it doesn’t sound so bad the way Cutts describes it, and perhaps I’m coming off a little sensational here, but it’s interesting that Cutts used the phrase, “You don’t have to dumb it down that much.”

This is a topic we discussed last fall, when Googler Ryan Moulton said in a conversation on Hacker News, “There’s a balance between popularity and quality that we try to be very careful with. Ranking isn’t entirely one or the other. It doesn’t help to give people a better page if they aren’t going to click on it anyways.”

He then elaborated:

Suppose you search for something like [pinched nerve ibuprofen]. The top two results currently are http://www.mayoclinic.com/health/pinched-nerve/DS00879/DSECT… and http://answers.yahoo.com/question/index?qid=20071010035254AA…

Almost anyone would agree that the mayoclinic result is higher quality. It’s written by professional physicians at a world renowned institution. However, getting the answer to your question requires reading a lot of text. You have to be comfortable with words like “Nonsteroidal anti-inflammatory drugs,” which a lot of people aren’t. Half of people aren’t literate enough to read their prescription drug labels: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC1831578/

The answer on yahoo answers is provided by “auntcookie84.” I have no idea who she is, whether she’s qualified to provide this information, or whether the information is correct. However, I have no trouble whatsoever reading what she wrote, regardless of how literate I am.

That’s the balance we have to strike. You could imagine that the most accurate and up to date information would be in the midst of a recent academic paper, but ranking that at 1 wouldn’t actually help many people.

This makes for a pretty interesting debate. Should Google bury the most well-researched and accurate information just to help people find something that’s easier to read, even if it’s not as high quality? Doesn’t this kind of go against the guidelines Google set forth after the Panda update?

You know, like these specific questions Google suggested you ask about your content:

  • “Would you trust the information presented in this article?” (What’s more trustworthy, the scientific explanation from a reputable site or auntcookie’s take on Yahoo Answers?)
  • “Is this article written by an expert or enthusiast who knows the topic well, or is it more shallow in nature?” (Uh…)
  • “Does the article provide original content or information, original reporting, original research, or original analysis?” (Original research and analysis, to me, suggests that someone is going to know and use the lingo.)
  • “Does the page provide substantial value when compared to other pages in search results?” (Couldn’t value include educating me about the terminology I might not otherwise understand?)
  • “Is the site a recognized authority on its topic?” (You mean the type of authority that would use the terminology associated with the topic?)
  • “For a health related query, would you trust information from this site?” (Again, are you really trusting auntcookie on Yahoo Answers over Mayo Clinic?)
  • “Does this article provide a complete or comprehensive description of the topic?” (Hmm. Complete and comprehensive. You mean as opposed to dumbed down for the layman?)
  • “Does this article contain insightful analysis or interesting information that is beyond obvious?” (I’m not making this up. Here’s Google’s blog post listing these right here.)
  • “Are the pages produced with great care and attention to detail vs. less attention to detail?” (You get the idea.)
Maybe I’m missing something, but it seems like Google has been encouraging people to make their content as thorough, detailed, and authoritative as possible. I don’t see “Is your content dumbed down for clarity’s sake?” on the list. Of course, that was nearly three years ago.

If quality is really the goal (as Google has said over and over again in the past), doesn’t the responsibility of additional research and additional clicking of links rest with the searcher? If I don’t understand what the most accurate and relevant result is saying, isn’t it my responsibility to continue to educate myself, perhaps by looking at other sources of information and looking up the things I don’t understand?

But that would go against Google’s effort to get users answers as quickly as possible. That must be why Google is increasingly trying to give you the answers itself rather than sending you to third-party sites. Too bad those answers aren’t always reliable.

Cutts continues in the video, “So I would argue first and foremost, you need to explain it well, and then if you can manage to do that while talking about the science or being scientific, that’s great, but the clarity of what you do, and how you explain it often matters almost as much as what you’re actually saying because if you’re saying something important, but you can’t get it across, then sometimes you never got it across in the first place, and it ends up falling on deaf ears.”

Okay, sure, but isn’t this just going to encourage users to dumb down content at the risk of educating users less? I don’t think that’s what Cutts is trying to say here, but people are going to do anything they can to get their sites ranked better. At least he suggests trying to use both layman’s terms and the more scientific stuff.

“It varies,” he says. “If you’re talking only to industry professionals – terminators who are talking about the scientific names of bugs, and your audience is only bugs – terminator – you know, exterminator experts, sure, then that might make sense, but in general, I would try to make things as natural sounding as possible – even to the degree that when I’m writing a blog post, I’ll sometimes read it out loud to try to catch what the snags are where things are gonna be unclear. Anything you do like that, you’ll end up with more polished writing, and that’s more likely to stand the test of time than something that’s just a few, scientific mumbo jumbo stuff that you just spit out really quickly.”

I’m not sure where the “spitting stuff out really quickly” part comes into play here. The “scientific mumbo jumbo” (otherwise known as facts and the actual terminology of things) tends to appear in lengthy, text-heavy content, as Moulton suggested, no?

Google, of course, is trying to get better at natural language with updates like Hummingbird and various other acquisitions and tweaks. Crafting your content with that in mind should only help.

“It’s not going to make that much of a difference as far as ranking,” Cutts says. “I would think about the words that a user is going to type, which is typically going to be the layman’s terms – the regular words rather than the super scientific stuff – but you can find ways to include both of them, but I would try to err on the side of clarity if you can.”

So yeah, dumb it down. But not too much. Just enough. But also include the smart stuff. Just don’t make it too smart.

What do you think? Should Google dumb down search results to give users things that are easier to digest, or should it be the searcher’s responsibility to do further research if they don’t understand the accurate and well-researched information that they’re consuming? Either way, isn’t this kind of a mixed message compared to the guidance Google has always given regarding “quality” content? Share your thoughts.

For the record, I have nothing against auntcookie. I know nothing about auntcookie, but that’s kind of the point.