What We Can Learn from Mozilla’s Penalty

Mozilla.org was slapped by Google with a manual penalty for “user-generated spam”. You can read more details here.

Some takeaways:

1) You can never be too big to fail – Even a site like Mozilla can be targeted with spammy comment links. Even though you’d think a site this popular would have enough natural links to outweigh a negative attack, it can still happen.

2) Spammy tactics can hurt your website and brand – You need to monitor your site and check Webmaster Tools for messages about unnatural links, detected malware, an increase in errors, and more.

Prevention is the key to avoiding an attack, and active monitoring can help keep your site in top health. You wouldn’t want your website to be associated with payday loans or generic medications, right?

3) A site can receive different types of penalties – Whether the violation is intentional (as with JCPenney or even Google’s own Chrome) or unintentional, as in Mozilla’s case, Google is ready to hand out the appropriate penalty. Here, Mozilla didn’t receive a site-wide penalty; instead, only the pages with spammy comments were penalized.

4) Ask for help – The Webmaster section of the Google Product Forums is a great place to ask for help and suggestions from other webmasters. Although not all of us will be lucky enough to get immediate, direct replies from John Mueller or Matt Cutts, you can pick up a lot of helpful tips from unbiased outsiders who look at your site and industry with fresh eyes.

Speaking of help, feel free to contact us with questions about your website. You can also connect with us on Twitter or Facebook.

What Does it Take to be Trust(Rank) Worthy?

We’ve previously discussed PageRank, so let’s dive right into TrustRank.

The Concept of TrustRank
Researchers from Stanford University and Yahoo! combined forces to come up with a link analysis technique to “semi-automatically” separate useful websites from spammy ones. The idea is that feedback from human reviewers can only go so far: people can only review so many pages. There has to be another way to distinguish good content from bad.

This is where the concept of TrustRank comes in. The basic idea is that established, authoritative sites (like the NY Times, WebMD, etc.) rarely link out to pages that aren’t equally well-regarded. On the other hand, it’s not uncommon to come across webspam that links out to good sites in very high, unnatural volume, or even name-drops them, in an attempt to trick visitors into seeing it as a credible source.
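To make that a little more concrete, here’s a minimal sketch of trust propagation in the spirit of the original TrustRank idea: start with a small, human-reviewed seed set, then let trust flow out through links with some decay at each hop. The link graph, seed set, decay factor, and iteration count below are made-up assumptions for illustration, not real data and not Google’s implementation.

```python
# Minimal sketch of TrustRank-style propagation (illustrative assumptions only).

def trust_rank(graph, seeds, decay=0.85, iterations=20):
    """Propagate trust from hand-reviewed seed pages through outbound links.

    graph: dict mapping page -> list of pages it links to
    seeds: set of pages a human reviewer has marked as trustworthy
    """
    pages = set(graph) | {p for targets in graph.values() for p in targets}
    # Start with all trust concentrated on the reviewed seed pages.
    seed_score = 1.0 / len(seeds)
    trust = {page: (seed_score if page in seeds else 0.0) for page in pages}

    for _ in range(iterations):
        # A slice of trust always flows back to the seed set...
        next_trust = {page: (1 - decay) * (seed_score if page in seeds else 0.0)
                      for page in pages}
        # ...and the rest is split evenly across each page's outbound links.
        for page, outlinks in graph.items():
            if not outlinks:
                continue
            share = decay * trust[page] / len(outlinks)
            for target in outlinks:
                next_trust[target] += share
        trust = next_trust
    return trust

# Toy graph: "news" is the trusted seed; "spam" links out to everyone.
graph = {
    "news": ["health", "blog"],
    "health": ["blog"],
    "blog": [],
    "spam": ["news", "health", "blog"],
}
scores = trust_rank(graph, seeds={"news"})
print(sorted(scores.items(), key=lambda kv: -kv[1]))
```

Notice that in this toy graph the “spam” page links out to every good page but earns no trust itself, because no trusted page links back to it.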

The Difference Between Spam and Non-Spam
There are many indicators that can be used to detect spam. Spammy sites will often work in a network where they all link to each other in order to sculpt PageRank and pass link juice to a certain page. Outbound link volume, anchor text, keyword density, and black hat techniques (like cloaking) are also indicators of spammy or malicious sites.
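As a small illustration of just one of those indicators, here’s a rough sketch of a keyword-density check. The sample text and the 10% threshold are assumptions made up for the example; no search engine publishes an actual cutoff.

```python
# Rough sketch of one spam signal: keyword density (assumed threshold, toy text).
import re

def keyword_density(text, keyword):
    """Return how often `keyword` appears as a fraction of all words."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

page = "payday loans fast payday loans online best payday loans today"
density = keyword_density(page, "payday")
if density > 0.10:  # illustrative threshold for this sketch
    print(f"'payday' density of {density:.0%} looks stuffed")
```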

Then there’s the issue of linking C-blocks. Tools like SEOmoz’s Open Site Explorer give you a better idea of the number of C-blocks linking to your domain. Sites hosted in the same C-block may all link to each other in a scheme to gain higher rankings.

To make a long and technical story short, you can think of it like this: your IP address is where your website resides, and the C-block you belong to (the group of addresses sharing your IP’s first three octets) is your “neighborhood”. If your IP is located in a “bad” C-block, it could be harmful, as your site may be associated with spammy sites.
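If you want to see the “neighborhood” idea in code, here’s a minimal sketch: two IPv4 addresses share a C-block when their first three octets match. The addresses below are reserved documentation examples, not real sites.

```python
# Minimal sketch of the C-block ("neighborhood") idea using example addresses.

def c_block(ip):
    """Return the C-block of an IPv4 address, e.g. '192.0.2' for 192.0.2.10."""
    return ".".join(ip.split(".")[:3])

def same_neighborhood(ip_a, ip_b):
    """Two addresses are neighbors if their first three octets match."""
    return c_block(ip_a) == c_block(ip_b)

print(c_block("192.0.2.10"))                           # 192.0.2
print(same_neighborhood("192.0.2.10", "192.0.2.77"))   # True: same C-block
print(same_neighborhood("192.0.2.10", "198.51.100.5")) # False: different block
```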

Google’s Official Word?
Although Google doesn’t officially recognize a single ranking factor called TrustRank, they do acknowledge that the search algorithm takes the concept of trust into consideration. So it might not be a bad idea to keep TrustRank in mind: earning links from trusted sites can help your PageRank, your overall rankings, and your reputation.

Have you ever heard the quote, “You are the average of the five people you spend the most time with”? Apply that to the sites your website is associated with, and keep good company!