A reminder about links in large-scale article campaigns

Lately we’ve seen an increase in spammy links contained in articles referred to as contributor posts, guest posts, partner posts, or syndicated posts. These articles are generally written by or in the name of one website, and published on a different one.
Google does not discourage these types of articles when they inform users, educate another site’s audience, or bring awareness to your cause or company. What does violate Google’s guidelines on link schemes, however, is when the main intent is to build links back to the author’s site at scale. Below are factors that, when taken to an extreme, can indicate that an article violates these guidelines:
  • Stuffing keyword-rich links to your site in your articles
  • Having the articles published across many different sites; alternatively, having a large number of articles on a few large, different sites
  • Using or hiring article writers who aren’t knowledgeable about the topics they’re writing on
  • Using the same or similar content across these articles; alternatively, duplicating the full content of articles found on your own site (in which case use of rel="canonical", in addition to rel="nofollow", is advised; see the markup sketch below)
When Google detects that a website is publishing articles that contain spammy links, this may change Google’s perception of the quality of the site and could affect its ranking. Sites that accept and publish such articles should vet them carefully, asking questions like: Do I know this person? Does this person’s message fit with my site’s audience? Does the article contain useful content? If the article contains links of questionable intent, has the author used rel="nofollow" on them?
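
As an illustration, here is what those two attributes look like in markup; the URLs and anchor text are placeholders:

```html
<!-- A link of questionable intent: rel="nofollow" tells search engines
     not to count it as an endorsement. -->
<a href="https://example.com/product" rel="nofollow">keyword-rich anchor</a>

<!-- If the article duplicates a page from the author's own site, a canonical
     link in the page's <head> points search engines to the original. -->
<link rel="canonical" href="https://example.com/original-article">
```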

Google takes action against websites that create articles made for links because the practice is bad for the web as a whole. When link building comes first, the quality of the articles can suffer and create a bad experience for users. Also, webmasters generally prefer not to receive aggressive or repeated “Post my article!” requests, and we encourage such cases to be reported to our spam report form. And lastly, if a link is a form of endorsement, and you’re the one creating most of the endorsements for your own site, is this putting forth the best impression of your site? Our best advice in relation to link building is to focus on improving your site’s content; everything, including links, will follow (no pun intended).

Posted by the Google Webspam Team

How we fought webspam – Webspam Report 2016

With 2017 well underway, we wanted to take a moment and share some of the insights we gathered in 2016 in our fight against webspam. Over the past year, we continued to find new ways of keeping spam from creating a poor quality search experience, and worked with webmasters around the world to make the web better.

We do a lot behind the scenes to make sure that users can make full use of what today’s web has to offer, bringing relevant results to everyone around the globe, while fighting webspam that could potentially harm or simply annoy users.

Webspam trends in 2016

How we fought spam in 2016

Working with users and webmasters for a better web

  • In 2016 we received over 180,000 user-submitted spam reports from around the world. After carefully checking their validity, we considered 52% of those reported sites to be spam. Thanks to all who submitted reports and contributed towards a cleaner and safer web ecosystem!
  • We conducted more than 170 online office hours and live events around the world to audiences totaling over 150,000 website owners, webmasters and digital marketers.
  • We continued to provide support to website owners around the world through our Webmaster Help Forums in 15 languages. Through these forums we saw over 67,000 questions, a majority of which were identified as having a Best Response by our community of Top Contributors, Rising Stars, and Googlers.
  • We had 119 volunteer Webmaster Top Contributors and Rising Stars, whom we invited to join us at our local Top Contributor Meetups in 11 different locations across 4 continents (Asia, Europe, North America, South America). 

We think everybody deserves high quality, spam-free search results. We hope that this report provides a glimpse of what we do to make that happen.

Posted by Michal Wicinski, Search Quality Strategist and Kiyotaka Tanaka, User Education & Outreach Specialist 

Protect your site from user generated spam

As a website owner, you might have come across auto-generated content in comments sections or forum threads. When such content appears on your pages, not only does it disrupt visitors, it also exposes to Google and other search engines content that you may not want associated with your site.

In this blog post, we will give you tips to help you deal with this type of spam on your site and forum.

Some spammers abuse sites owned by others by posting deceptive content and links in an attempt to drive more traffic to their own sites.

Comments and forum threads can be a really good source of information and an efficient way of engaging a site’s users in discussions. This valuable content should not be buried by auto-generated keywords and links placed there by spammers.

There are many ways of securing your site’s forums and comment threads and making them unattractive to spammers:

  • Keep your forum software updated and patched. Take the time to keep your software up to date, and pay special attention to important security updates. Spammers take advantage of security issues in older versions of blogs, bulletin boards, and other content management systems.
  • Add a CAPTCHA. CAPTCHAs require users to confirm that they are human beings rather than automated scripts. One way to do this is to use a service such as reCAPTCHA, Securimage, or Jcaptcha (a minimal verification sketch appears after this list).
  • Block suspicious behavior. Many forums allow you to set time limits between posts, and you can often find plugins that look for excessive traffic from individual IP addresses or proxies, and for other activity more common to bots than to human beings. For example, phpBB, Simple Machines, myBB, and many other forum platforms support such configurations (a minimal rate-limit sketch appears after this list).
  • Check your forum’s top posters on a daily basis. If a user joined recently and has an excessive number of posts, you should probably review their profile and make sure their posts and threads are not spammy.
  • Consider disabling some types of comments. For example, it’s good practice to close very old forum threads that are unlikely to get legitimate replies.
    If you no longer plan to monitor your forum and users are no longer interacting with it, turning off posting completely may prevent spammers from abusing it.
  • Make good use of moderation capabilities. Consider enabling moderation features that require users to have a certain reputation before they can post links, or that hold comments containing links for review.
    If possible, change your settings to disallow anonymous posting and to require approval of new users’ posts before they become publicly visible.
    Moderators, together with friends, colleagues, and other trusted users, can help you review and approve posts while spreading the workload. Keep an eye on your forum’s new users by reviewing their posts and activity on your forum.
  • Consider blacklisting obviously spammy terms. Block obviously inappropriate comments with a blacklist of spammy terms (e.g., illegal streaming or pharma-related terms). Add inappropriate and off-topic terms that are used only by spammers, learning from the spam posts you see on your own forum and on others. Built-in features or plugins can delete or mark such comments as spam for you (a simple filter sketch appears after this list).
  • Use the "nofollow" attribute for links in the comment field. This will deter spammers from targeting your site. By default, many blogging platforms (such as Blogger) automatically add this attribute to any posted comments.
  • Use automated systems to defend your site. Comprehensive systems like Akismet, which has plugins for many blog and forum systems, are easy to install and do most of the work for you.
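
For the CAPTCHA suggestion above, here is a minimal sketch of server-side verification against reCAPTCHA’s siteverify endpoint. It assumes the third-party Python `requests` library and a form that submits the standard g-recaptcha-response field; the secret key is a placeholder you receive when registering your site:

```python
import requests  # third-party HTTP library: pip install requests

RECAPTCHA_SECRET = "YOUR_SECRET_KEY"  # placeholder; issued when you register your site

def is_human(recaptcha_response, remote_ip=None):
    """Verify the token that the reCAPTCHA widget submitted with the form."""
    payload = {"secret": RECAPTCHA_SECRET, "response": recaptcha_response}
    if remote_ip:
        payload["remoteip"] = remote_ip  # optional parameter of the siteverify API
    result = requests.post(
        "https://www.google.com/recaptcha/api/siteverify",
        data=payload,
        timeout=10,
    ).json()
    return result.get("success", False)
```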
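
For the “block suspicious behavior” suggestion, here is a minimal in-memory sketch of a time limit between posts from the same IP address. The 30-second threshold is an assumption to tune for your forum, and a real deployment would persist this state rather than keep it in a process-local dictionary:

```python
import time

MIN_SECONDS_BETWEEN_POSTS = 30  # assumed threshold; tune for your audience
_last_post_at = {}  # maps an IP address to the timestamp of its most recent post

def may_post(ip):
    """Reject posts arriving faster than a human could plausibly write them."""
    now = time.time()
    last = _last_post_at.get(ip)
    if last is not None and now - last < MIN_SECONDS_BETWEEN_POSTS:
        return False  # too soon after the previous post from this IP
    _last_post_at[ip] = now
    return True
```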
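
And for the term blacklist, here is a sketch of a simple filter that flags comments containing known spam terms; the term list is illustrative and should be grown from the spam you actually see:

```python
SPAM_TERMS = {"free streaming", "cheap pharma", "casino bonus"}  # illustrative only

def looks_spammy(comment):
    """Flag a comment if it contains any blacklisted term, case-insensitively."""
    text = comment.lower()
    return any(term in text for term in SPAM_TERMS)

# Example: hold a flagged comment for review instead of publishing it.
if looks_spammy("Watch FREE STREAMING here!!!"):
    print("held for moderation")
```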

For detailed information about these topics, check out our Help Center document on User Generated Spam and comment spam. You can also visit our Webmaster Central Help Forum if you need any help.

Posted by Anouar Bendahou, Search Quality Strategist, Google Ireland
