Is Negative SEO a threat to your website?
In recent years, Google has gotten better at assessing the quality of backlinks, and it tries to design algorithms that are robust against negative SEO techniques while also providing a way to disavow links.
Negative SEO can hurt both small and large websites. Many large brands have dropped in Google's rankings, often because of link-related penalties.
In February 2011, Overstock was hit hard for paying for backlinks by offering discounts. The penalty kept the site off the first page of Google's search results for two months.
In April 2013, Mozilla was penalized over a single page of user-generated content that Google deemed too “spammy”. The page was removed, but before that happened, the browser maker's site was demoted in Google's rankings.
In December 2013, Rap Genius was penalized by Google for unnatural links. The site no longer appeared on the first page of results even for searches on its own name.
In 2014, Expedia was hit with penalties from Google and faded from the top search results, experiencing a drop of about 25% in organic visibility.
In 2015, Google penalized Thumbtack, a company in which it had a $100 million investment, for bad links.
Google clearly states that tactics outside its Webmaster Guidelines are considered “black hat SEO” and will be penalized. Since these enforcement actions began, website owners everywhere have had to become more cautious.
What is negative SEO?
Negative SEO refers to actions that sabotage a competitor's organic rankings in search engines. It occurs when someone uses unscrupulous methods to lower a site's rankings. There are multiple ways this can be attempted. The most common type of negative SEO is link-based, but there are many other techniques that people of ill will can use to try to reduce your rankings.
How can negative SEO be done?
- Hacking a website
- Riddling its code with HTML hyperlinks
- Injecting malware and spam
- Editing your robots.txt file to block Googlebot
- Making multiple duplicate copies of a website
- Stealing a website's content
- Blocking all IPs in a certain range
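As an illustration of the robots.txt attack above, an attacker with access to your server could add just two lines that quietly remove your site from Google. A maliciously edited file might look like this (hypothetical example):

```
# Malicious edit: tells Google's crawler to stay away from the entire site
User-agent: Googlebot
Disallow: /
```

Because the file looks harmless and the site still works normally for visitors, this kind of edit can go unnoticed until rankings have already dropped, which is one more reason to check robots.txt whenever you audit your site.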
How do you know whether your site is affected by Negative SEO?
The first step in checking your website for negative SEO from links is reviewing your backlink profile. In most cases, it is suggested to do this once a week or once a month, depending on how competitive your website is.
You can view the Links to Your Site report in Google Webmaster Tools (now Search Console) to get a decent high-level view of the links Google is seeing and using as authority indicators when reviewing your site. Tools like Majestic SEO, Open Site Explorer, and Ahrefs can often surface additional inbound links not listed in that report.
Which kinds of links should you keep a check on?
- A large number of new links that look unnatural.
- Links with exact anchor text for competitive keywords.
- Links from low quality domains.
- Links from low quality blogs that could be in blog networks.
Separately, if you see a drop in rankings or organic traffic and have not received a manual action, it may be the result of a Google algorithm update rather than negative SEO.
How do you protect yourself from negative SEO?
The only way to protect yourself is to establish authority in your industry. You need to earn many high-quality incoming links to your site. If your authority is strong, then an extra twenty thousand spammy links pointing at your website will make little difference to your ranking. That's what high-quality SEO is all about.
Review your backlinks
Make sure you take the time to monitor your website’s backlink profile. Since competitor sabotage is an affordable and highly effective method to take down a competing website, be sure you are not the target by checking links regularly.
Organizing your link data
Organize all the link data you have collected from Google Webmaster Tools, Ahrefs, Majestic SEO, and Open Site Explorer, then copy and paste the data from each source into its own sheet. Remove the duplicates (in Excel, go to the Data tab and select “Remove Duplicates”) so that each URL appears only once.
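If you prefer to script the consolidation step, a minimal sketch in Python could merge the exports and drop duplicates in one pass. This assumes each tool's export has been saved as a CSV with the linking URL in the first column; the file names are hypothetical:

```python
import csv

def merge_link_exports(paths):
    """Merge backlink CSV exports and drop duplicate URLs,
    keeping the order in which each link was first seen."""
    seen = set()
    merged = []
    for path in paths:
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.reader(f):
                if not row:
                    continue  # skip blank lines in the export
                url = row[0].strip().lower()
                if url and url not in seen:
                    seen.add(url)
                    merged.append(url)
    return merged

# Example usage (file names are placeholders):
# links = merge_link_exports(["gwt.csv", "ahrefs.csv", "majestic.csv"])
```

Normalizing to lowercase before comparing catches duplicates that differ only in capitalization, which spreadsheet dedup can miss.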
Identify Bad Links
When the list of backlinks is large, reviewing each one manually to identify the bad links becomes time-consuming. Use tools like URL Profiler, CognitiveSEO, and Backlinks Monitor to identify bad links and unnatural anchor text.
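As a rough illustration of what such tools automate, a simple heuristic might flag links whose anchor text exactly matches a commercial keyword, or that come from domains you have already judged low quality. The keyword and domain lists below are hypothetical placeholders, not output from any real tool:

```python
def flag_suspicious_links(links, money_keywords, bad_domains):
    """Flag backlinks with exact-match commercial anchor text or
    from known low-quality domains.

    `links` is a list of (source_domain, anchor_text) tuples."""
    flagged = []
    for domain, anchor in links:
        reasons = []
        if anchor.strip().lower() in money_keywords:
            reasons.append("exact-match anchor")
        if domain.lower() in bad_domains:
            reasons.append("low-quality domain")
        if reasons:
            flagged.append((domain, anchor, reasons))
    return flagged
```

Real tools layer on many more signals (domain trust metrics, link velocity, outbound link counts), but the basic shape, a per-link checklist of red flags, is the same.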
Disavow the Bad Links
Once you have identified and listed the links to be removed, submit the data to the Disavow Tool. The tool flags each link so that Google's algorithms do not credit it to your site, positively or negatively, though keep in mind there are mixed reports on the tool's efficacy.
This tool allows publishers to tell Google that they don't want certain links from external sites to be considered as part of Google's system of counting links to rank websites.
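The disavow file itself is just a plain text file with one entry per line: a `domain:` line disavows every link from that domain, a bare URL disavows only that page, and `#` lines are comments. The domains and URLs below are placeholders:

```
# Spammy directory; webmaster did not respond to removal request
domain:shady-links.example.com
# Individual spam pages
http://spamblog.example.net/bad-page.html
```

Using `domain:` entries is usually safer for clearly bad sites, since new spam pages on the same domain are covered automatically.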
Contact the Webmasters
The key point to understand is that disavowing a link is not always enough to come out clean. You also need to contact the webmasters of the offending domains and ask them to remove the links. Gathering contact details for all of these sites can take several days; this is where tools such as URL Profiler and Rmoov come in handy.
Follow the steps below to identify and correct violations using Google Search Console:
- Download a list of links to your site from Search Console.
- Check this list for any links that violate Google's guidelines on linking.
- For a large list, start by looking at the sites that link to your site the most, or links that were created recently.
- Contact the webmaster of that site and ask them either to remove the links or to prevent them from passing PageRank, for example by adding a rel="nofollow" attribute.
- Use the Disavow Links tool in Search Console to disavow any links you were unable to get removed.