Negative SEO: Are You Being Targeted?


Should you be worried about negative SEO? Even some Grey Hat SEO techniques can be brutal for your website. Is there anything you can do to stay safe online?

Below we’ll address and attempt to answer these questions.

The threat of negative SEO is remote, but it is a worry nonetheless.

Is it easy for competitors to ruin rankings, and if so, are you able to protect your site? Let’s define what negative SEO is and what it is not.

Negative SEO is a set of activities aimed at lowering a competitor’s rankings in search results.

These activities are more often off-page (e.g., building unnatural links to the site or scraping and reposting its content); but in some cases, they may also involve hacking the site and modifying its content.

Negative SEO is not always the most likely cause of a sudden drop in rankings.

Before jumping to competitor actions as the most likely explanation, review the other common reasons that can cause ranking drops.

We recommend reading these great SEO and content marketing tips.

NEGATIVE ON-PAGE SEO

Negative on-page SEO attacks are much more difficult to pull off than off-page ones.

These involve hacking into your site and changing things around. Here are the main SEO threats a hacker attack can pose.

MODIFYING YOUR CONTENT


You’d think you’d notice if someone changed your content, but this tactic can also be very subtle and difficult to spot.

As the attacker adds spammy content to a site (usually links), they often hide it (e.g., under “display: none” in the HTML), so you won’t see it unless you look in the code.
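If you’re comfortable with a bit of scripting, you can also spot-check a page for this yourself. The sketch below is a minimal example and not part of any tool mentioned in this article: it fetches one page (the URL is a placeholder) and flags links that sit inside elements hidden with inline CSS. It only looks at inline style attributes, so treat it as a starting point rather than a complete scanner.

    # Minimal sketch: flag links hidden with inline CSS on a single page.
    # Requires the third-party packages "requests" and "beautifulsoup4".
    import requests
    from bs4 import BeautifulSoup

    PAGE_URL = "https://www.example.com/"  # placeholder: the page you want to inspect

    def hidden_links(url):
        """Return hrefs of links that sit inside elements hidden with inline CSS."""
        html = requests.get(url, timeout=10).text
        soup = BeautifulSoup(html, "html.parser")
        suspicious = []
        for link in soup.find_all("a", href=True):
            for element in [link] + list(link.parents):
                style = (element.get("style") or "").replace(" ", "").lower()
                if "display:none" in style or "visibility:hidden" in style:
                    suspicious.append(link["href"])
                    break
        return suspicious

    if __name__ == "__main__":
        for href in hidden_links(PAGE_URL):
            print("Hidden link found:", href)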

Another possible negative SEO scenario is someone modifying your pages to redirect to theirs.

If your site enjoys strong link popularity and high authority, such redirects can be a stealthy way for other companies to boost their own site’s PageRank, or simply to siphon off your visitors to their own site once they land on yours.

Google, for example, doesn’t view this as just a temporary inconvenience: if it finds out about the redirect, it takes measures to penalize the site for “redirecting to a malicious website”.

HOW TO STAY SAFE:

One of the best ways to locate stealthy attacks is to conduct site audits, with tools such as WebSite Auditor.

Launch WebSite Auditor and create a project for your site to run your first audit; for any subsequent audit, simply re-run it with the Rebuild Project button.

With regular use, you’ll be able to discover any changes that may otherwise slip under the radar, such as changes in the number of outgoing links or pages with redirects.

To look into those links or redirects in detail, switch to the All Resources dashboard and go through the External Resources section.

If you see an unexpected increase in their count, look through the list on the right to see where those links point, and check the lower part of the screen for the pages they were found on.
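Outside of WebSite Auditor, you can also spot-check your most important pages for redirects you never set up. Below is a rough sketch, assuming a small hand-picked list of placeholder URLs and the third-party "requests" package:

    import requests

    # Placeholder list: the pages you care most about
    PAGES = [
        "https://www.example.com/",
        "https://www.example.com/blog/",
    ]

    for url in PAGES:
        response = requests.get(url, timeout=10, allow_redirects=True)
        if response.history:  # non-empty when one or more redirects happened
            chain = " -> ".join(r.url for r in response.history) + " -> " + response.url
            print("Redirect detected:", chain)
        else:
            print("No redirect:", url)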

HACKING THE SITE


Even if the hacker has no negative SEO in mind, the attack per se can hurt your SEO.

Google wants to protect its users, which is why, if it suspects a site has been hacked, it may de-rank it, or at the very least add a “this site may be hacked” line to your search listings.

Would you click on a result like this?  I highly doubt it.

HOW TO STAY SAFE:

Negative SEO aside, stepping up your site’s security should be high on your list of priorities for obvious reasons.

GETTING THE SITE DE-INDEXED


One small change that can throw your entire SEO strategy into chaos is an edit to robots.txt.

A single Disallow rule can wreak havoc, as it prompts Google to ignore your website completely.

HOW TO STAY SAFE:

Check your rankings regularly so you’re the first to know whether your site has been de-indexed. Using Rank Tracker, you can even schedule automatic checks weekly or daily.

If you face a sudden drop in search engine results, you’ll be able to locate a Dropped note in the Difference column.

When you notice this across a large number of keywords, it most likely means a penalty or de-indexation.

If you suspect it’s a de-indexation, make sure to review the crawl stats in your Google Search Console account and thoroughly inspect your robots.txt file.
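As a quick supplement to Search Console, a few lines of Python’s standard library can confirm that Googlebot is still allowed to fetch your key pages according to your live robots.txt. The domain and paths below are placeholders:

    from urllib.robotparser import RobotFileParser

    SITE = "https://www.example.com"  # placeholder domain

    parser = RobotFileParser()
    parser.set_url(SITE + "/robots.txt")
    parser.read()

    # Paths you never want to see blocked
    for path in ["/", "/blog/", "/products/"]:
        allowed = parser.can_fetch("Googlebot", SITE + path)
        print(f"{path}: {'allowed' if allowed else 'BLOCKED'} for Googlebot")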

NEGATIVE OFF-PAGE SEO

This kind of negative SEO targets the site without interfering with it internally. Here are the most common forms negative off-page SEO can take.

FORCEFUL CRAWLING


There are examples of desperate site owners trying to crash a competitor’s site by forcefully crawling it and causing heavy server load.

If Googlebot can’t access your site a few times in a row… you guessed it: you might get de-ranked.

HOW TO STAY SAFE:

If your site has become noticeably slow or, worse, unavailable, contact your web hosting company or webmaster; they will be able to pinpoint where the load is coming from.

Here is a detailed list of instructions on locating the villain crawlers and blocking them with .htaccess and robots.txt.
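If you can reach your server’s raw access log, a quick tally of requests per client IP is often enough to spot an abusive crawler before you escalate to your host. A minimal sketch, assuming a common Apache/Nginx combined-format log where the client IP is the first field (the log path is a placeholder):

    from collections import Counter

    LOG_FILE = "/var/log/apache2/access.log"  # placeholder path; adjust for your server

    hits = Counter()
    with open(LOG_FILE, encoding="utf-8", errors="replace") as log:
        for line in log:
            fields = line.split()
            if fields:
                hits[fields[0]] += 1  # first field is the client IP

    # The busiest clients; legitimate bots identify themselves in the user agent
    for ip, count in hits.most_common(10):
        print(f"{ip}: {count} requests")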

LINK FARMS


A small number of spammy links are unlikely to do long-term damage to a site’s rankings.

That is why negative SEO attacks typically involve building links from a group of interconnected sites, otherwise known as link farms.

Generally, the same anchor text is used for most of these links.

These exact-match anchors may be completely unrelated to the site under attack, or they may incorporate a niche keyword to make the site’s link profile look as though its owner is manipulating it.

A while ago, this happened to WP Bacon, a WordPress podcast site. Over a short period of time, the site acquired thousands of links with the anchor text “porn movie.”

Over the course of 10 days, WP Bacon fell 50+ spots in Google for the majority of keywords it ranked for.

This story has a happy ending though: the webmaster disavowed the spammy domains, and eventually, WP Bacon recovered most of its rankings.

HOW TO STAY SAFE:

It is unlikely you’ll be able to prevent a negative SEO attack, but it is within your power to spot the attempt at an early stage and reverse any damage. For early detection, you’ll need to monitor your link profile’s growth regularly.

Using SEO SpyGlass, you can review progress graphs for both the number of links in your profile and the number of referring domains.

An unusual spike in either of those graphs is reason enough to review the links you’ve suddenly acquired.

Go to the Linking Domains (or Backlinks) dashboard in SEO SpyGlass to see the links behind the sudden spike in your graphs.

Sort the links by Last Found Date; you can do so by clicking on the column header twice.

By matching the Last Found dates against the time the spike appeared on the graph, you can locate the links responsible.

If you’re unable to discern where the links came from, have a look at their Penalty Risk.

Switch to the Link Penalty Risk tab, select the backlinks you find suspicious, and click Update Link Penalty Risk.

Within a few minutes, the column will be populated with values on a scale of 0 to 100.

This is a fairly accurate metric for detecting whether the links come from link farms, as it looks at the number of linking domains originating from the same IP address or C block.
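If you’d like to sanity-check the same footprint yourself, grouping referring domains by the first three octets of their IP address (the C block) quickly surfaces clusters hosted on the same network. A minimal sketch, assuming you’ve exported your referring domains into a plain list (the domains below are placeholders):

    import socket
    from collections import defaultdict

    # Placeholder list: paste or load the referring domains you exported
    REFERRING_DOMAINS = ["spam-site-one.example", "spam-site-two.example"]

    blocks = defaultdict(list)
    for domain in REFERRING_DOMAINS:
        try:
            ip = socket.gethostbyname(domain)
        except socket.gaierror:
            continue  # domain no longer resolves
        c_block = ".".join(ip.split(".")[:3])  # e.g. "203.0.113"
        blocks[c_block].append(domain)

    # Many domains sharing one C block is a classic link-farm footprint
    for c_block, domains in sorted(blocks.items(), key=lambda item: -len(item[1])):
        if len(domains) > 1:
            print(f"{c_block}.* -> {len(domains)} domains: {', '.join(domains)}")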

Lastly, once you’ve identified the spammy links, you can create a disavow file right in SEO SpyGlass.

To do that, right-click the backlink/linking domain and select Disavow (make sure to select Entire domain under Disavow mode). Do the same for all unnatural links you spotted.

Finally, go to Preferences > Disavow/Blacklist backlinks, review your disavow file, and export it once you’re happy with it.
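For reference, the exported file is just plain text in the format Google’s disavow tool expects: one domain: entry or URL per line, with optional comment lines starting with #. If you ever need to build one by hand, a tiny sketch like this would do (the domains are placeholders):

    # Placeholder domains: replace with the link farms you actually identified
    SPAMMY_DOMAINS = ["spam-site-one.example", "spam-site-two.example"]

    with open("disavow.txt", "w", encoding="utf-8") as f:
        f.write("# Disavowed after a suspected negative SEO attack\n")
        for domain in SPAMMY_DOMAINS:
            f.write(f"domain:{domain}\n")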

SCRAPING


Another way a competitor can ruin your rankings is by scraping your content and replicating it across other sites.

Usually, when Google finds that content is duplicated across many sites, it picks only one version to rank.

Google is often smart enough to identify the original piece, unless it, unfortunately, happens to find the “stolen” version first.

This is the reason scrapers automatically copy the new content and repost it immediately.

HOW TO STAY SAFE:

Copyscape is an essential tool if you’re determined to find instances of content duplication.

If you do find scraped copies of your content, it’s a good idea to first contact the webmaster asking them to remove the piece.

If that’s not effective, you may want to report the scraper using Google’s copyright infringement report.

Credit to SEO PowerSuite; you can follow them on Twitter.
