Should you audit your disavow file?

In October 2012, Google gave us the Disavow tool.

This allowed webmasters to instruct Google not to count the link metrics that flowed either through certain links or from certain domains.

If you’ve had a manual penalty or have been dealing with Google’s Penguin algorithm, you’ve probably filed a disavow file.
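For reference, a disavow file is a plain .txt file with one entry per line: lines beginning with # are comments, a domain: prefix disavows every link from that domain, and a bare URL disavows a single page. A minimal example (the domains and dates below are placeholders, not real sites):

```
# Contacted webmaster on 01/03/2014, no response
domain:spammydirectory.example

# Single paid link we could not get removed
http://blog.example/2013/05/paid-links-post.html
```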

Disavow Links

For the last two years, I have reviewed a large number of disavow files, and poor disavow decisions often harm sites’ rankings. In some cases, I have suggested auditing the disavow file to determine whether it should be modified and resubmitted.

Here are some scenarios in which a site owner might decide to thoroughly review their disavow file.

Have you relied on link auditing software to make disavow decisions?

Some link auditing tools can be quite helpful when it comes to organizing your links. But it is vitally important that you take this well-organized list and manually review the disavow suggestions the tool has made.

I have seen far too many businesses blindly take the report generated from one of these tools and submit it directly to Google. Usually when this happens, a large number of unnatural links go undetected and are not disavowed.

Recently, while viewing the backlink profile of a site that had relied on an automated tool for disavow decisions, I discovered unnatural links from 965 new domains that had not been disavowed. It’s no wonder this site was still struggling with Penguin.

Another problem I have seen is that these automated tools will often flag really good links for removal. In one case, an automated tool flagged some valid press mentions – from BBC News, The Guardian, and other great news sources – as unnatural links.

It’s important to remember that the disavow suggestions made by these tools are just that: suggestions. They need to be reviewed manually.

As such, if you have relied on an automated tool to make disavow decisions, it may be worthwhile to review your disavow file and see if you have accidentally disavowed some good domains. If you have, you can remove those lines from your disavow file and resubmit it.

Google should eventually start to count the link equity from these good sources again. However, it may take a while for the links to start helping again.

Were you super aggressive while removing links for a manual action?

If you’ve ever tried to remove a manual unnatural links penalty from your site, you know that Google can be pretty strict when it comes to giving you that beautiful “spam action revoked” message.

After each failed attempt at reconsideration, site owners often trim away at their links, eventually becoming so desperate that they end up getting rid of almost all of them. In many cases, some of those disavow decisions were unnecessary.

Auditing a disavow file after an overly aggressive link pruning can be tough. You certainly don’t want to try to game the system and reavow links that are unnatural. But if you feel that you have disavowed links that were actually valid, it may be worthwhile to have another look at your link profile.

A word of caution: If you decide to reavow links, be careful. It may be a good idea to have an impartial party review your reavow decisions to make sure that these links really are decent ones.

Did you hire an inexperienced person to do your link audit?

Sadly, this is a very common experience. Knowing which links to disavow can be difficult; no one can accurately predict exactly which links Google wants to see disavowed.

Some decisions are easy, especially when the link is outright spam, but sometimes it can be hard to decide whether to disavow a link or not.

I’ve seen cases where, while performing a link audit, the SEO company decided to disavow every link that was anchored with a keyword.

The issue with this is that not all keyword-anchored links are unnatural. If a major news publication wrote about your company and legitimately linked to you using your keyword as the link anchor, this is a good thing.

Additionally, I’ve witnessed people disavow every single directory link pointing to the site. While directories certainly can be a source of unnatural links, there are many directories that are valid citations and good links. Removing or disavowing links from good directories can destroy your rankings, both in the organic rankings and in the local pack.

I’ve even had cases where small business owners blindly trust an SEO company to do a link audit, only to have that company disavow every single link pointing to their client’s site.

Final thoughts

My intention in writing this article is not to advise people to try to game Google by reavowing links that were self-made for SEO purposes.

Instead, I would urge you to review your disavow file to see if perhaps you have been too aggressive in disavow decisions. You may find that reavowing some decent links that were disavowed in error eventually results in a positive difference in your rankings.

What do you think? Have you disavowed links in error? What are your experiences and thoughts on reavowing links? Let us know in the comments below.

For more information, visit http://searchenginewatch.com


Did Google Panda 4.0 Go After Press Release Sites?

Nine days ago, when Google released Panda 4.0 – which aims to filter “thin” content out of top rankings – we focused our attention on the big Winners & Losers charts. Since then, some have noted that press release sites may also have been hit big time.

Using SearchMetrics, I looked at the top press release sites and checked whether they had lost SEO visibility on a large scale since Panda 4.0 hit. PRWeb.com, PR Newswire, BusinessWire and PRLog all seem to have lost significant rankings in Google.

PRNewsWire.com seems to have shown a significant drop in SEO visibility, dropping ~63% after Panda 4.0 was released:

[Chart: PRNewswire.com SEO visibility after Panda 4.0]

PRWeb.com, another major press release distribution web site, also saw a huge drop, ~71%.

Google’s New Search Layout

The new Google search layout that users began seeing a couple of weeks ago on a limited basis has now gone live for all users.

Google’s new layout, which changes the font and removes underlines from links, as well as displays the AdWords ads at the top differently, has definitely been getting poor reviews as it rolled out to everybody.

The headlines are larger, the description text seems to be slightly lighter, and the fonts have been adjusted to a wider typeface.

For AdWords ads, gone is the light yellow background that we have long associated with advertising space. The new style doesn’t have any shaded background; instead, it has a tiny yellow “Ad” marker next to the green URL. There is also a line separating the ads from the organic search results.

Beyond the cosmetic change, the new search layout is affecting SEO in a pretty pronounced way. Titles that were optimized to the maximum 70 allowable characters for SEO purposes will now be truncated in Google’s new results, giving everyone about 59-60 characters to work with. This means you might have a lot of work ahead of you reworking titles so they don’t appear poorly truncated in the search results, which could impact click-throughs to your site.
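As a rough way to check titles at scale, you can approximate the cutoff with a character count. This is a minimal sketch under a big assumption: Google actually truncates by pixel width, so the 60-character limit below is only a proxy, and the function names are my own, not anything Google provides.

```python
# Rough proxy for Google's new, narrower title display.
# Google truncates by pixel width; ~60 characters is only an approximation.
SERP_TITLE_LIMIT = 60

def fits_serp(title, limit=SERP_TITLE_LIMIT):
    """True if the title should display without truncation."""
    return len(title) <= limit

def truncate_preview(title, limit=SERP_TITLE_LIMIT):
    """Approximate how a too-long title would look, cut at a word boundary."""
    if fits_serp(title, limit):
        return title
    cut = title[: limit - 3].rsplit(" ", 1)[0]
    return cut + "..."
```

Running every page title through a check like this flags the ones worth reworking before the truncation starts costing click-throughs.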

When many users first saw the changes, they thought their search had been hijacked or that they were falling victim to Google spoofing, because the search results looked completely different. And the reviews definitely aren’t good across the board, judging from all the comments by very upset searchers, some of whom actually made the switch to another search engine strictly because of the new look.

To me, it looks like a throwback to how search engine results looked 10 to 15 years ago, such as on Webcrawler or Hotbot, not something that has been refreshed for 2014. And I do agree with many people who say the new font makes it much harder to read and scan when on a desktop.

Google’s logic behind the change was that the design was first made for mobile and tablets, and several of those design changes were then carried over to desktop users. Google said they feel this creates improved readability and a much cleaner look, with the end goal of a consistent user experience across devices (desktop, mobile, tablet):

Towards the end of last year we launched some pretty big design improvements for Search on mobile and tablet devices (mobile first! :-). Today we’ve carried over several of those changes to the desktop experience.

We’ve increased the size of result titles, removed the underlines, and evened out all the line heights. This improves readability and creates an overall cleaner look. We’ve also brought over our new ad labels from mobile, making the multi-device experience more consistent.

Improving consistency in design across platforms makes it easier for people to use Google Search across devices and it makes it easier for us to develop and ship improvements across the board.

Will we see any changes reverted? Hard to say, but Google rarely reverts its changes once they’ve jumped in and made them.


Google’s Matt Cutts: We Don’t Use Twitter Or Facebook Social Signals To Rank Pages

Google’s head of search spam, Matt Cutts, released a video today answering the question, “are Facebook and Twitter signals part of the ranking algorithm?” The short answer was no.

Matt said that Google does not give any special treatment to Facebook or Twitter pages. They are in fact, currently, treated like any other page, according to Matt Cutts.

Matt then addressed whether Google does special crawling or indexing for these sites, such as indexing the number of likes or tweets a specific page has. Matt said Google does not do that right now. Why?

Google did do this at one point, and then they were blocked. I believe Matt was referring to Google’s real-time search deal with Twitter expiring. Matt explained that they put a lot of engineering time into it, and then they were blocked, so that work and effort was no longer useful. For Google to put more engineering time into this only to be blocked again just doesn’t pay.

Another reason: Google is worried about crawling identity information at one point, then having that information change without Google seeing the update until much later. Having outdated information can be harmful to some people.

Search Ranking Factors Survey Results 2013 – SEOmoz

Google Overall Survey Algorithm

Moz has released their 2013 search engine ranking factors, surveying 120 SEO professionals and having them rank different search factors. While this isn’t the full survey data, it does have a lot of interesting information to consider when you’re optimizing websites for search engines.

Moz uses search correlations to make estimates as to what is being used in Google’s ranking algorithm, based upon features of higher-ranking sites versus lower-ranking ones. They used over 14,000 keywords from Google AdWords across multiple categories, then used those keywords to extract the top 50 organic search results in June, post-Penguin 2.0.
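The idea behind a correlation study like this can be sketched in a few lines: for one keyword’s results, take a feature’s values and the search positions, rank both, and compute Spearman’s correlation between the two rankings. This is only an illustration of the statistic, not Moz’s actual pipeline (which averages the correlation across thousands of SERPs):

```python
def average_ranks(values):
    """Rank values from smallest to largest, averaging ranks for ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    result = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j to cover the run of tied values.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg_rank = (i + j) / 2 + 1  # ranks are 1-based
        for k in range(i, j + 1):
            result[order[k]] = avg_rank
        i = j + 1
    return result

def spearman(x, y):
    """Spearman rank correlation: Pearson correlation of the ranks."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

With real data, x would be a feature for each of the top 50 results (say, number of +1’s) and y the search position, so a strongly negative correlation would mean the feature rises as position numbers fall, i.e. as rankings improve.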

One interesting finding was that despite SEOs knowing that over-optimization of keyword anchor text can be problematic and a sign of spamming, the correlations for exact match and partial match anchors were fairly high. Not surprisingly, though, the SEOs surveyed believed that diversity in anchor text, including both branded and non-branded terms, was more important than the number of links themselves.

Moz also looked at on page keywords and not surprisingly found a very high correlation of those keywords in body text, title, meta-description and H1 tags. Likewise, the SEOs surveyed believed that including keywords in both the title and on page are important factors.

Moz also discovered that rankings of exact match domains (i.e., keyword1keyword2.com) have declined over the past year, although the correlation is still high. While having keywords in the domain name was extremely important many years ago, SEOs have definitely shied away from exact match domains in favor of partial match domains or branded URLs.

Moz’s look at social signals and their correlation with ranking factors is important. Google +1’s came out ahead of Facebook shares and tweets among correlated factors, although the surveyed SEOs did not believe that social signals are very important to Google’s search algorithm.

Their last look at ranking factors was in 2011, when they were still known as SEOmoz.

The full survey will be released by Moz in a few weeks.