It’s Mobilegeddon hot in here: Google strengthens its mobile-friendly ranking signal

If you have a mobile-friendly website, stand down. This news is not for you.

If you don’t have a mobile-friendly website, my goodness, what are you doing?! Have you not had enough warnings already? Stop reading this right now and go get a responsive website immediately, you maniac!

So for anyone else reading this news – probably one person: my boss, the editor-in-chief of ClickZ (hi Graham!) – here’s a quick update on one of the biggest changes Google has made to the way it ranks your website in the last few years.

According to an announcement yesterday, Google will increase the strength of its ‘mobile friendly’ ranking signal from the beginning of May 2016. Continue reading

Should you audit your disavow file?

In October 2012, Google gave us the Disavow tool.

This allowed webmasters to instruct Google not to count the link equity that flowed through certain links or from certain domains.

If you’ve had a manual penalty or have been dealing with Google’s Penguin algorithm, you’ve probably filed a disavow file.
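For reference, a disavow file is just a plain-text list: one URL or domain per line, with # marking comment lines. A minimal sketch (the domains here are made up for illustration):

    # Hypothetical disavow file
    # Disavow every link from an entire domain:
    domain:spammy-directory.example.com
    # Disavow one specific page only:
    http://low-quality-blog.example.net/paid-links.html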


Over the last two years, I have reviewed a large number of disavow files, and I’ve often found mistakes in them that were harming the sites’ rankings. In some cases, I have suggested auditing the disavow file to determine whether it should be modified and resubmitted.

Here are some scenarios in which a site owner may want to thoroughly review their disavow file.

Have you relied on link auditing software to make disavow decisions?

Some link auditing tools can be quite helpful when it comes to organizing your links. But it is vitally important that you take this well-organized list and manually review the disavow suggestions the tool has made.

I have seen far too many businesses blindly take the report generated from one of these tools and submit it directly to Google. Usually when this happens, a large number of unnatural links go undetected and are not disavowed.

I recently reviewed the backlink profile of a site that had relied on an automated tool for its disavow decisions, and discovered unnatural links from 965 domains that had not been disavowed. It’s no wonder this site was still struggling with Penguin.

Another problem I have seen is that these automated tools will often flag really good links for removal. In one case, an automated tool flagged some valid press mentions – from BBC News, The Guardian and other great news sources – as unnatural links.

It’s important to remember that the disavow suggestions made by these tools are just that – suggestions – and they need to be reviewed manually.

As such, if you have relied on an automated tool to make disavow decisions, it may be worthwhile to review your disavow file and see if you have accidentally disavowed some good domains. If you have, you can remove those lines from your disavow file and resubmit it.
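As an illustration, here is a minimal Python sketch of that clean-up step. The file names and the reavow list are hypothetical; the point is simply to strip out entries you’ve decided to reavow before resubmitting:

    # Strip "reavowed" entries from a disavow file before resubmitting.
    # File names and the REAVOW list are hypothetical examples.
    REAVOW = {
        "domain:bbc.co.uk",         # valid press mention, disavowed in error
        "domain:theguardian.com",
    }

    with open("disavow.txt", encoding="utf-8") as f:
        lines = f.read().splitlines()

    kept = [line for line in lines if line.strip() not in REAVOW]

    with open("disavow-updated.txt", "w", encoding="utf-8") as f:
        f.write("\n".join(kept) + "\n")

    print(f"Removed {len(lines) - len(kept)} reavowed entries.")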

Google should eventually start to count the link equity from these good sources again. However, it may take a while for the links to start helping again.

Were you super aggressive while removing links for a manual action?

If you’ve ever tried to remove a manual unnatural links penalty from your site, you know that Google can be pretty strict when it comes to giving you that beautiful “spam action revoked” message.

After each failed attempt at reconsideration, site owners often trim away at their links, eventually becoming so desperate that they end up getting rid of almost all of them. In many cases, some of those disavow decisions were unnecessary.

Auditing a disavow file after an overly aggressive link pruning can be tough. You certainly don’t want to try to game the system and reavow links that are unnatural. But if you feel that you have disavowed links that were actually valid, it may be worthwhile to have another look at your link profile.

A word of caution: If you decide to reavow links, be careful. It may be a good idea to have an impartial party review your reavow decisions to make sure that these links really are decent ones.

Did you hire an inexperienced person to do your link audit?

Sadly, this is very common. Knowing which links to disavow can be difficult; no one can accurately predict exactly which links Google wants to see disavowed.

Some decisions are easy, especially when the link is outright spam, but sometimes it can be hard to decide whether to disavow a link or not.

I’ve seen cases where, while performing a link audit, the SEO company decided to disavow every link that was anchored with a keyword.

The issue with this is that not all keyword-anchored links are unnatural. If a major news publication wrote about your company and legitimately linked to you using your keyword as the link anchor, this is a good thing.
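Before disavowing anything based on anchors, it helps to look at the overall anchor-text distribution. A rough Python sketch, assuming you have exported your backlinks to a CSV with an anchor column (the file and column names are assumptions, not a standard export format):

    # Tally anchor-text usage from a backlink export to spot over-optimization.
    # "backlinks.csv" and its "anchor" column are hypothetical; adjust them
    # to whatever your link tool actually exports.
    import csv
    from collections import Counter

    anchors = Counter()
    with open("backlinks.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            anchors[row["anchor"].strip().lower()] += 1

    total = sum(anchors.values())
    for anchor, count in anchors.most_common(10):
        print(f"{anchor!r}: {count} links ({count / total:.1%})")

A profile where one commercial keyword dominates the anchors deserves a closer manual look; a handful of keyword anchors among mostly branded ones usually isn’t a problem.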

Additionally, I’ve witnessed people disavow every single directory link pointing to the site. While directories certainly can be a source of unnatural links, there are many directories that are valid citations and good links. Removing or disavowing links from good directories can destroy your visibility, both in the organic rankings and in the local pack.

I’ve even had cases where small business owners blindly trust an SEO company to do a link audit, only to have that company disavow every single link pointing to their client’s site.

Final thoughts:

My intention in writing this article is not to advise people to try to game Google by reavowing links that were self-made for SEO purposes.

Instead, I would urge you to review your disavow file to see if perhaps you have been too aggressive in disavow decisions. You may find that reavowing some decent links that were disavowed in error eventually results in a positive difference in your rankings.

What do you think? Have you disavowed links in error? What are your experiences and thoughts on reavowing links? Let us know in the comments below.

For more information, visit http://searchenginewatch.com

Did Google Panda 4.0 Go After Press Release Sites?

Nine days ago, when Google released Panda 4.0 – which aims to filter “thin” content out of the top rankings – we focused our attention on the big Winners & Losers charts. Since then, some have noted that press release sites may also have been hit big time.

Using SearchMetrics, I looked at the top press release sites and checked whether they had lost SEO visibility on a large scale since Panda 4.0 hit. PRWeb.com, PR Newswire, BusinessWire and PRLog all seem to have lost significant rankings in Google.

PRNewswire.com seems to have taken a significant hit to its SEO visibility, dropping ~63% after Panda 4.0 was released:

[Chart: PRNewswire.com SEO visibility drop after Panda 4.0]

PRWeb.com, another major press release distribution website, also saw a huge drop, ~71%: Continue reading

Google Webmaster Tools Has Announced a New “Index Status” Feature

Google Webmaster Tools has announced that its “Index Status” feature now tracks a site’s indexed URLs for each protocol (HTTP and HTTPS), as well as for verified subdirectories.

“This makes it easy for you to monitor different sections of your site,” Google’s announcement said. “For example, the following URLs each show their own data in Webmaster Tools Index Status report, provided they are verified separately:”

[Image: Index Status report data, shown separately for each verified URL variation]

The announcement highlighted how the reports change if you have a website on HTTPS, or if some of your content is indexed under different subdomains. The reports will carry an annotation and look something like this: Continue reading

Google’s New Search Layout

The new Google search layout that users began seeing a couple of weeks ago on a limited basis has now gone live for all users.

Google’s new layout – which changes the font, removes underlines from links and displays the AdWords ads at the top differently – has definitely been getting poor reviews as it has rolled out to everybody.

The headlines are larger, the description text seems slightly lighter, and the fonts have been adjusted to a wider typeface.

For AdWords ads, gone is the light yellow background that we have long associated with advertising space. The new style has no shaded background; instead, there is a tiny yellow “Ad” marker next to the green URL, and a line separating the ads from the organic search results.

Beyond the cosmetic change, the new search layout is affecting SEO in a pretty pronounced way. Titles that were optimized to the maximum 70 allowable characters for SEO purposes will now find the same headlines truncated in Google’s new results, giving everyone about 59-60 characters to work with. This means you might have a lot of work ahead of you trying to rework titles so they don’t appear poorly truncated in the search results, which could impact click-throughs to your site.
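If you want a quick way to flag titles at risk, a small sketch like the one below works. The 60-character cutoff mirrors the rough figure above; in reality, Google truncates by rendered pixel width, not an exact character count, and the page paths and titles here are invented:

    # Flag page titles likely to be truncated in Google's new layout.
    # The ~60-character threshold is approximate: truncation actually
    # depends on rendered pixel width, not a fixed character count.
    MAX_CHARS = 60

    titles = {  # hypothetical pages and titles
        "/pricing": "Acme Widgets Pricing | Plans for Small Teams and Enterprises",
        "/about": "About Acme Widgets",
    }

    for path, title in titles.items():
        if len(title) > MAX_CHARS:
            print(f"{path}: {len(title)} chars - likely truncated: {title!r}")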

The first time they saw the changes, many users thought their search had actually been hijacked, or that they were falling victim to Google spoofing, because the search results looked completely different. And the reviews definitely aren’t good across the board, judging from the comments of very upset searchers – some of whom actually switched to another search engine strictly because of the new look.

To me, it looks like a throwback to how search engine results looked 10 to 15 years ago, on engines such as WebCrawler or HotBot, not something refreshed for 2014. And I do agree with the many people who say the new font makes results much harder to read and scan on a desktop.

Google’s logic behind the change is that they first made these design changes for mobile and tablets, and have now carried several of them over to desktop users. Google said this creates improved readability and a much cleaner look, with the end goal of a consistent user experience across devices (desktop, mobile, tablet):

Towards the end of last year we launched some pretty big design improvements for Search on mobile and tablet devices (mobile first! :-). Today we’ve carried over several of those changes to the desktop experience.

We’ve increased the size of result titles, removed the underlines, and evened out all the line heights. This improves readability and creates an overall cleaner look. We’ve also brought over our new ad labels from mobile, making the multi-device experience more consistent.

Improving consistency in design across platforms makes it easier for people to use Google Search across devices and it makes it easier for us to develop and ship improvements across the board.

Will we see any of these changes reverted? Hard to say, but Google doesn’t often revert changes once they’ve jumped in and made them.

Click Here to Read More Related Stories

Google Starts Penalizing Sites for Rich Snippet Spam

If you use rich snippets on your websites, you should be aware that Google is now penalizing websites for spamming structured data markup.

The new warning was first mentioned in a post on the Google Webmaster Central forums from a user asking for clarification about the warning and what the issue could be. It is a manual action penalty based on incorrect usage of markup, regardless of whether it was deliberate spam or simply a mistake.

The warning that appears in a user’s account when a manual action has been taken is:

Markup on some pages on this site appears to use techniques such as marking up content that is invisible to users, marking up irrelevant or misleading content, and/or other manipulative behavior that violates Google’s Rich Snippet Quality guidelines.

The writing was on the wall for rich snippet penalties back in October at Pubcon, when Google’s Matt Cutts talked about changes Google was planning for rich snippets and how it would deal with snippet spam.

Rich snippets could get a revamp, with Google dialing back the number of websites able to display them. “More reputable websites will get rich snippets while less reputable ones will see theirs removed,” says Matt.

The new penalty seems to affect websites that are misusing rich snippets, such as including authorship markup on homepages or review markup on pages that have no reviews. That said, there was evidence that Google was trying to educate webmasters on correct usage when it added debugging support for structured data back in December.
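For contrast, legitimate review markup describes content that is actually visible on the page. A minimal sketch using schema.org microdata (the product, reviewer and rating are invented for illustration):

    <!-- Hypothetical example: the markup mirrors visible on-page content -->
    <div itemscope itemtype="http://schema.org/Review">
      <span itemprop="itemReviewed" itemscope itemtype="http://schema.org/Product">
        <span itemprop="name">Acme Widget</span>
      </span>
      reviewed by <span itemprop="author">Jane Doe</span> -
      <span itemprop="reviewRating" itemscope itemtype="http://schema.org/Rating">
        <span itemprop="ratingValue">4</span> out of
        <span itemprop="bestRating">5</span>
      </span>
    </div>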

If you’re unsure whether you’re using rich snippets correctly, first check your Webmaster Tools account to see whether anything shows up, either as flagged issues or in the structured data debugging area. Google also has a pretty extensive help area for rich snippets, including videos, to help webmasters implement structured data correctly.

Click Here To Read More Related Stories

Google’s Matt Cutts: We Don’t Use Twitter Or Facebook Social Signals To Rank Pages

Google’s head of search spam, Matt Cutts, released a video today answering the question, “are Facebook and Twitter signals part of the ranking algorithm?” The short answer was no.

Matt said that Google does not give any special treatment to Facebook or Twitter pages. They are, in fact, currently treated like any other page, according to Matt Cutts.

Matt then addressed whether Google does any special crawling or indexing for these sites, such as indexing the number of likes or tweets a specific page has. Matt said Google does not do that right now. Why?

They did at one point, and then they were blocked. I believe Matt was referring to Google’s real-time search deal with Twitter expiring. Matt explained that they put a lot of engineering time into it, and then they were blocked, and that work and effort was no longer useful. So for Google to put more engineering time into this and then be blocked again just doesn’t pay.

Another reason: Google is worried about crawling identity information at one point, only for that information to change without Google seeing the update until much later. Having outdated information can be harmful to some people. Continue reading