It’s Mobilegeddon hot in here: Google strengthens its mobile-friendly ranking signal

If you have a mobile-friendly website, stand down. This news is not for you.

If you don’t have a mobile-friendly website, my goodness, what the hell are you doing? Have you not had enough warnings already? Stop reading this right now and go get a responsive website immediately, you maniac!

So for anyone else reading this news – probably one person: my boss, the editor-in-chief of ClickZ (hi Graham!) – here’s a quick update on one of the biggest changes Google has made to the way it ranks websites in the last few years.

According to an announcement yesterday, from the beginning of May 2016 Google will increase the strength of its ‘mobile-friendly’ ranking signal.


Should you audit your disavow file?

In October 2012, Google gave us the Disavow tool.

This allowed webmasters to instruct Google not to count the link metrics that flowed either through certain links or from certain domains.
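For reference, the disavow file itself is just a plain text file with one entry per line: a `domain:` prefix disavows every link from that domain, a bare URL disavows a single page, and lines beginning with `#` are comments. A minimal sketch (the domains and URL below are placeholders):

```text
# Contacted these sites in March; no response, links still live
domain:spammy-directory.example
domain:paid-links.example

# A single page we could not get taken down
http://article-farm.example/guest-post-with-keyword-anchors.html
```

To reavow a link later, you remove its line from this file and upload the edited file again; the most recently uploaded file replaces the previous one.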

If you’ve had a manual penalty or have been dealing with Google’s Penguin algorithm, you’ve probably filed a disavow file.


For the last two years, I have reviewed a large number of disavow files, many of which were harming rather than helping their sites’ rankings. In some cases, I have suggested auditing the disavow file to determine whether it should be modified and resubmitted.

Here are some possible scenarios in which a site owner may make the decision to thoroughly review their disavow file.

Have you relied on link auditing software to make disavow decisions?

Some link auditing tools can be quite helpful when it comes to organizing your links. But it is vitally important that you take this well-organized list and manually review the disavow suggestions the tool has made.

I have seen far too many businesses blindly take the report generated from one of these tools and submit it directly to Google. Usually when this happens, a large number of unnatural links go undetected and are not disavowed.

Recently, while viewing the backlink profile of a site that had relied on an automated tool for its disavow decisions, I discovered unnatural links from 965 new domains that had not been disavowed. It’s no wonder this site was still struggling with Penguin.

Another problem I have seen is that these automated tools often flag really good links for removal. In one case, an automated tool flagged some valid press mentions – from BBC News, The Guardian, and other reputable news sources – as unnatural links.

It’s important to remember that the disavow suggestions made by these tools are just that: suggestions. They need to be reviewed manually.

As such, if you have relied on an automated tool to make disavow decisions, it may be worthwhile to review your disavow file and see if you have accidentally disavowed some good domains. If you have, you can remove those lines from your disavow file and resubmit it.

Google should eventually start to count the link equity from these good sources again. However, it may take a while for the links to start helping again.

Were you super aggressive while removing links for a manual action?

If you’ve ever tried to remove a manual unnatural links penalty from your site, you know that Google can be pretty strict when it comes to giving you that beautiful “spam action revoked” message.

After each failed attempt at reconsideration, site owners often trim away at their links, eventually becoming so desperate that they end up getting rid of almost all of them. In many cases, this leads to unnecessary disavow decisions.

Auditing a disavow file after an overly aggressive link pruning can be tough. You certainly don’t want to try to game the system and reavow links that are unnatural. But if you feel that you have disavowed links that were actually valid, it may be worthwhile to have another look at your link profile.

A word of caution: If you decide to reavow links, be careful. It may be a good idea to have an impartial party review your reavow decisions to make sure that these links really are decent ones.

Did you hire an inexperienced person to do your link audit?

Sadly, this is a very common experience. Knowing which links to disavow can be difficult, and no one can accurately predict exactly which links Google wants to see disavowed.

Some decisions are easy, especially when the link is outright spam, but sometimes it can be hard to decide whether to disavow a link or not.

I’ve seen cases where, while performing a link audit, the SEO company decided to disavow every link that was anchored with a keyword.

The issue with this is that not all keyword-anchored links are unnatural. If a major news publication wrote about your company and legitimately linked to you using your keyword as the link anchor, this is a good thing.

Additionally, I’ve witnessed people disavow every single directory link pointing to the site. While directories certainly can be a source of unnatural links, there are many directories that are valid citations and good links. Removing or disavowing links from good directories can destroy your rankings, both in the organic rankings and in the local pack.

I’ve even had cases where small business owners blindly trust an SEO company to do a link audit, only to have that company disavow every single link pointing to their client’s site.

Final thoughts:

My intention in writing this article is not to advise people to try to game Google by reavowing links that were self-made for SEO purposes.

Instead, I would urge you to review your disavow file to see if perhaps you have been too aggressive in disavow decisions. You may find that reavowing some decent links that were disavowed in error eventually results in a positive difference in your rankings.

What do you think? Have you disavowed links in error? What are your experiences and thoughts on reavowing links? Let us know in the comments below.

For more information, visit: http://searchenginewatch.com

Google Launches “Trash Can,” Which Recovers Your Deleted Analytics Data

A new recovery feature for Google Analytics means that users will never again have to worry about deleting data or accounts. Today, Google launched “Trash Can,” which allows users to undo deletes in Google Analytics.

Trash Can is a safety net that saves information each time users delete a view, property, or account from Google Analytics. To use Trash Can, Analytics users simply select an account from the Administration tab and click the Trash Can icon. A list of deleted items then appears; users check off the information they want reclaimed and hit restore to return the data to its previous state.


The only catch is that Trash Can only stores information for 35 days. After that, it gets deleted permanently.

While the feature hasn’t officially rolled out to all Google Analytics accounts, information is being stored in Trash Can as of today. When the feature does appear in Analytics accounts, all deleted information should appear in the Trash Can folder.

Trash Can is a direct response to user feedback, according to a Google rep. “We heard from a lot of users that had mistakenly deleted their accounts, properties, and views. Especially in a multi-user environment, mistakes like this happen too often. Trash Can gives users a safety net, a chance to recover things before being deleted forever.”

Google to Shut Down Orkut Social Network in September


Google has called time on its social media service Orkut, saying that success in other areas of the business has made its existence irrelevant.

Google’s engineering director Paulo Golgher said the service would close on September 30, as the company decides to concentrate on its more popular, and more profitable, areas of business.

“Over the past decade, YouTube, Blogger and Google+ have taken off, with communities springing up in every corner of the world. Because the growth of these communities has outpaced Orkut’s growth, we’ve decided to bid Orkut farewell,” he said. “We’ll be focusing our energy and resources on making these other social platforms as amazing as possible for everyone who uses them.”

Google has given users a three-month period to export profile data using its Google Takeout service; however, no new Orkut profiles can be created.

“We are preserving an archive of all public communities, which will be available online starting 30 September 2014. If you don’t want your posts or name to be included in the community archive, you can remove Orkut permanently from your Google account,” Golgher added. “It’s been a great 10 years, and we apologize to those still actively using the service.”

Google regularly shuts down services in ‘spring cleans’ of its projects and tools, as it looks to ensure it doesn’t waste too much time and resources on failed ideas.

Orkut was Google’s first attempt to enter the social media market, but it failed to generate much enthusiasm, although it did prove popular in Brazil.

For more information, visit: http://searchenginewatch.com/article/2353148/Google-to-Close-Orkut-Social-Netwok

Did Google Panda 4.0 Go After Press Release Sites?

Nine days ago, when Google released Panda 4.0 – which aims to filter “thin” content out of the top rankings – we focused our attention on the big Winners & Losers charts. Since then, some have noted that press release sites may also have been hit big time.

Using SearchMetrics, I looked at the top press release sites and checked whether they had lost SEO visibility on a large scale since Panda 4.0 hit. PRWeb.com, PR Newswire, BusinessWire and PRLog all seem to have lost significant rankings in Google.

PRNewsWire.com shows a significant drop in SEO visibility, down roughly 63% since Panda 4.0 was released.


PRWeb.com, another major press release distribution site, also saw a huge drop, of roughly 71%.

Google Webmaster Tools Has Announced a New “Index Status” Feature

Google Webmaster Tools has announced that its “Index Status” feature now tracks a site’s indexed URLs for each protocol (HTTP and HTTPS), as well as for verified subdirectories.

“This makes it easy for you to monitor different sections of your site,” Google’s announcement said. “For example, the following URLs each show their own data in Webmaster Tools Index Status report, provided they are verified separately:”


The announcement highlighted how the reports change if you have a website on HTTPS, or if some content is indexed under different subdomains; the reports will include an annotation marking the change.

Google’s New Search Layout

The new Google search layout that users began seeing a couple of weeks ago on a limited basis has now gone live for all users.

Google’s new layout – which changes the font, removes underlines from links, and displays the AdWords ads at the top differently – has definitely been getting poor reviews as it rolled out to everybody.

The headlines are larger, the description text seems slightly lighter, and the fonts have been adjusted to a wider typeface.

For AdWords ads, gone is the light yellow background that we have long associated with advertising space. The new style has no shaded background; instead, there is a tiny yellow “Ad” marker next to the green URL, and an underline separating the ads from the organic search results.

Beyond the cosmetic change, the new search layout is affecting SEO in a pronounced way. Titles that were optimized to the previous maximum of 70 allowable characters will now be truncated in Google’s new results, leaving everyone roughly 59-60 characters to work with. This means you may have a lot of work ahead of you reworking titles so they don’t appear poorly truncated in the search results, which could hurt click-throughs to your site.
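As a rough pre-flight check, you could scan your existing titles against the new limit. A minimal sketch (the 60-character cutoff follows the estimate above; Google’s real limit is pixel-based, so treat this as an approximation):

```python
def truncate_title(title: str, limit: int = 60) -> str:
    """Return the title roughly as Google would display it."""
    if len(title) <= limit:
        return title
    # Cut at the last space before the limit so no word is split,
    # leaving room for the ellipsis character.
    cut = title.rfind(" ", 0, limit - 1)
    if cut == -1:          # one giant unbroken word: hard-cut it
        cut = limit - 1
    return title[:cut].rstrip() + "…"

short = "Google's New Search Layout"
long_title = ("Google Starts Penalizing Sites for Rich Snippet Spam "
              "and Other Structured Data Markup Violations in 2014")

print(truncate_title(short))       # under 60 chars, returned unchanged
print(truncate_title(long_title))  # trimmed at a word boundary
```

Running this over an export of your page titles would quickly surface the ones likely to be cut off in the new results.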

The first time they saw the changes, many users thought their search had been hijacked, or that they were falling victim to Google spoofing, because the results looked completely different. And the reviews definitely aren’t good across the board, judging from all the comments by very upset searchers, some of whom actually switched to another search engine strictly because of the new look.

To me, it looks like a throwback to how search engine results looked 10 to 15 years ago, such as on Webcrawler or Hotbot, not something that has been refreshed for 2014. And I do agree with many people who say the new font makes it much harder to read and scan when on a desktop.

Google’s logic behind the change is that the design was first built for mobile and tablet, and several of those design changes have now been carried over to desktop. Google says this creates improved readability and a much cleaner look, with the end goal of a consistent user experience across devices (desktop, mobile, tablet):

Towards the end of last year we launched some pretty big design improvements for Search on mobile and tablet devices (mobile first! :-). Today we’ve carried over several of those changes to the desktop experience.

We’ve increased the size of result titles, removed the underlines, and evened out all the line heights. This improves readability and creates an overall cleaner look. We’ve also brought over our new ad labels from mobile, making the multi-device experience more consistent.

Improving consistency in design across platforms makes it easier for people to use Google Search across devices and it makes it easier for us to develop and ship improvements across the board.

Will we see any changes reverted? Hard to say, but Google doesn’t often revert its changes once it has jumped in and made them.


Google Starts Penalizing Sites for Rich Snippet Spam

If you use rich snippets on your websites, you should be aware that Google is now penalizing websites for spamming structured data markup.

The new warning was first mentioned in a post on the Google Webmaster Central forums by a user asking for clarification about the warning and what the issue could be. It is a manual action penalty based on incorrect usage of markup, regardless of whether it was deliberate spam or simply a mistake.

The warning that appears in a webmaster’s account when manual action has been taken is:

Markup on some pages on this site appears to use techniques such as marking up content that is invisible to users, marking up irrelevant or misleading content, and/or other manipulative behavior that violates Google’s Rich Snippet Quality guidelines.

The writing was on the wall for rich-snippet-related penalties back in October at Pubcon, when Google’s Matt Cutts talked about changes Google was planning for rich snippets and for dealing with snippet spam.

Rich snippets could get a revamp, and Google will dial back the number of websites able to display them. “More reputable websites will get rich snippets while less reputable ones will see theirs removed,” says Matt.

The new penalty seems to affect websites that are misusing rich snippets, such as including authorship on homepages and reviews on pages where there are no reviews. But there was evidence that Google was attempting to educate webmasters on how to use it correctly when they made changes in December to add debugging support for structured data.
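For contrast, legitimate review markup describes a review that is actually visible on the page. A minimal sketch using schema.org’s Review type, shown here as JSON-LD (the product, rating, and author values are placeholders):

```json
{
  "@context": "https://schema.org",
  "@type": "Review",
  "itemReviewed": {
    "@type": "Product",
    "name": "Example Widget"
  },
  "reviewRating": {
    "@type": "Rating",
    "ratingValue": "4",
    "bestRating": "5"
  },
  "author": {
    "@type": "Person",
    "name": "Jane Doe"
  }
}
```

The penalty described above targets pages where markup like this is present but the corresponding review is invisible to users or simply doesn’t exist.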

If you’re unsure whether you’re using rich snippets correctly, first check your Webmaster Tools account to see whether any issues show up, either as flagged issues or in the structured data debugging area. Google also has a fairly extensive help area for rich snippets, including videos, to help webmasters implement structured data correctly.


Google’s Matt Cutts: We Don’t Use Twitter Or Facebook Social Signals To Rank Pages

Google’s head of search spam, Matt Cutts, released a video today answering the question, “are Facebook and Twitter signals part of the ranking algorithm?” The short answer was no.

Matt said that Google does not give any special treatment to Facebook or Twitter pages. They are, in fact, currently treated like any other web page, according to Matt Cutts.

Matt then addressed whether Google does special crawling or indexing for these sites, such as indexing the number of likes or tweets a specific page has. Matt said Google does not do that right now. Why?

Google did, at one point, and then it was blocked. I believe Matt was referring to Google’s real-time search deal with Twitter expiring. Matt explained that Google put a lot of engineering time into it, and then it was blocked, so that work and effort was no longer useful. For Google to put more engineering time into this and then be blocked again just doesn’t pay.

Another reason: Google is worried about crawling identity information at one point, only to have that information change without Google seeing the update until much later. Having outdated information can be harmful to some people.

5 Social Media Advertising Trends to Watch for in 2014

Social Media Advertising Trends

Social media advertising is what traditional search engine marketing was 5 years ago: constantly in development, with major changes coming every other week. That’s not to say that SEM has stagnated and development has stopped – it hasn’t (enhanced campaigns, anyone?).

The point is that 2013 has seen more changes and bigger leaps forward in social media advertising than any year before. The development boom hasn’t slowed down at all this year and is propelling the entire social media space forward into 2014.

Here are five trends in social media advertising to watch out for next year.

More Twitter Ads

Twitter had a big year in 2013. The social media network saw changes across the user interface on desktops and mobile devices. The ad platform grew leaps and bounds. And perhaps most importantly, Twitter filed for and successfully completed their IPO.

Previously I wrote about what that meant for Twitter Ads. In short, using Facebook’s IPO and subsequent development boom as an example, it is safe to assume we should expect big things from Twitter. Throughout 2013 and leading up to the IPO Twitter successfully overhauled the advertising interface and targeting functionality, launched TV ad targeting and created the impressive Lead Generation Cards.

Since the IPO in November, the Twitter development train has kept right on chugging. Now we have Tailored Audiences (remarketing), promoted accounts in timelines, and a true “broad match” for keyword targeting.

Now that Twitter is beholden to shareholders, expect this kind of rapid development in the advertising platform to continue. Ad revenue = happy shareholders.

Facebook Video Ads

TV advertising was (and still is) a fundamental piece of the modern advertising puzzle. Video has proven time and again to have a significant impact on branding and purchasing decisions.

Now that the world is shifting its television habits online, the video advertising model must shift with it. While the devices are different, the end customer will still be drawn to (good) video advertising.

YouTube is already killing it with its video ad campaign product. The InStream ad unit alone has so much potential. So, how does all of this tie in to Facebook? Videos are prominently featured organically in Facebook feeds. After Facebook tested “autoplay” video content organically, it was a natural progression to see it come to video ads.

This ad unit is still in testing at Facebook and the video plays on silent – so the impact won’t be quite the same as what is seen on YouTube. However, I fully expect this testing to expand and for video to play a bigger role in Facebook ads in 2014. If Facebook is successful with video, who knows what implications that will have across the social media world?

Ads on Google+?

A few years into Google’s gamble on social media, Google+ is still completely devoid of ads. Many people are likely quite pleased by this. Countless others (no doubt advertisers) are chomping at the bit, waiting for the day they can place targeted ads on this coveted real estate.

Will it happen in 2014? Well, just a few weeks ago it came to light that Google will begin testing +Post Ads. While this isn’t an ad unit on Google+, it allows brands to take quality Google+ posts and advertise them across the Google Display Network.

Could this be a baby step toward a larger advertising platform built around Google+? Maybe. In the meantime, all we can do is keep on wishing!

Fragmentation vs. Variety

New social media networks pop up nearly every week, it seems. The vast majority of them burn out before they secure mass appeal and adoption. A few maintain relevance and start to grow. Pinterest, Snapchat, and Instagram are just a few examples of those that have found staying power. (Instagram is owned by Facebook, so that’s not exactly a fair comparison.)

With mass-appeal and adoption comes the added responsibility of creating revenue. Pinterest and Instagram are both experimenting with fledgling advertising platforms. Reddit, Foursquare, Tumblr and others are already providing self-serve ad campaigns.

Moving into 2014, this race to create new social media networks and advertising platforms to support them will create an interesting dilemma for advertisers. On one hand, you have a variety of channels that cater to unique targets and demographics. On the other hand, you have the issue of fragmentation. For every new social network and ad platform, you’re forced to deal with varied ad units (sizes, editorial policies, etc.) and management requirements.

The tools to consolidate management of these channels simply don’t exist – either that, or the channels themselves are not sophisticated enough to have APIs, etc. (looking at you, LinkedIn Ads!). As the greater world of social media continues to grow and more networks take root in 2014, expect your role as an advertiser to only become more complicated. Good luck!

Blurred Lines Between Paid and Organic Content

The original Facebook ad unit (appearing on the right) was simple and straightforward. It followed the general format pioneered by the search engines through PPC. Safe. Predictable.

Then came Twitter and the promoted tweets for timelines – an ad that fit seamlessly into the organic construct of the social experience. Since then Facebook and LinkedIn have followed suit with newsfeed ad units that have boosted ad engagement and click-through rates while simultaneously blurring the lines of what is organic or paid social content.

There’s a precedent for this. Look at the search engines. Google, Bing and Yahoo all have conducted numerous tests to make the PPC ads look more like organic listings. Why? More people click on them.

Now social networks are up to the same tricks – only in a far more aggressive (nay, effective) manner. Promoted status updates blend in seamlessly save for a simple marker that says “sponsored.” From a mobile device the average user may never realize that the status update they just read and clicked on was actually an ad.

This trend is only going to grow in 2014. New ad units at Instagram and Pinterest are designed to blend into the natural landscape of the user’s social feed. Facebook is concerned with dwindling usage and has gone so far as to suppress organic content – effectively forcing advertisers to promote their content.

Will the users of these social sites continue to be OK with this? Time will tell.

Summary

It has been an incredible year of growth for social media advertising, with new features, targeting options, and channels to explore. But 2013 has merely been a preview of what is to come. Here’s to an exciting 2014!


Role & Benefits of Business Intelligence Services in Current & Small Enterprises and Databases

With the regular ups and downs in the economy, business owners have to look for better and more effective business solutions. Business intelligence services are one of them. A business intelligence service joins all of a business’s processes together to present unified development for the entire organization, using current technology. Business and technology have been working together for quite some time now, contributing to the growth of the organization as a whole.

The needs of businesses and companies keep changing with time. It is essential to have the latest technology and machinery so that a company does not lag behind in the race to become number one. That can mean better software, finances, machines, better resources, cheaper storage options, and better data usage for the entire organization. Data storage, and easy access to that data, is a must for the development of the company.

Technology firms that provide business intelligence services are aware of businesses’ need for data storage, so they provide online data storage services. By opting for these services, you can store your entire organization’s data as digital files, saving as much data as possible. Taking up these services does not require upgrading all your systems; a business process automation system will take care of it. All data files will be available online to every employee of the company. To make sure the data is safe and secure, it should be locked and accessible only with an ID and password.

Data can consist of the details of new projects and clients, what is currently happening in those projects, who is working on them, the company’s financial details, and details of its resources. Each department can save its data online, from where other departments that require the data can access it easily with their ID and password.

Moreover, business intelligence services are affordable. For small-scale companies, providers offer custom web applications and software. The entire program or software will be customized to your requirements so that unnecessary charges can be avoided. Use this business intelligence program intelligently, with the help of the technology firm, and find out what more they can do to help your company generate more profits.

It won’t be an exaggeration to say that technology has intervened in every part of our lives, and the business world, both online and offline, is witnessing new concepts every day that promise to make business better and easier.

It is a lot more economical these days to outsource data warehousing, due to firm competition between the numerous online firms offering these services at very competitive rates, and the trend is still on the increase. When talking about data warehousing, one cannot simply ignore the possibility of data integration problems if it is done in an improper manner or order. The services of data integration consultants therefore become inevitable, as they have effective and economical solutions to a business’s problems.

Matt Cutts: Don’t Stitch Copied Content from Multiple Sites

While many marketers and site owners understand the importance of quality, original content in their marketing plans, there are still those who are testing the boundaries of what that actually means. In a recent Google Webmaster Help video, Matt Cutts talked about the line between spam content and original content, once again.

The question was: Can you copy small portions of content from multiple websites and combine it to create one new web page, and compete in Google with that?


“I fear you may be heading for heartbreak,” Cutts replied in the video. He added that if this is something you’re considering, you should really be asking yourself why you’re doing it in the first place. For example, are you trying to automate?

Cutts began explaining the practice in question by talking about Yahoo’s distaste for it in days past. Yahoo referred to this tactic as “stitching,” and “they really considered that spam,” said Cutts.

Matt Cutts: Larger Websites Don’t Automatically Rank Higher on Google


Webmasters have always gone by the rule that the more pages a website has indexed, the better it does in Google. Not only do you have a larger website overall, but you can also capture a lot of long-tail search traffic from visitors who land on one of those many internal pages.

But does Google really rank sites higher or better if the website has more pages indexed in the search results? This is the topic of the latest webmaster help video.

Updated Facebook Graph Search Now Lets You Search for Posts, Status Updates, Check-ins, Comments

Facebook’s Graph Search has been updated to allow people to search in greater depth on Facebook. The new Graph Search now allows users to search for not just the usual people and places, but also to search posts, status updates, check-ins, and comments on Facebook.

The update also now allows you to search within a period of time, such as posts within the last month or year.

You can also search for comments made by people posting from a specific location. Users can also search specifically for posts they have commented on, search their own posts, and narrow those searches to a specific time period.

Google Webmaster Tools Give Users More Link Data

Google’s Distinguished Engineer Matt Cutts kicked off SES San Francisco this morning by announcing a change to the way Google Webmaster Tools serves backlinks to users. Now, instead of getting a huge list of backlinks in alphabetical order, users get a better representation of all their backlinks.

“If I download my backlinks in Webmaster Tools, my list ends at H. If you are Amazon or eBay, you get 000000a.com to 000000c.com,” Cutts said.

When Google served 100,000 backlinks in Webmaster Tools, it wasn’t very useful, because users could get that many results from a single domain and there was no way to sort them.

Twitter New Update: Lead Generation Cards for All Advertisers, Conversation View

Twitter was busy this week with two announcements: the availability of Lead Generation Cards to all advertisers, and an easier way to follow conversations happening on Twitter.

Lead Generation Cards Available to All

Back in May, Twitter announced “Lead Generation Cards,” which allowed a select group of advertisers to experiment with capturing emails right in a tweet. This week, Twitter made the feature available in all languages and to advertisers of all sizes, from big to small businesses.

Google Keyword Tool Has Been Replaced By Keyword Planner

The Google Keyword Tool, used by SEO professionals the world over, has officially been retired. While sentiment about this change has been mixed, there are both positive and negative aspects to being forced to use the Keyword Planner instead of the Keyword Tool.

Google AdWords Adds New Paid and Organic Report

Google AdWords is introducing a new feature that gives advertisers more data right within the AdWords interface, even when it isn’t specific to paid ads. This is part of Google’s campaign to connect data between Google AdWords, Google Analytics, and Webmaster Tools.

The new paid & organic report can help advertisers see their search footprint and determine whether there are keyword areas that could be supplemented with paid advertising. It also lets you view detailed reports showing, for particular keywords, how much organic traffic and how much advertising traffic you are getting or have the potential to get.

Google suggests several ways advertisers can put the organic traffic information to work for their business. You can:

  • Look for additional keywords where you might have impressions on natural search, but without any related ads.
  • Use it to optimize for your most important high-value keyword phrases, so you can see where your presence needs to improve.
  • Use it to test website improvements and AdWords changes, as you can compare traffic across both AdWords and organic in the same interface, which enables you to adjust accordingly. Continue reading
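To picture the first suggestion: once you export keyword stats from a report like this, finding organic-only opportunities is a simple set difference. The numbers below are made up purely for illustration:

```python
# Hypothetical keyword impression counts pulled from a paid & organic report.
organic = {"running shoes": 1200, "trail shoes": 450, "shoe repair": 300}
paid = {"running shoes": 800}

# Keywords with organic impressions but no related ads are
# candidates for new AdWords coverage (Google's first suggestion above).
organic_only = {kw: imp for kw, imp in organic.items() if kw not in paid}

for kw, imp in sorted(organic_only.items(), key=lambda x: -x[1]):
    print(f"{kw}: {imp} organic impressions, no ads running")
```

The same comparison run the other way (paid-only keywords) tells you where ads are carrying traffic that organic rankings haven’t earned yet.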

How To Measure SEO Success

Correctly measuring the success of an SEO campaign can vary greatly depending on the type of business you’re in and your objectives.

However, there are three key performance indicators (KPIs) that should always be considered when measuring an SEO campaign’s effectiveness:

  • Rankings
  • Traffic
  • Conversions

Not only can the information gathered from these three KPIs enable you to accurately measure your campaign’s performance, they can also provide you with actionable data to improve your campaign over time.


Keyword rankings are the most common and obvious KPI, especially when studies show that websites listed on the first page of Google receive up to 92 percent of traffic share. Tracking keyword rankings over time gives you the ability to craft your SEO strategy around the keywords that require the most attention and provide the most benefit.

For example, let’s say you’re tracking 20 keywords, and all but five of these are on the first page of Google. You know that in order to get these five keywords on the first page, you will have to invest more optimization efforts into them.

On the other hand, you may discover that these keywords are simply too competitive, and based on your research, would not provide enough benefit to warrant the effort. It would be more beneficial to focus efforts on the other 15 keywords in order to get them into the top three positions, where they’ll really pay off. Without keyword ranking data, making informed strategic decisions such as this would be very difficult. Continue reading
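That triage (which tracked keywords are already paying off, and which need more effort) is easy to sketch in code. The positions below are hypothetical:

```python
# Hypothetical tracked positions (keyword -> current Google rank).
rankings = {
    "seo tools": 3,
    "link audit": 7,
    "anchor text": 2,
    "keyword research": 14,
    "disavow guide": 22,
}

FIRST_PAGE = 10  # positions 1-10 appear on Google's first page

# Split the keyword set into "already ranking well" and
# "needs more optimization effort (or a competitiveness check)".
on_page_one = sorted(kw for kw, pos in rankings.items() if pos <= FIRST_PAGE)
needs_work = sorted(kw for kw, pos in rankings.items() if pos > FIRST_PAGE)

print("Page one:", on_page_one)
print("Needs attention:", needs_work)
```

In practice you’d feed this from a rank-tracking tool’s export rather than a hard-coded dict, but the decision logic is the same.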

Google Analytics: New Changes To Advanced Segmentation Features

Google Analytics has evolved again. The Analytics team has announced new changes to the advanced segmentation features. First and foremost, there’s a new interface that simplifies the process of building a segment. Second, there’s a slight – but important – change in how you can segment.

Google Analytics New User Segment

Image Credit: Justin Cutroni

The new interface seriously changes the process of selecting built-in advanced segments or creating a new segment. Similar to the new Admin area, the segmentation interface is less colorful, but more compartmentalized.

Google Analytics Segmented Traffic

The change in advanced segments also brings with it a change to the reporting area. A new bar will appear instead of an advanced segments button.

The new bar contains a small button with a downward-facing arrow that opens the advanced segments interface. Icons representing each applied segment appear to the right. The icons also contain donut graphs with labels indicating the percentage of the population each segment represents.

At first blush, the interface appears to offer fewer options. But once you dig into it, you’ll find it actually contains more options for creating new segments than ever before.

Users, Not Visits

The biggest change from yesterday’s announcement isn’t actually the interface, but in how segments can be created. Previously, when you created an advanced segment, the segments were based on visits, not actual users. With this new change, segments are based on individual users. Continue reading

Search Ranking Factors Survey Results 2013 – SEOmoz

Google Overall Survey Algorithm

Moz has released its 2013 Search Engine Ranking Factors survey, polling 120 SEO professionals and having them rate different search factors. While this isn’t the full survey data, it does have a lot of interesting information to consider when you’re optimizing websites for search engines.

Moz uses search correlations to make estimates as to what is being used in Google’s ranking algorithm, based upon features of higher-ranking sites versus lower-ranking ones. They used over 14,000 keywords from Google AdWords across multiple categories, then used those keywords to extract the top 50 organic search results in June, post-Penguin 2.0.
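For the curious, the kind of rank correlation this methodology relies on (Spearman) is easy to compute by hand. Here’s a toy version with made-up linking-domain counts for five ranking positions – this is not Moz’s actual code or data, just the underlying idea:

```python
def ranks(values):
    # Rank values (1 = smallest); assumes no ties, for simplicity.
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman(x, y):
    # Spearman rank correlation via the classic d-squared formula.
    n = len(x)
    rx, ry = ranks(x), ranks(y)
    d2 = sum((a - b) ** 2 for a, b in zip(rx, ry))
    return 1 - 6 * d2 / (n * (n * n - 1))

# SERP positions 1..5 vs. a hypothetical feature value for each page.
positions = [1, 2, 3, 4, 5]
linking_domains = [420, 380, 150, 90, 60]

# Negative here: the feature rises as the rank number falls (better rank).
print(spearman(positions, linking_domains))
```

A strongly negative value means the feature grows as rank position improves – correlation, of course, not proof that the feature is what Google rewards.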

One interesting correlation: despite SEOs knowing that over-optimization of keyword anchor text can be problematic and a sign of spamming, the correlations for exact-match and partial-match anchors were fairly high. Not surprisingly, though, the SEOs surveyed believed that diversity in anchor text, including both branded and non-branded terms, was more important than the number of links themselves.

Moz also looked at on-page keywords and, not surprisingly, found a very high correlation for keywords appearing in body text, title, meta description, and H1 tags. Likewise, the SEOs surveyed believed that including keywords in both the title and on the page are important factors.

Moz also discovered that rankings of exact-match domains (i.e., keyword1keyword2.com) have declined over the past year, although the correlation is still high. While having keywords in the domain name was extremely important many years ago, SEOs have definitely shied away from exact-match keywords in favor of partial-match domains or branded URLs.

Moz’s look at social signals and their correlation with rankings is also worth noting. Google +1’s came out ahead of Facebook shares and tweets as correlated factors, although the surveyed SEOs did not believe that social signals are very important to Google’s search algorithm.

Their last look at ranking factors was in 2011, when they were still known as SEOmoz.

The full survey will be released by Moz in a few weeks.

Death of AdWords Keyword Tool – Google Pushes Users to Keyword Planner

Adwords Keyword Tool

The Google AdWords Keyword Tool has long been in the toolkit of most search engine marketers, regardless of whether they used Google AdWords. In fact, it was one of the first keyword tools made freely available to webmasters for finding keyword suggestions and building keyword lists.

But users of this free tool are now seeing a notice stating that in the coming months the external Keyword Tool will no longer be available. Instead, Google is steering users to the new Keyword Planner it launched earlier this year.

The Keyword Planner is actually a combination of the Keyword Tool and the Traffic Estimator, so pay-per-click advertisers on Google can see not only the suggested keywords but traffic estimates at the same time, along with some other relevant data Google includes.

Unfortunately, a lot of the data that we are used to seeing from the Keyword Tool hasn’t been integrated into the Keyword Planner, such as the ad share statistics. Google says they are working on adding a new column to give insight into ad impression share data.

The local search trends and search share columns have also been removed. However, some of the other data has changed to be available only when you search for specific data or access your historical statistics.

The new Keyword Planner tool is still free to use. However, you must have a Google AdWords account, and be logged into that account, in order to use it. Continue reading

Why Buying Facebook Likes is a Bad Move

You might have come across some services that offer to provide you with hundreds or even thousands of Facebook ‘Likes’, and for anyone new to Facebook with just a handful of fans, this can seem like a very tempting offer.

Facebook

Firstly though, you need to stop and ask yourself – why do you want to buy likes? Sure, it’s nice to look popular, but what good will looking popular really achieve? The truth is, nothing at all.

The real value of Facebook comes from genuine fans. These are past or existing customers and people who, by and large, have an actual interest in your brand and what you have to say. They are, effectively, receptive sales leads, and as such it is generally worthwhile investing your time in engaging with them, as they are far more likely to convert into customers or purchase from you again.

   Only genuine fans have a real intrinsic value to your business

The lowdown is: buy likes and your fans won’t have any genuine interest in your brand, and they certainly won’t be receptive to anything you do; in which case they are of little or no real value to you. It makes much more sense to build communities organically, which is always our preferred approach. Sure, this takes a little longer to achieve, but it will result in a much higher quality of fan who will have real intrinsic value for your business. Continue reading

Twitter Opens Up Its Analytics Platform

lets everyone review the performance of their tweets for free

Twitter has quietly opened up its various analytics tools to the public, giving everyone access to in-depth data about the people and brands who follow them, as well as the performance of their most recent tweets.

The change was spotted earlier this week by Christopher Penn, vice president of marketing technology at SHIFT Communications, as well as Danny Olson, a digital strategist at Weber Shandwick. Users simply need to head to the Twitter Ads dashboard and click on the ‘Analytics’ tab at the top of the page to access the new features.

The Timeline activity displays a graph for the user based on the number of mentions, follows and unfollows that they’ve received over the last month. A detailed list underneath shows all of the user’s most recent tweets, including the number of times someone has favorited, retweeted or replied to each one.


All of the information is shown in a clear and accessible format so that every user, regardless of whether they’re the marketing manager of an international conglomerate or an emerging blogger with just a handful of followers, can analyze and take action based on the data.

Most notably, this list also shows the number of times that someone has clicked on the link contained in a tweet – an easy way to gauge referrals from one of the largest and most influential social networks on the Web. There’s even the ability to download a CSV file of the data for the user’s own personal records.
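Once you’ve downloaded that CSV, totaling link clicks takes only a few lines. The column names below are invented for illustration – Twitter’s actual export headers may differ:

```python
import csv
import io

# A stand-in for Twitter's downloadable CSV; the real export's
# column names may differ from these invented ones.
data = """tweet,retweets,favorites,link_clicks
New blog post is live,12,30,85
Weekly roundup,4,9,21
Product update screenshot,7,15,0
"""

rows = list(csv.DictReader(io.StringIO(data)))

# Total referrals driven by links in recent tweets.
total_clicks = sum(int(row["link_clicks"]) for row in rows)
print("Link clicks across recent tweets:", total_clicks)
```

In real use you’d open the downloaded file instead of the inline string, and the same loop works for retweets or favorites.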

Until now, the advertising dashboard for Twitter has been aimed primarily at businesses who want to pay to display their tweets in front of a specific audience. Coupled with the ability to promote the user’s account, it’s an obvious and lucrative way for Twitter to monetize its service. Continue reading

4 Huge Brands That Still Aren’t on Social Media

It’s almost 2013 and it seems like the whole world is on Facebook and Twitter. That’s pretty much true, except for that stubborn 24% of adult Americans who still haven’t joined Facebook, and for the four brands listed below.

Pretty much every other major brand is tweeting and providing updates to its fans on Facebook, even those you’d think wouldn’t bother, like Goldman Sachs and ExxonMobil. The following are brands that have no official account under their name on Facebook or Twitter at this time.

1. Trader Joe’s

The quirky, much-loved grocery chain has a fine website where it highlights new items in its stores and recipes. But you’ll notice there’s no prompt to “follow us on Twitter.” Like many things about Trader Joe’s, the company’s social media marketing strategy is a mystery. It’s in keeping with the company’s marketing philosophy, though. Trader Joe’s doesn’t do any traditional advertising, either. The lack of Facebook fans doesn’t seem to have hurt the chain: Sales in 2009 (the last year such figures were available) were around $8 billion — the same as the social media-friendly Whole Foods, according to Fortune.

2. Marlboro

Facebook is not Marlboro country. You won’t see any other cigarette brands on the social network, either, probably because it would risk running afoul of the numerous marketing restrictions placed on the category. (Though the Tobacco Master Settlement Agreement that set many of those limits occurred in 1998, well before the birth of social media.) The lack of visibility appears to have hurt the brand: In 2006, Marlboro was no. 10 on Interbrand’s 100 Best Global Brands list. This year, it dropped out of the top 100.

3. Viagra

Viagra may rule the social media universe via spam, but there’s no official presence for the drug on Twitter or Facebook. As with Marlboro, it seems to be a category-wide circumstance; there’s no feed or Facebook presence for Levitra or Cialis, either. John Mack, editor and publisher of the Pharma Marketing News/Pharma Marketing Blog, says FDA regulations are a major hurdle for pharma brands on social media. “Another factor has to do with reporting adverse events — as may be mentioned in social media conversations,” he says.

4. Apple

The biggest company in the world is also the world’s biggest social media holdout. There’s no official Twitter or Facebook account for Apple at this point. Obviously, this hasn’t hurt the brand at all. Should you try to mimic Apple’s strategy? Brian Solis, principal analyst at Altimeter Group, thinks not: “The truth is, if they didn’t have that momentum going into this new, connected generation they would have to do it,” he says. “They already had word of mouth and that word of mouth continues. They already had that momentum.”

Google Search Algorithm Update To Target Spammy Queries

Google Payday Loan Algorithm

Google has officially launched a new search update to target “spammy queries” such as payday loans, pornography, and other heavily spammed queries.

Matt Cutts, Google’s head of search spam, announced this on Twitter saying “We just started a new ranking update today for some spammy queries.” He pointed to the video he published where he talked about upcoming Google SEO changes.

Our summary then was:

Queries that tend to be spammy in nature, such as [pay day loans] or some pornography-related queries, had been somewhat less likely to be a target for Google’s search spam team, but Matt Cutts said Google is more likely to look at this area in the near future. He made it sound like these requests are coming from outside of Google, and thus Google wants to address those concerns with these types of queries.

Here is the video where he pre-announced this change, at about 2 minutes and 30 seconds in:

This update impacted roughly 0.3% of U.S. queries, but Matt said it went as high as 4% for Turkish queries, where web spam is typically higher.

For more of our coverage on Matt’s talk at SMX Advanced see Google’s Cutts Talks Structured Data Beta, Mobile Site Speed Need, Penalty Notices To Get Example Links & More.

SEO – The Beginner’s Guide

New to SEO? Need to polish up your knowledge? The Beginner’s Guide to SEO has been read over 1 million times and provides comprehensive information you need to get on the road to professional quality SEO.

Search Engine Optimization

Chapter 1: How Search Engines Operate
Search engines have two major functions – crawling & building an index, and providing answers by calculating relevancy & serving results. Read More: How Search Engines Operate
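Those two functions can be caricatured in a few lines of Python – a toy sketch, nothing like a production engine:

```python
from collections import defaultdict

# "Crawled" pages: a toy corpus standing in for the crawl step.
pages = {
    "page1": "seo basics and keyword research basics",
    "page2": "link building guide",
    "page3": "keyword research tools",
}

# Build the index: term -> {page: term frequency}.
index = defaultdict(dict)
for page, text in pages.items():
    for word in text.split():
        index[word][page] = index[word].get(page, 0) + 1

def search(term):
    """Serve results: pages containing the term, crudest relevancy first."""
    hits = index.get(term, {})
    return sorted(hits, key=hits.get, reverse=True)

print(search("basics"))   # page1 mentions it twice
print(search("keyword"))  # page1 and page3 each mention it once
```

Real engines layer link analysis, freshness, and hundreds of other signals on top of this, but crawl-index-rank is still the skeleton.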

Chapter 2: How People Interact With Search Engines
One of the most important elements to building an online marketing strategy around SEO is empathy for your audience. Once you grasp what the average searcher, and more specifically, your target market, is looking for, you can more effectively reach and keep those users. Read More: How People Interact With Search Engines

Chapter 3: Why Search Engine Marketing Is Necessary
An important aspect of Search Engine Optimization is making your website easy for both users and search engine robots to understand. Although search engines have become increasingly sophisticated, in many ways they still can’t see and understand a web page the same way a human does. SEO helps the engines figure out what each page is about, and how it may be useful for users. Read More: Why Search Engine Marketing Is Necessary

Chapter 4: Basics of Search Engine Friendly Design and Development
Search engines are limited in how they crawl the web and interpret content. A webpage doesn’t always look the same to you and me as it looks to a search engine. In this section, we’ll focus on specific technical aspects of building (or modifying) web pages so they are structured for both search engines and human visitors alike. This is an excellent part of the guide to share with your programmers, information architects, and designers, so that all parties involved in a site’s construction can plan and develop a search-engine friendly site. Read More: Basics of Search Engine Friendly Design and Development

Chapter 5: Keyword Research
Keyword research is one of the most important, valuable, and high-return activities in the search marketing field. Ranking for the “right” keywords can make or break your website. Read More: Keyword Research

Chapter 6: How Usability, Experience, and Content Affect Search Engine Rankings
The search engines constantly strive to improve their performance by providing the best possible results. While “best” is subjective, the engines have a very good idea of the kinds of pages and sites that satisfy their searchers. Read More: How Usability, Experience, and Content Affect Search Engine Rankings

Chapter 7: Growing Popularity and Links
For search engines that crawl the web, links are the streets between pages. Using sophisticated link analysis, the engines can discover how pages are related to each other and in what ways. Read More: Growing Popularity and Links

Chapter 8: Search Engine Tools and Services
SEOs tend to use a lot of tools. Some of the most useful are provided by the search engines themselves. Search engines want webmasters to create sites and content in accessible ways, so they provide a variety of tools, analytics and guidance. These free resources provide data points and opportunities for exchanging information with the engines that are not provided anywhere else. Read More: Search Engine Tools and Services

Chapter 9: Myths and Misconceptions About Search Engines
Over the past several years, a number of misconceptions have emerged about how the search engines operate. For the beginner SEO, this causes confusion about what’s required to perform effectively. In this section, we’ll explain the real story behind the myths. Read More: Myths and Misconceptions About Search Engines

Chapter 10: Measuring and Tracking Success
They say that if you can measure it, then you can improve it. In search engine optimization, measurement is critical to success. Professional SEOs track data about rankings, referrals, links and more to help analyze their SEO strategy and create road maps for success. Read More: Measuring and Tracking Success