How to Determine if a Page is “Low Quality” (published 5 September 2017)

What constitutes “quality” for Google?

So Google has some ideas about what’s high quality versus low quality, and a few of those are pretty obvious and we’re familiar with, and some of them may be more intriguing. So…

  • Google wants unique content.
  • They want to make sure that the value to searchers from that content is actually unique, not that it’s just different words and phrases on the page, but the value provided is actually different. You can check out the Whiteboard Friday on unique value if you have more questions on that.
  • They like to see lots of external sources linking editorially to a page. That tells them that the page is probably high quality because it’s reference-worthy.
  • They also like to see high-quality pages, not just high-quality sources or domains, linking to this page. Those can be internal or external links, so if the high-quality pages on your own website link to another page on your site, Google often interprets that as a signal that the linked page is high quality too.
  • The page successfully answers the searcher’s query.

This is an intriguing one. So if someone performs a search, let’s say here I type in a search on Google for “pressure washing.” I’ll just write “pressure wash.” This page comes up. Someone clicks on that page, and they stay here and maybe they do go back to Google, but then they perform a completely different search, or they go to a different task, they visit a different website, they go back to their email, whatever it is. That tells Google, great, this page solved the query.

If instead someone searches for this and they go, they perform the search, they click on a link, and they get a low-quality mumbo-jumbo page and they click back and they choose a different result instead, that tells Google that page did not successfully answer that searcher’s query. If this happens a lot, Google calls this activity pogo-sticking, where you visit this one, it didn’t answer your query, so you go visit another one that does. It’s very likely that this result will be moved down and be perceived as low quality in Google.

  • The page has got to load fast on any connection.
  • They want to see high-quality accessibility with intuitive user experience and design on any device, so mobile, desktop, tablet, laptop.
  • They want to see actually grammatically correct and well-spelled content. I know this may come as a surprise, but we’ve actually done some tests and seen that by having poor spelling or bad grammar, we can get featured snippets removed from Google. So you can have a featured snippet, it’s doing great in the SERPs, you change something in there, you mess it up, and Google says, “Wait, no, that no longer qualifies. You are no longer a high-quality answer.” So that tells us that they are analyzing pages for that type of information.
  • Non-text content needs to have text alternatives. This is why Google encourages use of the alt attribute. This is why on videos they like transcripts. Here on Whiteboard Friday, as I'm speaking, there's a transcript down below this video that you can read and get all the content without having to listen to me, whether you don't want to or aren't able to for technical or accessibility reasons.
  • They also like to see content that is well-organized and easy to consume and understand. They interpret that through a bunch of different things, but some of their machine learning systems can certainly pick that up.
  • Then they like to see content that points to additional sources for more information or for follow-up on tasks or to cite sources. So links externally from a page will do that.

This is not an exhaustive list. But these are some of the things that can tell Google high quality versus low quality and start to get them filtering things.

How can SEOs & marketers filter pages on sites to ID high vs. low quality?

As a marketer, as an SEO, there’s a process that we can use. We don’t have access to every single one of these components that Google can measure, but we can look at some things that will help us determine this is high quality, this is low quality, maybe I should try deleting or removing this from my site or recreating it if it is low quality.

In general, I’m going to urge you NOT to use things like:

A. Time on site, raw time on site

B. Raw bounce rate

C. Organic visits

D. Assisted conversions

Why not? Because by themselves, all of these can be misleading signals.

So a long time on your website could be because someone’s very engaged with your content. It could also be because someone is immensely frustrated and they cannot find what they need. So they’re going to return to the search result and click something else that quickly answers their query in an accessible fashion. Maybe you have lots of pop-ups and they have to click close on them and it’s hard to find the x-button and they have to scroll down far in your content. So they’re very unhappy with your result.

Bounce rate works similarly. A high bounce rate could be a fine thing if you’re answering a very simple query or if the next step is to go somewhere else or if there is no next step. If I’m just trying to get, “Hey, I need some pressure washing tips for this kind of treated wood, and I need to know whether I’ll remove the treatment if I pressure wash the wood at this level of pressure,” and it turns out no, I’m good. Great. Thank you. I’m all done. I don’t need to visit your website anymore. My bounce rate was very, very high. Maybe you have a bounce rate in the 80s or 90s percent, but you’ve answered the searcher’s query. You’ve done what Google wants. So bounce rate by itself, bad metric.

Same with organic visits. You could have a page that is relatively low quality that receives a good amount of organic traffic for one reason or another, and that could be because it’s still ranking for something or because it ranks for a bunch of long tail stuff, but it is disappointing searchers. This one is a little bit better in the longer term. If you look at this over the course of weeks or months as opposed to just days, you can generally get a better sense, but still, by itself, I don’t love it.

Assisted conversions is a great example. This page might not convert anyone. It may be an opportunity to drop cookies. It might be an opportunity to remarket or retarget to someone or get them to sign up for an email list, but it may not convert directly into whatever goal conversions you’ve got. That doesn’t mean it’s low-quality content.

THESE can be a good start:

So what I’m going to urge you to do is think of these as a combination of metrics. Any time you’re analyzing for low versus high quality, have a combination of metrics approach that you’re applying.

1. That could be a combination of engagement metrics. I’m going to look at…

  • Total visits
  • External and internal visits
  • I’m going to look at the pages per visit after landing. So if someone gets to the page and then they browse through other pages on the site, that is a good sign. If they browse through very few, not as good a sign, but not to be taken by itself. It needs to be combined with things like time on site and bounce rate and total visits and external visits.

2. You can combine some offsite metrics. So things like…

  • External links
  • Number of linking root domains
  • Page Authority (PA) and your social shares like Facebook, Twitter, and LinkedIn share counts can also be applicable here. If you see something that's getting social shares, well, maybe it doesn't match up with searchers' needs, but it could still be high-quality content.

3. Search engine metrics. You can look at…

  • Indexation by typing a URL directly into the search bar or the browser bar and seeing whether the page is indexed.
  • You can also look at things that rank for their own title.
  • You can look in Google Search Console and see click-through rates.
  • You can look at unique versus duplicate content. So if I type in a URL here and I see multiple pages come back from my site, or if I type in the title of a page that I’ve created and I see multiple URLs come back from my own website, I know that there’s some uniqueness problems there.

4. You are almost definitely going to want to do an actual hand review of a handful of pages.

  • Pages from subsections or subfolders or subdomains, if you have them, and say, “Oh, hang on. Does this actually help searchers? Is this content current and up to date? Is it meeting our organization’s standards?”

Make 3 buckets:

Using these combinations of metrics, you can build some buckets. You can do this in a pretty easy way by exporting all your URLs. You could use something like Screaming Frog or Moz’s crawler or DeepCrawl, and you can export all your pages into a spreadsheet with metrics like these, and then you can start to sort and filter. You can create some sort of algorithm, some combination of the metrics that you determine is pretty good at ID’ing things, and you double-check that with your hand review. I’m going to urge you to put them into three kinds of buckets.

I. High importance. So high importance, high-quality content, you’re going to keep that stuff.

II. Needs work. The second bucket is stuff that actually needs work but is still good enough to stay in the search engines. It's not awful. It's not harming your brand, and it's certainly not what search engines would call low quality and be penalizing you for. It's just not living up to your expectations or your hopes. That means you can republish it or work on it and improve it.

III. Low quality. It really doesn't meet the standards that you've got here, but don't just delete it all outright. Do some testing. Take a sample set of the worst junk that you put in the low bucket, remove it from your site, make sure you keep a copy, and see if by removing a few hundred or a few thousand of those pages, you see an increase in crawl budget and indexation and rankings and search traffic. If so, you can start to be less judicious and more liberal with what you're cutting out of that low-quality bucket, and a lot of times you'll see some great results from Google.
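To make that sort-and-filter step concrete, here is a minimal sketch of the kind of scoring you could run over an exported spreadsheet of pages. The file name, column names, weights, and thresholds are all assumptions for illustration rather than a recommended formula; in practice you would tune them against your hand review.

```python
import csv

# Assumed columns in a crawl/analytics export (for example, a Screaming Frog
# crawl joined with GA / Search Console data on URL). Each metric column is
# assumed to be pre-normalized to a 0-1 range; adjust names and weights to
# match whatever your own export contains.
WEIGHTS = {
    "organic_visits": 0.3,
    "pages_per_visit": 0.2,
    "linking_root_domains": 0.3,
    "social_shares": 0.2,
}

def quality_score(row):
    """Combine several normalized metrics into one rough 0-1 quality score."""
    return sum(float(row.get(metric) or 0) * weight
               for metric, weight in WEIGHTS.items())

def bucket(score, high=0.66, low=0.33):
    """Illustrative thresholds only; calibrate them against your hand review."""
    if score >= high:
        return "I. High importance (keep)"
    if score >= low:
        return "II. Needs work (improve / republish)"
    return "III. Low quality (candidate for pruning)"

with open("pages.csv", newline="") as f:   # hypothetical export file
    for row in csv.DictReader(f):
        print(row["url"], bucket(quality_score(row)))
```

The point of the sketch is only that the bucket assignment comes from a combination of metrics, never from any single one.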

This article is a partial transcript from a video by Rand Fishkin. The original video and transcript can be found here.

3 Not So Friendly SEO Trends (published 10 April 2017)

It seems like every couple months (weeks?) there’s a new post predicting the end of SEO:

But rather than proclaiming that SEO is already dead (it’s not), let’s look at 3 ways in which the SEO industry might eventually die and ways in which SEOs can prepare themselves.


1. New ads

Google is constantly experimenting with new ad formats that actually provide a better user experience than organic results. The better the ads, the less traffic that will be captured by organic listings. What are some examples of when PPC ads provide a better UX than organic results?

Finding service providers in select cities:

Google has rolled out home service ads to a few markets as a way of making it easier to find various contractors. Even sites with incredible local SEO will have a hard time competing with the convenience of these ads, not to mention the extremely valuable endorsement by Google.

More info: Google Home Services Ads Launch In AdWords Express

Researching cars:

This is a relatively simple example of Google providing a bunch of relevant information directly in the SERPs with a simple interface that will, inevitably, come at the expense of people clicking through to organic listings about car reviews, details, dealerships, etc. I suspect that Google will continue rolling out these rich, informational interfaces to high-value verticals such as credit cards, mortgages, legal services, and so on.

More info: Google launches its giant mobile search ads for automakers in the US

Finding flights:

It’s hard to imagine such a prominent, compelling interface not sucking up a decent amount of organic clicks. Google Flights has also likely contributed to the fact that Google earns twice as much travel-related revenue as Expedia.

App discovery

This is one of the most ambitious ad interfaces that Google offers — you can literally play apps directly within Google! No need to download the app or click through to a site. It’s incredible.

SEO hasn’t traditionally been a huge driver of app discovery, so app streaming for games specifically might not be that disruptive to SEOs, but you can imagine this functionality being rolled out for all sorts of purposes. For example, instead of building functionality similar to Google Flights, maybe Google could simply send you to the Kayak app interface within search results and piggyback off the improved user experience of apps dedicated to one specific purpose.

More info: Google Adds NEW Interactive Ad Format Targeting Gamers

Searching for products

As interesting as the previous examples were, the most dangerous Google ad products for most SEOs would likely be the different ones related to product discovery.

Product Listing Ads (PLAs), for example, offer a fairly basic interface relative to the previously mentioned examples, but this simplicity makes them very dangerous because it makes them so easy to scale. Google doesn’t need to build a bespoke solution for different verticals or sets of queries.

It’s not hard to imagine a future where product queries on Google simply return a large set of product cards with the option to buy directly within the Google interface. The pieces are already there with PLAs and the limited rollout of the ability to buy directly on Google through select websites. All it would take is Google expanding the number of product cards and the number of sites that offer direct checkout through Google. The implications of a shift like this are hard to overestimate.

These product-related ad interfaces include:

  • Product Listing Ads (PLAs):

  • Showcase shopping ads:

More info: Google rolling out major change to PLAs for broad product queries, among other Shopping updates

  • Buying directly on Google:

More info: Winning the shopping micro-moments

There are other examples of rich ad interfaces (please share some below, as it’s always interesting to see more!) and Google has many reasons to continue rolling them out as far and wide as possible:

  • To attract more clicks: The more compelling the ad, the more people will click on it, the more money Google will make
  • As a response to ad blockers: Entice users to whitelist Google because their ads are actually a better user experience than organic results
  • Simply a better UX: Finding a handyman through the new format is easier than navigating the organic results, which makes people happy, which makes Google happy

What can SEOs do about it?

  • Stay aware of these new ad formats to make sure you’re experimenting with those that are relevant to your business
  • Avoid competing on SERPs that are overtaken by interactive ads
  • Find ways to differentiate your on-site experience from what Google is offering

Best-case scenario:

Ad-blocking continues to grow and these rich ads remain on a small percentage of overall queries because they’re difficult to scale.


2. New search interfaces

What happens when people no longer find what they’re looking for by typing in a Google search and clicking through to a result? Some of these alternative interfaces include:

Voice search

It’s difficult to rank for a search query that is heavily personalized and has only one result:

Non-Google “search engines”

There are a lot of sites that arguably offer a better search experience for a specific type of query:

  • Wikipedia – Informational queries
  • Yelp – Local business queries
  • Amazon – Transactional queries

Will people shift more and more of their searches to sites that offer a search experience tailor-made to one specific type of query?

“Pre-search,” such as Google Now

Google Now effectively tries to give you information you’re interested in before ever having to type in a query:

You can imagine quite a few verticals that could be disrupted by improvements to pre-search:

  • Flights: We know you tend to fly home for the holidays. Instead of searching for “flights to Vancouver,” we’ll surface some holiday flight deals for you in November.
  • Entertainment: You don’t need to search for a specific movie or play. We’ll surface tickets to events we predict you’ll be interested in before you ever type in a query.
  • Food: We know you’re walking around at dinnertime and there happens to be an Indian food restaurant nearby which we predict you’d like — let’s send you a notification about it.
  • Products: We know you tend to buy Jordan shoes. Let’s show you the latest Jordan release before you ever need to look for it.

Using chatbots

Perhaps conversational commerce will actually take off and people will start finding products/information by using chatbots:

More examples here: 11 Examples of Conversational Commerce and Chatbots in 2016

Shrinking SERPs

This has been well-covered, but Google continues to aggressively enrich organic results such that the mythical “ten blue links” SERP only makes up about 3% of searches:

The expansion of ads above the fold for some queries hasn’t helped, either. It’s hard to capture organic clicks on a SERP that looks like this (cut off):

These changes are especially impactful on mobile devices which already have limited screen real estate as is.

What can SEOs do about it?

  • Optimize for whatever “search engine” your users are on. You’ll need to understand where your audience is searching for information that might be relevant to your business and then optimize for the platform they’re on.
  • Use structured data. The easier your site is to understand, the more future-proof it’ll be.
  • Avoid competing on SERPs that are overrun by rich interfaces.

Best-case scenario:

These new interfaces consume a relatively small percentage of overall searches. They might continue to cover more and more informational queries (“when is Mother’s day 2017?”), and personal ones (“set an alarm for 7am”), but transactional queries (“new Jordan shoes”) remain on conventional screens because ultimately, finding products through voice search or a chatbot might not be the most enjoyable experience:

1/ Conversational commerce is unproven, even in Asia. If texting takes more time than clicking a button on a webview, why is it better?


3. New Google?

Google could potentially become so good at understanding websites that you don’t need to worry about SEO. Good user experience could become 100% indistinguishable from good SEO. Want to use asynchronous JavaScript to render your global nav? Go for it! Want to hide content behind a sign-in wall? No problem! Want to launch your site internationally without hreflang tags? Who cares! Google doesn’t really want to reward a site because it knows what rel=canonical is; it wants to reward it because it satisfied a user’s intent. Therefore, Google is inherently working to make SEOs obsolete; maybe one day they’ll succeed.

What can SEOs do about it?

Broaden your skillset and make sure you’re providing value beyond simply optimizing a website for Google.

Best-case scenario:

Google will always need the help of SEOs to understand the Internet.


Despite these threats, I think it’s very unlikely that SEO disappears as a discipline anytime soon. I have yet to run into a site that doesn’t have large SEO opportunities to capture, given the right projects. I also believe the best-case scenario for each threat is actually the most likely scenario. That being said, it can still be helpful to think through future threats to the SEO industry. Can you think of any others?

This post originally appeared on moz.com and was authored by Daniel Marks. You can find it here.

The Most Effective Way to Improve Quality and Rankings (published 15 September 2016)

Quality and relevance are different things, yet we often discuss them as if they were the same. SEOs have been optimizing for relevance all this time, but are now being asked to optimize for quality. This post will discuss what that means and how to do it, with a focus on what I believe to be the single most effective and scalable tactic for improving your site's overall level of quality in most situations.

You need BOTH quality AND relevance to compete these days.

First, let’s establish what we’re talking about here, which is quality. You can have relevancy (the right topic and keywords) without quality, as shown here:

Highly relevant, but very low quality. This MFA site is highly relevant for the phrase “Chairs for Baby.” But it also sucks.

“…babies are sensitive, delicate individuals who need cautious. So choosing a right Chairs For Baby is your gentle care.” WTF?

It doesn’t matter how relevant the page is. The only way to get that page to rank these days would be to buy a lot of links, but then you’re dealing with the added risk. After a certain point, it’s just EASIER to build a better site. Yes, Google has won that battle in all but the most hyper-competitive niches, where seasoned experts still find the reward:risk ratio in their favor.

Quality + Relevance = Ranking

OK, now that we’ve established that quality and relevance are different things, but that you need both to rank, how do you optimize for quality?

Quality Indicators and Tactics (How to Optimize for Quality):

  • Grammar, spelling, depth: Hire a copy editor. Learn to write better.
  • Expertise, Authority, Trust (EAT): Make content deep and useful. Call out your awards, certifications, and news coverage, and use trust symbols. Make it easy to contact you, and easy to find policies like terms of service, returns, and privacy.
  • PageRank (PR) from links: Build high-quality links. There are thousands of great articles out there about it.
  • Reviews: Ask your best clients/customers.
  • Short-clicks: Improve out-of-stock pages with in-stock alerts and related products. Be less aggressive with your pop-up strategy. Also see query refinements, pages per visit, and dwell time.
  • Query refinements: Only attract the "right" audience and deliver what you promised in the search results. Choose keywords by relevance, not just volume. Think about query intent.
  • Dwell time on site: Make your pages stickier by improving the overall design. Add images and video. Run GA reports on the stickiest traffic sources and develop a strategy to increase traffic from those sources.
  • Pages per visit: Improve internal linking, suggest related content, customize 404 pages. Split up long posts into a series.
  • Conversion rates: Do A/B testing, make your messaging very clear, follow CRO best practices.
  • Ad placements: No ads above the main content on the page, and no annoying amount of ad blocks or pop-ups.
  • CTR from SERPs: Craft better title tags, URLs, and descriptions. Grab attention. Gain trust. Make them want to click. Add schema markup when appropriate.
  • Page speed: Visit https://developers.google.com/speed/pagespeed/insights/.
  • Mobile experience: Responsive and/or adaptive design, less text, easy navigation, loads lightning fast, shorter forms.
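If you want to check page speed across many URLs rather than one at a time in the web UI, the same data is available programmatically. Here is a minimal sketch, assuming the PageSpeed Insights v5 API endpoint and response layout (check the current API documentation, and add an API key for anything beyond light use):

```python
import json
import urllib.parse
import urllib.request

def pagespeed_score(url, strategy="mobile"):
    """Fetch the Lighthouse performance score (0-100) for one URL."""
    endpoint = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
    query = urllib.parse.urlencode({"url": url, "strategy": strategy})
    with urllib.request.urlopen(f"{endpoint}?{query}") as response:
        data = json.load(response)
    # Assumed response shape: lighthouseResult -> categories -> performance -> score (0-1).
    return round(data["lighthouseResult"]["categories"]["performance"]["score"] * 100)

print(pagespeed_score("https://example.com/"))
```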

There are many ways to improve the quality of your site. Some are obvious. Others are more abstract. All of these quality indicators together make up your site's overall level of quality. For more, check out the keyword-agnostic ranking factors and the engagement metrics from SimilarWeb areas in the Moz Search Engine Ranking Factors report.

We’ve already established that Google knows the relative quality of a page. Let’s assume — because it is very likely — that Google also knows the relative quality of your entire site. And let’s call that your sitewide QualityRank (QR) (h/t Ian Lurie, 2011).

What’s the most effective, scalable, proven tactic for improving a website’s QR?

In a word: Pruning.

Learn more about it here. Pruning requires doing a content audit first, which you can learn more about here. It’s nothing groundbreaking or new, but few clients come in the door that can’t substantially benefit from this process.

Sometimes pruning is as easy as applying a noindex tag to all pages that have had zero organic search traffic over the course of a year. You may be surprised how many enterprise-level sites have huge chunks of the site that fit that criterion. Other times it requires more analysis and tougher decisions. It really all depends on the site.
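Here is a minimal sketch of that zero-traffic filter, assuming a plain-text list of indexable URLs from a crawl and a 12-month organic landing-page export from your analytics tool; the file names and column names are placeholders:

```python
import csv

# Hypothetical inputs: every indexable URL from a crawl (one per line), and a
# 12-month export of organic sessions by landing page with "url" and
# "sessions" columns.
with open("all_indexable_urls.txt") as f:
    indexable = {line.strip() for line in f if line.strip()}

organic_sessions = {}
with open("organic_landing_pages_12m.csv", newline="") as f:
    for row in csv.DictReader(f):
        organic_sessions[row["url"]] = int(row["sessions"])

# Pages with zero organic visits over the year are only *candidates* for a
# noindex tag -- review the list by hand before acting on it.
candidates = sorted(url for url in indexable if organic_sessions.get(url, 0) == 0)

print(f"{len(candidates)} noindex candidates")
for url in candidates:
    print(url)
```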

So let’s look at some pruning case studies.

Three things to remember:

1. Significant portions of the sites were removed from Google’s index.

2. Pruning was not the only thing that was done. Rarely do these things happen in a vacuum, but evidence points to pruning as a major contributor to the growth examples you’re about to see.

3. You can read more about Inflow’s research in our case studies section.

1800doorbell had technical issues that made it possible to identify cruft and prune big chunks of the site quickly. This contributed to a 96% increase in revenue from organic search within six months.

The dip at the end has to do with how the timeline was generated in GA (i.e. an incomplete month). Growth was sustained.

We’re not the only ones finding success with this. Go check out the Ahrefs case study for another example. Here’s a compelling image from their case study:

Ahrefs saw amazing results after performing a content audit and pruning their blog.

If you weren’t already convinced, I hope by now it’s clear that performing a content audit to determine which pages should be improved and which should be pruned from Google’s index is an effective and scalable SEO tactic. That being established, let’s talk about why this might be.

We don’t know exactly how Google’s ranking algorithms work. But it seems likely that there is a score for a site’s overall level of quality.

Does QualityRank actually exist as a calculation in Google’s organic algorithm? Probably not under that name and not in this simplistic form. But it DOES seem likely that something similar would exist, especially since it exists for PPC. The problem I have with the PPC equivalent is that it includes relevance factors like keyword use in their metric for “quality.”

Google needs a way to measure the overall quality level of each site in order to rank them properly. It’s just probably much more mathematically complicated than what we’re talking about here.

The point of discussing QualityRank as a framework for pruning is to help explain why pruning works. And to do that, we don’t need to understand the complex formulas behind Google’s ranking algorithms. I doubt half of the engineers there know what’s going on these days, anyway.

Let’s imagine a site divided into thirds, with each third being assigned a QualityRank (QR) score based on the average QR of the pages within that section.

The triangle below represents all indexable content on a domain with a QR of 30. That sitewide QR score of 30 comes from adding all three of these sections together and dividing by three. In the real world, this would not be so simple.

I hope the mathematicians out there will grant me some leeway for the sake of illustrating the concept.

This is the same site after removing the bottom 50 percent from the index:

Notice the instant lift from QR 30 to QR 40 just by removing all LOW QR pages. That is why I say pruning is the most effective way to raise your site's overall quality level for better rankings, IF you have a lot of low-quality pages indexed, which most sites do.
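To see the arithmetic behind that lift, here is a small worked example. The per-section scores are assumed purely for illustration (the original pyramid diagram isn't reproduced here); they are simply chosen so the three sections average 30:

```python
# Assumed QualityRank scores for the three equal slices of the site
# (illustrative numbers only).
sections = {"high": 50, "medium": 30, "low": 10}

sitewide_qr = sum(sections.values()) / len(sections)
print(f"Before pruning: QR {sitewide_qr:.0f}")   # (50 + 30 + 10) / 3 = 30

# Prune (noindex/remove) the low-QR slice and average what remains indexed.
remaining = {name: qr for name, qr in sections.items() if name != "low"}
pruned_qr = sum(remaining.values()) / len(remaining)
print(f"After pruning:  QR {pruned_qr:.0f}")     # (50 + 30) / 2 = 40
```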

Time to switch analogies

Pruning works because it frees up the rest of your content from being weighed down by the cruft.

“Cruft” includes everything from a 6-year-old blog post about the company holiday party to 20 different variants with their own landing pages for every product. It also includes pages that are inadvertently indexed for technical reasons, like faceted navigation URLs.

Remove the bottom half of this iceberg and the rest of it will “rise up,” making more of it visible above the surface (i.e. on the first 2–3 pages of Google).

The idea of one page being weighed down by another has been around at least since the first Panda release. I'm not writing about anything new here, as evidenced by the many resources below. But I'm constantly surprised by the amount of dead weight most websites continue to carry around, and hope that this post motivates folks to finally get rid of some of it. Your choices are many: 404, 301, rel="canonical", noindex, disallow… Some of the resources below will help you decide which solutions to use for your unique situation.

This post originally appeared on moz.com and was written by Everett Sizemore. You can find that post here.

Making Sense of Google’s Updates in Local Search (published 24 August 2016)

This has been a big year for local search, with Google launching a ton of changes related to local, including several changes directly to their local platform, Google My Business. Marketers and brands are naturally scrambling to respond to each of these changes individually, as they should, but what about the larger implications of changes like these?

The running theme with all these changes seems to be the following three things: Google is taking local seriously, Google wants to get more local data through its crawler, and Google really, really wants more reviews. But let’s not jump ahead of ourselves. First, let’s review some of the major changes that have occurred over the last few months.

What’s changed?

1. No more descriptions for Google My Business

The most recent change to Google My Business occurred on August 3rd when Google My Business stopped accepting edits to the description. The description will still be editable through Google+, but with the way the rest of the company has been distancing itself from its social platform, that’s likely not to stick around for long.

2. Additional categories no longer supported

Additionally, though it got lost in the shuffle a bit, when they removed the descriptions they also removed the following sections from their bulk upload form:

  • Ad Icon URL
  • Ad Landing Page URL
  • Alt Phone. Alt phone is now “Additional phones.”
  • Categories. This field has been replaced by “Primary category” and “Additional categories.”
  • City. City is now “Locality.”
  • Description
  • Email
  • Fax
  • Payment Types
  • State. State is now “Administrative area.”

3. Google+ metrics removed, additional Google My Business Insights

In a separate announcement, Google also removed Google+ metrics from their dashboard, instead providing more detailed metrics around the source of views to your GMB profile. Google My Business now shows whether customers found a business via search or Google Maps and breaks down actions customers are taking by website visits, driving direction requests, phone calls, or photos.

4. Greater support of reviews for local businesses

And in yet another announcement this month, Google released the ability for all websites to have "Critic Reviews" published directly in Google search results, next to the local business results. Days later, Google backed up this announcement by promoting the detailed Schema markup needed to apply for critic reviews.

For reviews on Google My Business itself, they added the ability to respond to reviews on Google directly through the latest version of its API.

Overview of changes

And this is just within the last couple of months! So, what do all of these changes imply? Well, first off, it means that Google is making some serious changes towards local. And it should. Based on data released in May of 2016, over 50% of its traffic is now mobile and within that, nearly 30% of those searches are local!

Secondly, it means that Google is getting more confident in its own crawl data. Google wouldn’t take away a chance to get information from you if it didn’t have a good way of getting that same information by itself. We already saw this when Google removed support of Authorship and, years earlier, removed support of the Meta Keywords tag. By further distancing its local product from its social product, Google+, it implies that the data gathered from those sources wasn’t valuable. It also means that Google likely hasn’t been paying attention to any of this stuff for some time now.

This is pretty in line with everything Google has worked towards with local information. User-generated information, while invaluable, is easily manipulated. Because of this, Google often prefers to use its own data, when available. This is why that irritatingly complicated Local Search Ecosystem is so irritatingly complicated. Google needs to be able to verify its data, verify it again somewhere else, and repeat however many times it needs to in order to be sure.

How does this affect me?

So what does this mean for marketers and brands? There are a couple of key takeaways. First, it means that Google is becoming increasingly confident in the data that it’s getting on its own. On top of that, Google is surfacing more information about an individual business than it ever has before. Information like business hours, reviews, driving directions, social links, and more are all available directly in the search results.

While providing all of this information is potentially great from a user perspective, it also makes Google tremendously vulnerable from a trust perspective. Every new piece of information that Google surfaces in its search results is a new opportunity for them to get that information wrong, so they're putting themselves at a tremendous risk. They aren't going to do this unless they can be absolutely sure and, as we know, the way they verify information is through their own crawls.

The second big takeaway is that Google is trying harder than ever to get more reviews into its platform. By distancing itself from Google+ they removed one of the biggest barriers for leaving reviews. By promoting Schema and opening up the ability for more people to have their reviews included in search results, Google is making sure that it has as much review data as possible. As demonstrated last year in another study by Casey Meraz, we know that reviews are a huge element in the click-through rate of local results.

What should I do?

Let’s talk tactics. Knowing that Google is putting more emphasis on crawl data and that it’s looking for more ways to get reviews, your job as a marketer gets pretty clear. You need to get your local information and reviews in all the places Google might look and make it easy for Google to understand.

Learn to love Schema markup

One of the most telling things about Google's updates, in general, is that they've been consistently and reliably promoting Schema usage every chance they get. This means they probably like it. And the great thing about Schema is that it's easier than ever to implement! To facilitate their love affair with Schema, Google created an easy-to-use tool, the Structured Data Markup Helper, that lets you highlight contact information, reviews, and more, then generate the JSON-LD code you can paste right in the <head> of your page. Pair that with their other free tool for testing markup, the Structured Data Testing Tool, and you have everything you need to start using Schema right away.
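As an illustration of the kind of markup those tools produce, here is a minimal sketch that builds LocalBusiness JSON-LD and prints the script tag you would paste into a page's <head>. The business details are placeholders, and which properties you actually need (reviews, geo coordinates, opening hours, and so on) depends on your business type, so validate the output with the Structured Data Testing Tool:

```python
import json

# Placeholder business details; swap in your real NAP data and keep it
# consistent with what you publish everywhere else online.
local_business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Burger Restaurant",
    "url": "https://www.example.com/",
    "telephone": "+1-555-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Example Street",
        "addressLocality": "Springfield",
        "addressRegion": "OR",
        "postalCode": "97477",
    },
    "openingHours": "Mo-Sa 11:00-20:00",
}

# Emit the block to paste into the page's <head>.
markup = json.dumps(local_business, indent=2)
print(f'<script type="application/ld+json">\n{markup}\n</script>')
```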

Make your business listing information accurate

This may seem repetitive in the local space, but that’s just because it’s true. Even if you enter all the information exactly right in Google My Business, Google still doesn’t trust it unless it can verify it against other sources. Use the free Check Listing Tool or any of the other online tools to make sure you’re not only listed on all of the most important online sources, but that your information is accurate. And not just mostly accurate — so accurate that Google doesn’t have any choice other than to completely trust your data. The one thing that will prevent Google from ever showing your business in their giant local search result is conflicting information about your business on various online sources.

Get your review strategy together

You can’t just sit around and hope for reviews anymore. According to a study by BrightLocal, 92% of people look to online reviews when deciding to use a business. We also know that people click on them in Google. And we know that Google is trying to get as many of them as possible in their own search results.

  1. Use Schema markup on your site for all the reviews you have on your own site. Even if Google isn’t using those now, they’re certainly acting as though they want to start.
  2. Monitor your reviews online and have a response strategy. With Google surfacing reviews even more in their results, you need to be sure you properly address negative reviews and take every opportunity to address the concern.
  3. Give great customer service. The most frustrating part of an online review strategy is that the majority of it occurs offline. Be nice to your customers and thank them for their time.

Earn good links

There are tons of great resources for linkbuilding on the Moz Blog alone, so I won't muddy the waters with more advice. I will say that, while the Google penalties of the previous few years have been rough on linkbuilding, there's still no question it's one of the most influential ranking factors in SEO as a whole, let alone in local. The only difference is that it has to be good. The one thing the Google penalties proved is that Google definitely knows the difference between a good link and a bad link. Good links are good because they mean people are actually interested in your content and are legitimately trying to share it. Of course Google would want to use that as a metric. That's the content you need to make.

Is this all you can do? Of course not. But focusing on the things Google is paying attention to is one of the best ways to make sure you’re staying ahead of the curve to make your local strategy as future-proof as possible.

This post was authored by George Freitag and appeared on Moz. The original post can be found here.

Why Listing Accuracy is Important (published 22 June 2016)

Today’s topic is listing accuracy. The reason why I wanted to talk about it was because it’s one of those topics that is brought up a lot in local search. If you work in local search directly or if you work with an agency or an SEO on local search, it’s one of those things that you’ve probably heard a lot about, you understand it’s something you have to do, you understand that it’s very important because it is, but you might not understand exactly what it means, why it’s important, how come you have to do it, or why it takes as much time as it does. So today, I wanted to spend the time to go over why listing accuracy is important, how it works and how you can do that.

So to start out, let’s just look at an actual local search result. Let’s say this is a search for what you do. This is a search for a burger restaurant, if you run a burger restaurant. You’ve got your three local results. It’s on a phone, so that’s pretty much all you get, is those three listings, and this is where you want to be, but you aren’t.

So why aren’t you? How come your clients, your competitors are on this search result, but you aren’t? It has to do with trust.

A story about trust

So to demonstrate why trust is so important, I want to go over a quick story about me when I was looking for a bank a few years ago. I don’t go to the bank that often. So when it came time, I got out my phone, punched out Google Maps, and then proceeded to walk for the next seven blocks with my face buried in my phone, checking my Twitter and email and whatever. When the little lady inside told me I’d arrived, I looked up and saw that the place was closed. Not even just closed for the day. It was closed altogether.

So that is Google's greatest fear, because if there's one reason why you, me, or anyone will stop using Google, it's if that happens over and over again. If I repeatedly get sent to businesses that don't exist, if I try to call them and the call information is incorrect, then I'm going to stop using Google, and so Google takes that very seriously.

In fact, Google is putting itself even more on the line with additional business information. If you’ve done a local search lately, and I assume that you have if you’ve watched this video, you’ll see that in this Knowledge Graph they’re giving you all sorts of information. They’re telling you whether or not the store is open. They’re linking you to reviews. They are just giving a ton of information in their local search results. And if they’re not confident on all the information they have, then they’re not going to show it, because if they repeatedly show that a place is closed when it’s not, then you’re going to stop using Google. So Google takes trust very seriously.

Listing accuracy

So how does Google determine trust? That is where listing accuracy comes into play. Listing accuracy is Google’s method for determining whether or not it can trust a local business search result.

So to show how that works, let’s go over here and say this is your business and let’s say that you’ve already set up your Google My Business page. You’ve already set up your name, address, phone, your NAP. If you haven’t done anything along those lines, there’s plenty of information on our website and on other places online about how to do that.

But let’s say that you’ve already set up your Google My Business. It’s absolutely perfect with your business name, all your hours, all the images are filled out, and it’s still not showing up here. Well, one of the reasons may be because of this concept of listing accuracy.

So here’s your listing, but what you might not know of or you might be kind of aware of are these other slight variations that exist elsewhere online, and these might just exist somewhere. So let’s say you’ve got one variation where the address is slightly different, like it’s a suite number that you don’t want to have, but sometimes it mentions a suite number. Maybe it’s an old phone number or an old cell phone number that somehow got indexed or an old tracking number. Maybe it’s just some general bad information, just about specifically where you are or a website that’s slightly different.

Then over here, you’ve got another variation. Maybe this is a different business name. Maybe this is a business that was in your location before you moved in. So these places just sort of exist out there on the web somewhere, and they might even be in a place that you don’t even know about. They might be on an obscure website that you don’t ever see, and you’re not really that worried about because it’s something that you know isn’t really being seen by your customers.

So why should you worry about it? Why should you care about a website that's got bad information, on a source that you're never going to go to, that you know your customers are never going to go to, and that probably isn't driving much traffic to your website anyway? Well, it has to do with this concept of listing accuracy, because, again, this is how Google is measuring how much it can trust your information.

The local ecosystem

So over here we’ve got what we call the local ecosystem. You might’ve seen our graphic on our website, which sort of explains what the different data sources are. I just want to demonstrate how it works.

So for your listing, Google can get its information from a bunch of different places. One of them, of course, is you. So this is you providing your information directly to Google My Business. This can also be the “Report a problem” in Google Maps or Google Mapmaker.

But in addition to that, it’s got all these other places that Google knows it can get business information from, and some of these places provide information directly to Google through feeds and some of them Google just knows about because it can crawl them (because basically it can just crawl anything on the web). These are places like:

  • Phone directories
  • Phone books
  • Business directories for specific businesses like OpenTable or Healthgrades
  • Review sites like Citysearch, Insider Pages
  • News sites
  • A restaurant review about your business
  • Government websites

Each time one of these places mentions your business information, it increases the confidence that Google has in the information that you provided. So this place and this place both mention you; that works to increase the confidence that Google has in the business information that you provided, making it more likely for it to show your business in its search results. So the more times it mentions you, the greater confidence it has in your information.

But if you have these other variations sort of floating around out there, then all of a sudden Google’s got some conflicting sources about your business information. So let’s say that all these places are mentioning you the way that you want to be mentioned, but these places are giving slight variations. So all of a sudden Google’s getting two different addresses, and so it’s becoming a little bit less confident in the information that you’ve provided. Maybe now this place is giving a completely different phone number. So now it doesn’t really want to show you because it doesn’t want to have that call button on your search result.

Each time it mentions one of these, it decreases the amount of confidence, and you also lose the opportunity to build confidence in your website. In fact, if there’s enough sources out here saying one thing that are contradictory to what you’re saying in your own Google My Business page, these places can actually override what you’re providing and Google will deem them more trustworthy than the information that you’re directly providing to them. So if all these places are saying that you’re open ’til six and you’re telling Google you’re open ’til eight, all of a sudden Google is telling everyone that you’re closed when you’re not, which can be absolutely detrimental to your business.

So how do you fix this and what you do about it? Well, it should be pretty clear from this graph. You want to find all these instances of Google locating business information that is not consistent with you and making it consistent. You’re going to go out and find the source that’s pointing at one of these variations, fix it so it’s pointing at your own place. Then all of a sudden, instead of taking away from the confidence and trust Google has in your listing, it’s building towards it.

Finding NAP variations

The way that you find these variations can just be through doing some pretty straightforward Google searches. So let's go down here to some examples. You're going to use a quoted search to look for different types of information.

If you already know about some bad information out there, you already know that there’s like an old business name that you used to go under or a variation of your address, you can just do a quoted search for that information directly, find all the sources that bring up that bad information, go to the website, fix it, and then move on. All along you’re building the trust that Google has in your business listing.

For those places that you’re not yet sure about, like I said, there might be some directories that you don’t even know about or there might be some variations of your business information that you might not be aware of, so you can’t search for it directly, the best way to do that is to search for your phone number in different formats.
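As a rough illustration of the phone-number approach, here is a minimal sketch that writes out common formats of a placeholder number as quoted searches you can run by hand. The formats listed are assumptions, and dedicated tools (like the ones mentioned below) cover many more variations:

```python
# Placeholder business phone number; replace with your own.
AREA, PREFIX, LINE = "503", "555", "0123"

# A few common ways the same number gets written out online.
formats = [
    f"({AREA}) {PREFIX}-{LINE}",
    f"{AREA}-{PREFIX}-{LINE}",
    f"{AREA}.{PREFIX}.{LINE}",
    f"{AREA} {PREFIX} {LINE}",
    f"+1 {AREA} {PREFIX} {LINE}",
]

# Print quoted searches to run manually; pair each with an old business name
# or address variation to narrow things down further.
for variant in formats:
    print(f'"{variant}"')
```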

Darren Shaw of Whitespark did a great Whiteboard Friday just a few weeks ago about exactly this, about how to find all of your NAP variations in Google. I recommend reading it if you want to follow through with this.

There are also some tools you can use. Andrew Shotland of Local SEO Guide has a Chrome plug-in, called Local Citation Finder*, that will just open up a whole bunch of different variations of your business information in different Chrome tabs and can really, really help with that.

*Editor’s note: The tool mentioned here is actually called the NAP Hunter Chrome Extension, but Whitespark’s Local Citation Finder tool is another fantastic resource to keep in your local SEO toolkit. 😉

Enterprise-level solutions for cleaning up NAP consistency

So this might work if you’ve got one, a couple dozen, or maybe a hundred or so businesses. You can probably do this by hand and find all these places yourself. But if you have a ton of businesses — a few hundred or even a few thousand businesses — then this is not that scalable all of a sudden, and then it’s time to move to one of these solutions where you’re working with some of the primary data sources.

So these are the sources that provide the information to all of these places. The big ones are Localeze, Neustar, Infogroup, Acxiom, and Factual. We’re getting into the paid options right now. But basically, some of these places have been around for decades. They’re the ones who provided the phone books with their information. You might have gone to Yelp and tried to sign up and saw that you’re already on there and wondered why. This is why. They’re getting their information from these places, and they push out all the information to these other places.

So if you go to one or more of these, either directly or through a service like Moz Local or Yext, you can correct your information on one of these platforms. It’ll push it out to every place on its network. That will correct the information here and, again, which in turn will make its way over to building Google’s confidence in your Google My Business, which will increase the likelihood of it putting your business in its search results.

If one of these places finds an inaccurate listing, it will correct it. So let’s say that this phone book is in Infogroup’s network and it encounters the inaccurate data, it will hit it, fix it, and then all of a sudden instead of hurting you, this is again building confidence in your Google My Business listing.

Another benefit of working with these places is that they’ll get you into places that you probably weren’t even aware of. So, in addition to fixing points of bad data, it’s also creating new points of accurate data that didn’t even exist in the first place, which again build the confidence in your business listing and then increase the likelihood to show up in those local search results.

The last step for this is related to maintenance. This is not a one-time thing. Over time this information can be corrupted, because these places not only get their information from a primary data provider, but they also get their information from each other. In some cases, a primary data source might be crawling the very sites that it indirectly provides information to, and if you've ever played a game of telephone, you know how this will end up. So you do need to go back and revisit these exercises from time to time, looking for business inaccuracies through Google manually or keeping up a relationship with one of these top-level data providers.

So in summary, what you want to do is start here, make sure you…

  1. Have got your Google My Business listing set up,
  2. Find all the variations and inaccuracies in your data,
  3. Fix them, and
  4. Work with a primary data provider to push out the correct information.

And then all of these places will build up the confidence that you’re already providing Google in your Google My Business listing, making it more and more likely for Google to show you in its search results.

Will Intelligent Personal Assistants Replace Websites? (published 6 June 2016)


Intelligent Personal Assistants (IPAs) are capable of radically disrupting the way we search for and consume information on the Internet. The convergence of several trends and technologies has resulted in a new interface through which people will be able to interact with your business. This will have a dramatic impact — if your long-term marketing/business plan doesn’t account for IPAs, you may be in the same boat as those people who said they didn’t need a website in the early 2000s.

Your website is an API to your business

If we look to pre/early Internet, then the primary interface to most businesses was the humble phone. Over the phone you could speak to a business and find out what they had in stock, when they’d be open, whether they had space for your reservation, etc., and then you could go on to order products, ask for directions, or place reservations. The phone was an interface to your business, and your phone line and receptionist were your “API” — the way people interacted with your business.

As the Internet matured and the web gained more traction, it increasingly became the case that your website empowered users to do lots of those same things that they previously did via the phone. They could get information and give you money, and your website became the new “API” for your business, allowing users to interact with it. Notice this didn’t necessitate the death of the phone, but lots of the requests that previously came via phone now came via the web, and there was also a reduction in friction for people wanting to interact with your business (they didn’t have to wait for the phone line to be free, or speak to an actual human!).

Since then, the web has improved as technologies and availability have improved, but fundamentally the concept has stayed the same. Until now.

The 5 tech giants have all built an intelligent personal assistant


Intelligent Personal Assistants apps such as Google Now, Siri, Cortana, and Facebook M — as well as the newer appliances such as Amazon Echo, the new Google Home, and the rumored Apple Siri hardware — are going to have a profound effect on the way people search, the types of search they do, and the way they consume and act upon the results of those searches.

New entries, such as Hound and Viv, show that intelligent personal assistants are growing beyond just something phone makers are adding as a feature, and are becoming a core focus.

In the last couple of years we’ve discussed a variety of new technologies and their impact on search; a number of these are all feeding into the rise of these personal assistants.

Trend 1: More complex searches

The days of searches just being a keyword are long since over. Great improvements in natural language processing, driven by advances in machine learning, have meant that conversational search has become a thing, and we have seen Hummingbird and RankBrain become building blocks of how Google understands and handles queries.

Furthermore, implicit signals have driven the rise of anticipatory queries, with Google Now leading the way in delivering search results based on your context without you needing to ask.

Contributing technologies & trends:

  • Implicit Signals
  • Natural Language
  • Conversational Search
  • Hummingbird & RankBrain

Watch this video of Will Critchlow speaking about these trends to hear more.

Trend 2: More complex results

Search results have moved on from 10 blue links to include the Knowledge Graph, with entities and direct answers being a familiar part of any search result. This has also meant that, since the original Siri, we’ve seen a search interface that doesn’t even do a web search for many queries but instead gives data-driven answers right there in the app. The earliest examples were queries for things like weather, which would turn up a card right there in the app.

Finally, the rise of conversational search has made possible complex compound queries, where queries can be revised and extended to allow the sorting, filtering, and refining of searches in a back-and-forth fashion. This phase of searching used to be something you did by reviewing the search results manually and sifting through them, but now search engines understand (rather than just index) the content they discover and can do this step for you.

Contributing technologies & trends:

  • Entities / Direct Answers
  • Faceted search
  • Data driven answers

You may like Distilled’s Searchscape which has information and videos on these various trends.

Trend 3: Bots, conversational UI, and on-demand UIs

More recently, with the increased interest in bots (especially since Facebook's F8 announcement), we can see a rise in the number of companies investing in various forms of conversational UI (see this article and this one).
Bots and conversational UI provide a new interface which lends itself to all of the benefits provided by natural language processing and ways of presenting data-driven answers.

Note that a conversational UI isn’t limited to purely a spoken or natural language interface, but can also provide an “on demand” UI for certain situations (see this example screenshot from Facebook, or the Siri/Fandango cinema ticket example below).

Contributing technologies & trends:

  • Conversational UI
  • Bots
  • On-demand UIs within the IPA interface

Trend 4: 3rd-party integration

Going back to the first versions of Siri or Google Now, there were no options for 3rd-party developers to integrate. They could only do a limited set of actions based on what Apple or Google had explicitly programmed in.

However, over time, the platforms have opened up more and more, such that apps can now provide functionality within the intelligent personal assistant itself.

Google Now, Amazon Echo, Cortana, and Siri (not quite — but rumored to be coming in June) all provide SDKs (software development kits), allowing 3rd-party developers to integrate into these platforms.

This is an opportunity for all of us to integrate directly into the next-generation search interface.
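To make this concrete, here is a minimal sketch of what such an integration can look like, using the Alexa Skills Kit SDK for Node.js as one example platform. The skill, the intent name, and the spoken answer are all illustrative assumptions rather than anything from the post.

    // A hypothetical "opening hours" handler for an Amazon Echo skill.
    // Assumes the ask-sdk-core package and an AWS Lambda deployment.
    const Alexa = require('ask-sdk-core');

    const OpeningHoursHandler = {
      canHandle(handlerInput) {
        return Alexa.getRequestType(handlerInput.requestEnvelope) === 'IntentRequest'
          && Alexa.getIntentName(handlerInput.requestEnvelope) === 'OpeningHoursIntent';
      },
      handle(handlerInput) {
        // In a real skill this answer would come from your own business data.
        return handlerInput.responseBuilder
          .speak('We are open from nine until five thirty, Monday to Saturday.')
          .getResponse();
      },
    };

    exports.handler = Alexa.SkillBuilders.custom()
      .addRequestHandlers(OpeningHoursHandler)
      .lambda();

The point is less the specific SDK and more that the assistant, not your website, becomes the surface the user interacts with.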

What’s the impact of all this?

More searches as friction reduces

Google published an (under-reported) paper on some of the research and work that went into Google Now, which, when combined with their daily information needs study, indicates how hard they're trying to encourage and enable users to do searches that previously haven't been possible.

Intelligent personal assistants can fulfil more complex search queries, and "always listening" search appliances like Amazon Echo and Google Home remove the friction from searches that were previously "too much work." Together, this means we'll see a rise in search queries that simply wouldn't have happened previously. So rather than cannibalizing web-based searches that came before, a large segment of the queries to IPAs will be wholly new types of searches.

Web rankings get bypassed, go straight to the top

As more and more people search via personal assistants, and with personal assistants trying to deliver answers directly in their interface, we’ll see an increasing number of searches that completely bypass web search rankings. As 3rd-party integration becomes more widespread, there will be an increasing number of dynamic queries that personal assistants can handle directly (e.g. “where can I buy The Martian?,” “flights to Berlin,” or “order a pepperoni pizza”).

This is a massive opportunity — it does not matter how many links and how much great content your competitor has to help them in “classical SEO” if you’ve integrated straight into the search interface and no web search is ever shown to the user. You can be the only search result shown.

The classic funnel gets compressed; checking out via IPAs

This part is probably the most exciting, from my perspective, and I believe is the most important from the impact it’ll have on users and businesses. People have modeled “the funnel” in a variety of different ways over time, but one common way to look at it is:

The search is separate to the browsing/checkout process, and that checkout process happens via a website. Apps have had some impact on this classic picture, but so far it hasn’t been a big part.

However, conversational search/UI combined with the ability for developers to integrate directly into IPAs opens up a huge opportunity to merge the interfaces for the search step and the steps previously fulfilled by the website (browsing and checking out). There are already examples of the funnel being compressed:

In this example, using Siri, you can see I was able to search for movies playing nearby, pick a particular movie and cinema, then pick a particular showing and, finally, I can click to buy, which takes me to the Fandango app. I am most of the way through the checkout process before I leave the intelligent personal assistant app interface. How long until I can do that final step and actually check out inside the personal assistant?

Integrating with intelligent personal assistants currently happens via the app model (i.e. you build an app that provides some functionality to the assistant), but how long until we can integrate without needing to build an app at all, with the intelligent personal assistant providing the framework and primary interface?

Summary

Intelligent Personal Assistants bring together all the recent developments in search technology, and as integration options improve, we will see an increasing number of queries/transactions go end-to-end entirely inside the personal assistant itself.

People will conduct searches, review data, and make purchases entirely inside that one interface, completely bypassing web search (already happening) and even checking out inside the personal assistant (within the next 12 months) and thus bypassing websites.

IPAs represent an absolutely massive opportunity, and it would be easy to underestimate the impact they will have (in the same way many people underestimated mobile initially). If you’ve been on the fence about building an app, you should re-evaluate that decision, with a focus on apps being the way they can integrate into intelligent personal assistants.

What do you think? I’d love to have a discussion in the comments about how everyone thinks this will play out and how it might change the landscape of search.

About Tom Anthony — Tom is Head of R&D at Distilled, spending his time looking at technology trends and leading the team that builds the Distilled ODN SEO split-testing platform. Follow him on Twitter: @TomAnthonySEO.

This post originally appeared on the Moz blog and can be found here.

The post Will Intelligent Personal Assistants Replace Websites? appeared first on Link Promotions Ltd Blog.

]]>
How to Achieve 100/100 with the Google Page Speed Test Tool http://blog.linkpromotions.net/2016/04/06/achieve-100100-google-page-speed-test-tool/ Wed, 06 Apr 2016 13:54:55 +0000 http://blog.linkpromotions.net/?p=285 Website loading speed is a priority for the overall user experience, and it’s also one of the hundreds of SEO ranking factors. Truth is, nowadays people don’t have the patience to wait more than five seconds for a page to load. If your website isn’t loading fast enough, you’ll lose potential customers. With more than […]

The post How to Achieve 100/100 with the Google Page Speed Test Tool appeared first on Link Promotions Ltd Blog.

]]>
Website loading speed is a priority for the overall user experience, and it’s also one of the hundreds of SEO ranking factors. Truth is, nowadays people don’t have the patience to wait more than five seconds for a page to load. If your website isn’t loading fast enough, you’ll lose potential customers.

With more than 50% of online traffic coming from mobile devices, everyone expects a site to load almost instantaneously. With that in mind, in this article, I will show you how we managed to score 100/100 with Google PageSpeed Insights Tool for Monitor Backlinks for both desktop and mobile.

The motivation

Our site was loading quite fast already, but we knew there’s always a way to make it even better.

One day, while playing with the PageSpeed Tool, I noticed Google’s website had a terrible score for mobile devices, 59/100. The desktop version was doing better at 95/100.

[Screenshot: Google's own PageSpeed Insights scores at the time: 59/100 on mobile, 95/100 on desktop]

Maybe they should use their tool to improve their website, right?

That’s what pushed us to make our site load faster and prove you can get 100/100. It’s not an obsession; it’s aiming to be perfect.

We started at 87/100.

Here’s the result we got after implementing some of the techniques I’m about to share with you.

[Screenshot: our PageSpeed Insights result after optimization]

How to make pages load faster

Before I start showing the exact steps we followed, let me tell you that the PageSpeed tool is only a guideline for web performance best practices. It provides recommendations for optimizing your website for page load speed, and achieving favorable results depends on how your server environment is set up.

While some of these steps require technical expertise, others do not. Note that they can be followed using almost any content management system (CMS).

Step #1: Optimize images

[Screenshot: PageSpeed Insights "Optimize images" suggestion]

The PageSpeed Insights Tool suggested that we optimize our images to load faster by reducing their file size. To solve this problem, we did two significant things:

  • Compressed all images using tools like Compressor.io and TinyPNG. These tools are free and can reduce image file size by more than 80% in some cases, without decreasing the quality of the image.
  • Reduced the images to the smallest dimensions we could without decreasing image quality. For example, if we wanted a picture to display at 150x150px on our website, that's exactly the size the picture should be on our server. You should never upload images larger than the size they will render at, nor shrink them using CSS or HTML attributes.

We downloaded each of our images, then manually compressed and resized them. After optimizing these images, it’s best to make a habit of optimizing all the new images you upload to your server. Each new image should be compressed and resized.
Google also offers the option to download your already optimized images, and you can just upload them to your server. You can do the same with JavaScript and CSS.

[Screenshot: PageSpeed Insights option to download optimized image, JavaScript, and CSS resources]
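If you would rather script this than resize every file by hand, the same idea can be automated. Here is a rough sketch using the Node.js "sharp" package; this is an alternative to the manual Compressor.io / TinyPNG workflow described above, and the file names are placeholders.

    // Resize to the exact dimensions the page renders the image at,
    // and re-encode at a lower quality to shrink the file size.
    const sharp = require('sharp');

    sharp('originals/team-photo.jpg')
      .resize(150, 150)                        // never ship an image larger than it displays
      .jpeg({ quality: 80 })                   // modest compression, little visible loss
      .toFile('public/images/team-photo-150.jpg')
      .then(() => console.log('Image optimized'))
      .catch((err) => console.error(err));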

Step #2: Minify CSS & JavaScript

[Screenshot: PageSpeed Insights "Minify CSS and JavaScript" suggestion]

Google was now telling us that we had to minify our JavaScript and CSS files.

The minifying process reduces the sizes of your files by eliminating unnecessary white spaces, characters, and comments from your CSS and JavaScript files. Programmers will often leave many spaces and comments while coding. These can even double the size of your CSS and JavaScript files.


To fix this problem, we installed Gulp.js on our server. The tool automatically creates a new, minified CSS file with the whitespace stripped out, and regenerates it whenever you make changes. In our case, it helped us decrease the size of our main CSS file from approximately 300 KB to 150 KB; the difference was all unnecessary characters. For more instructions on how to remove white spaces, check Google's guide.
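As a rough illustration, a minimal gulpfile for this job might look like the sketch below. The post doesn't name the exact plugins it used, so gulp-clean-css and gulp-uglify here are assumptions, and the src/dist paths are placeholders.

    // gulpfile.js - strip whitespace and comments from CSS and JavaScript
    const gulp = require('gulp');
    const cleanCSS = require('gulp-clean-css');
    const uglify = require('gulp-uglify');

    function styles() {
      return gulp.src('src/css/*.css')
        .pipe(cleanCSS())                  // remove whitespace, comments, etc.
        .pipe(gulp.dest('dist/css'));
    }

    function scripts() {
      return gulp.src('src/js/*.js')
        .pipe(uglify())                    // minify the JavaScript
        .pipe(gulp.dest('dist/js'));
    }

    // "gulp" rebuilds both; "gulp watch" regenerates the files on every change
    exports.default = gulp.parallel(styles, scripts);
    exports.watch = () => gulp.watch(['src/css/*.css', 'src/js/*.js'], exports.default);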

If you are using WordPress, I recommend installing the Autoptimize plugin.

You can also download the optimized files from the PageSpeed Tool. Here’s an example:

Below is the result we got after minifying CSS and JavaScript.

[Screenshot: PageSpeed result after minifying CSS and JavaScript]

Step #3: Leverage browser caching

[Screenshot: PageSpeed Insights "Leverage browser caching" suggestion]

For many website operators, leveraging browser caching is the most challenging part.

To fix this problem, we moved every static file from our website to a CDN (content delivery network).

A CDN is a network of servers located at various sites around the world. They cache a website's static assets, such as images, CSS, and JavaScript files. The CDN stores copies of your website's content on its servers, and when someone lands on your site, the static content is loaded from the server closest to them.

For example, if your website’s main server is from Texas, without a CDN, a visitor from Amsterdam would have to wait for the server to load the site all the way from the U.S.A. With a CDN, your site is loaded from a location that’s closer to the user. In this case, this is a place closer to Amsterdam. Therefore, the website will load faster.

Here’s a visual representation from GTmetrix of how a CDN works.

[Diagram: GTmetrix illustration of how a CDN works]

We moved all images, JavaScript, and CSS files onto the CDN and kept only the HTML file on our main server. Hosting your images on a CDN will make a big difference in how fast your pages load for website visitors.
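The fix described here was the CDN move above. If you serve static files from your own origin instead, the usual way to satisfy the "leverage browser caching" recommendation is to send long-lived Cache-Control headers. Here is a hedged sketch for a Node/Express server; the paths and lifetimes are illustrative, not from the post.

    // Serve versioned static assets with a one-year cache lifetime.
    const express = require('express');
    const app = express();

    app.use('/assets', express.static('public/assets', {
      maxAge: '365d',    // browsers keep the files for a year
      immutable: true,   // and won't revalidate them while fresh
    }));

    app.listen(3000);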

After we did this, the PageSpeed tool still annoyingly suggested that we leverage our browser caching for some third-party resources. Here’s a screenshot:

[Screenshot: PageSpeed Insights still flagging browser caching for third-party scripts]

We fixed the social media scripts problem by replacing the third-party counters with static images hosted on the CDN. Instead of loading scripts that pulled follower counts from Twitter, Facebook, or Google Plus, we hosted those assets ourselves, which fixed the problem.

Even more frustrating than the social media scripts problem was that Google's own Analytics code was slowing down our website.

[Screenshot: PageSpeed Insights flagging the Google Analytics script]

To solve the Google Analytics script problem, we did something rather difficult. As we didn’t want to remove Analytics from our website, we had to find a different solution.

The Analytics code is rarely modified by Google more than once or twice per year. Therefore, Razvan created a script that runs every eight hours to check when the Analytics code was last modified. The script downloads the Analytics code again only if new changes are found. This way, we can host the Analytics JavaScript code on our server without having to load it from Google's servers on every visit.

If no changes have occurred, then the Analytics code will load from the cached version on our CDN.

When Google modifies its JavaScript code again, our server will automatically download the new version and upload it to the CDN. We used this script for all external third-party scripts.
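The post doesn't publish Razvan's script, but the idea is simple enough to sketch: fetch analytics.js on a schedule, compare it with the locally hosted copy, and only overwrite that copy (which the CDN then serves) when Google has actually changed the file. A rough, illustrative Node.js version, with placeholder paths, might look like this.

    // Re-fetch analytics.js every eight hours; update the local copy only when it changes.
    const https = require('https');
    const fs = require('fs');
    const crypto = require('crypto');

    const SOURCE = 'https://www.google-analytics.com/analytics.js';
    const LOCAL_COPY = './public/vendor/analytics.js';

    function sha256(data) {
      return crypto.createHash('sha256').update(data).digest('hex');
    }

    function refreshAnalytics() {
      https.get(SOURCE, (res) => {
        let body = '';
        res.on('data', (chunk) => { body += chunk; });
        res.on('end', () => {
          const current = fs.existsSync(LOCAL_COPY) ? fs.readFileSync(LOCAL_COPY, 'utf8') : '';
          if (sha256(body) !== sha256(current)) {
            fs.writeFileSync(LOCAL_COPY, body);  // the CDN picks up the new file from here
          }
        });
      });
    }

    refreshAnalytics();
    setInterval(refreshAnalytics, 8 * 60 * 60 * 1000);  // the eight-hour cycle described above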

Here’s a screenshot from Pingdom Tools showing how everything loads from the CDN, including the Analytics code. The only file loading from our server is the homepage file, which is only 15.5 KB. Eliminating all third-party scripts hugely improved the overall loading speed.

[Screenshot: Pingdom Tools waterfall showing every asset, including the Analytics code, loading from the CDN]

Step #4: Eliminate render-blocking resources above the fold

[Screenshot: PageSpeed Insights "Eliminate render-blocking JavaScript and CSS" suggestion]

Eliminating render-blocking resources is one of the most complicated parts of improving page load speed, because it requires more technical knowledge. The main problem we had to deal with was moving all JavaScript code out of the header and the body and into the footer at the bottom of pages across the website.

If you are using WordPress, the Autoptimize plugin I suggested above should help you with this task. Check its settings, then uncheck "Force JavaScript in <head>" and check "Inline all CSS."

Step #5: Enable compression

[Screenshot: PageSpeed Insights "Enable compression" suggestion]

The "enable compression" suggestion can be dealt with in your server's settings. If you are not very technical, you can ask your hosting support team to enable GZIP compression for your server.
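On Apache or Nginx this is a small configuration change your host can make for you. If your site happens to run on Node/Express, the equivalent is the compression middleware, sketched here as an illustration rather than something from the post.

    // Gzip responses for clients that advertise support for it.
    const express = require('express');
    const compression = require('compression');

    const app = express();
    app.use(compression());   // compresses responses above the default size threshold

    app.get('/', (req, res) => res.send('This response is gzipped when the client supports it.'));
    app.listen(3000);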

Step #6: Optimize the mobile experience

The mobile experience is all about serving a responsive version of your site across different resolutions, using legible fonts, and having a good navigation system.


You can test how your website looks at different mobile resolutions using Google Chrome. Click on the menu at the top right, and then on "More Tools – Developer Tools." From there you can select different mobile resolutions and see what your website looks like on each.

[Screenshot: Chrome Developer Tools device mode]

In our case, there weren’t many changes to be made.

Conclusion

These are the most important steps we’ve followed to make Monitor Backlinks score 100/100 with Google’s Speed Tool. We didn’t optimize only the homepage. We also optimized an internal page, the free backlinks checker.

[Screenshot: PageSpeed Insights score for the free backlink checker page]

The three most important actions you can take to improve your website are:

  1. Use a CDN (content delivery network).
  2. Fix the render-blocking issues. (Avoid having JavaScript in the body of the page; your JavaScript should be placed at the bottom of the file.)
  3. Optimize the size of images and compress them.

Has your team undertaken such a project for your website? If so, what were your results?

This post was authored by Felix Tarcomnicu and first appeared on Moz.com. The original post can be found here.

The post How to Achieve 100/100 with the Google Page Speed Test Tool appeared first on Link Promotions Ltd Blog.

]]>
Website Showcase – Tattoo 2000 Leicester http://blog.linkpromotions.net/2016/02/19/tattoo-2000-leicester/ Fri, 19 Feb 2016 11:49:44 +0000 http://blog.linkpromotions.net/?p=272 A new website hosted by Link Promotions, we present Tattoo 2000.   After a dispute with his web host who then promptly deleted his website, Dave of Tattoo 2000 Leicester contacted us about building him a new site as he had now dropped off the first page of the search engines for his focus keywords. […]

The post Website Showcase – Tattoo 2000 Leicester appeared first on Link Promotions Ltd Blog.

]]>
A new website hosted by Link Promotions, we present Tattoo 2000.

Tattoo 2000's home page for their newly designed website.

After a dispute with his web host, who then promptly deleted his website, Dave of Tattoo 2000 Leicester contacted us about building him a new site, as he had dropped off the first page of the search engines for his focus keywords. We built the site and now host it for him, so he can happily be found again by people looking for a tattoo shop in Leicester.

By the end of the project we had built a brand new site to meet his requirements, including full design and logo design, Google Maps integration, a Facebook feed and local marketing, as well as giving him control of his site with a user-accessible content management system. This means he can better connect with his customers across a whole range of social and marketing tools, as well as edit the actual content of his site at any time. He quickly started to appear in the search engines for "Tattoo Shop Leicester", and Tattoo 2000 now receive regular bookings just from having this site online with us.

Check out the site for yourself at tattoo2000.net

For more information on Link Promotions’ Web Design Services call our sales team on 01254 433 703 or find out more information here.

The post Website Showcase – Tattoo 2000 Leicester appeared first on Link Promotions Ltd Blog.

]]>
Link Promotions – 7 SEO Tricks You Should Know http://blog.linkpromotions.net/2016/02/17/link-promotions-7-seo-tricks-you-should-know/ Wed, 17 Feb 2016 11:53:28 +0000 http://blog.linkpromotions.net/?p=268 Today we have a short article including Link Promotions’ 7 Tricks for SEO we think you should know. Comment below or interact with us on Twitter for more information. Trick #1: Find Out What Page You Should Optimize Are you wondering what’s the most relevant page on your site for your keyword according to Google? […]

The post Link Promotions – 7 SEO Tricks You Should Know appeared first on Link Promotions Ltd Blog.

]]>
Today we have a short article covering Link Promotions' 7 SEO tricks we think you should know. Comment below or interact with us on Twitter for more information.

Trick #1: Find Out What Page You Should Optimize

Are you wondering which page on your site Google considers most relevant for a given keyword? Use this query: [keyword] site:yoursite.com.


And Google will give you the answer:

[Screenshot: the top result Google returns for the site: query]

Once you know what page is the most relevant to a specific keyword, you can decide whether you want to optimize that page or create a new one.

Trick #2: Find Out How Much Content You Have for a Specific Keyword
Let’s say you and I are trying to rank for the keyword “dog training.” I only have one page of content and you have 2,000. Who do you think is going to rank better? You, of course. By doing the same search I showed you above you can see who has more content around a given keyword.

 

Trick #3: Recover the Link Juice from 404 Error Pages

  1. Check out your Google Webmaster Tools logs and find 404 error pages. These are pages that other websites (or your own website) are linking to but don’t exist on your server.
  2. Do the same analysis using Xenu Link Sleuth.
  3. Use OpenSiteExplorer to get a list of all the links pointing to your site. Click on the “Top Pages” tab to get a list of all the pages on your site that get links from external websites. Export that list of pages and use Xenu to check for 404 errors.
  4. Once you’ve found all the pages that no longer exist, use 301 redirects to send the traffic (and links) from those pages to new pages. For example, let’s say people are linking to you-are-awesme.html (notice that “awesome” is misspelled), but your actual page is you-are-awesome.html. Then what you need to do is redirect traffic from the wrong URL to the right one. Misspellings are not the only cause for getting links to nonexistent pages. Sometimes a page is taken down, sometimes URLs change and sometimes servers don’t work the way they’re supposed to. By using redirects you keep the link juice and your visitors don’t get lost on your site.

 

Trick #4: Include Images and Videos in Your XML Sitemap File
We all know how important XML sitemaps are to help Google find all the content on our sites. But what a lot of people tend to miss is including images and videos in their XML sitemaps, which is an incredible opportunity to get your multimedia content indexed by Google. This website explains how to create video and image sitemaps.
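For reference, an image-extended sitemap entry is just a normal <url> entry with an extra image block under Google's image sitemap namespace. The sketch below builds one as a plain string; the URLs are placeholders.

    // A minimal image-extended sitemap entry. The enclosing <urlset> element must
    // declare xmlns:image="http://www.google.com/schemas/sitemap-image/1.1".
    const entry = `
      <url>
        <loc>https://www.example.com/pool-tables.html</loc>
        <image:image>
          <image:loc>https://www.example.com/images/black-pool-table.jpg</image:loc>
        </image:image>
      </url>`;

    console.log(entry);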

Trick #5: Fix Keyword-Landing Page Congruency Issues
This is a HUGE problem that most websites have. People find a site on Google, visit it, check it out, and leave within three seconds. The verdict: they didn't find what they were looking for. This is how you fix this issue and keep people on your site:

  1. On Google Analytics, pull a report of all the keywords with more than 5 searches in the last 12 months.
  2. Sort them by bounce rate. The goal is to find keywords that bring you a lot of traffic but have a really high bounce rate.
  3. The next step is to figure out which page of your site they landed on when they searched for each keyword. There are two ways to find out: the quicker but less accurate way is the Google query I mentioned at the beginning of this post: [keyword] site:yoursite.com. The other way is to create an advanced segment in Google Analytics. You'll create a segment for every keyword, such as "people who found me searching for 'black pool tables'." Once that advanced segment is selected, go to Content > Top Landing Pages in Google Analytics.
  4. Ask yourself this question: "Why would someone searching for 'black pool tables' leave right after finding this page?" Find ways to make that page more appealing to that audience. Most of the time the answer is obvious, but if it isn't, use a tool like KissInsights on that page to ask your visitors what they were expecting but didn't get.

Trick #6: Choose Your Names Carefully
I’m guilty of not spending enough time thinking about the names for my e-books, tools and blog posts. There are two reasons why names are so important:

  • Great names attract more attention, which results in more traffic, more links, and better organic rankings.
  • When you choose a keyword-rich name, people will use it to link to you. And, as you know, when your keywords are in the anchor text of the links pointing to you, it helps you rank better for that keyword. Let me give you an example. Let’s say I create a tool to check your organic rankings. I could call it “Zeke’s Ranking Tool,” but nobody is searching for that. Instead, I should call it “SEO Ranking Tool.” That way, when people link to it they’ll use “SEO Ranking Tool” in the link text, which will help me rank better for that keyword.

Trick #7: Find Out What Keywords Your Competitors Are Using for Their SEO Campaigns
I’m shocked at how very few people check their competitors keywords when they do keyword research. These are my favorite ways to spy on my competitors’ keywords:

  • Run SEMRush on their sites.
  • Check their page titles and meta descriptions (site:yourcompetitor.com)
  • Check their meta keywords for their top 20 pages on Google.
  • Use OpenSiteExplorer to check their anchor text distribution. If they’re actively doing SEO, looking at the link text for their inbound links is the best way to see what keywords they’re focusing on.

To find out more about Link Promotions’ SEO services call 01254 433 703 or get more contact information here.

The post Link Promotions – 7 SEO Tricks You Should Know appeared first on Link Promotions Ltd Blog.

]]>
New iPhone and Mac due in March http://blog.linkpromotions.net/2016/02/15/new-iphone-and-mac-due-in-march/ Mon, 15 Feb 2016 12:12:43 +0000 http://blog.linkpromotions.net/?p=263 After revealing them at a March 15th event, Apple will start selling the new 4-inch iPhone 5se and Macbook from as early as 18th March, according to Reuters. Apple have suffered their slowest ever increase in iPhone sales since they launched the original in 2007 so are looking in to new avenues to increase sales […]

The post New iPhone and Mac due in March appeared first on Link Promotions Ltd Blog.

]]>
After revealing them at a March 15th event, Apple will start selling the new 4-inch iPhone 5se and MacBook from as early as 18th March, according to Reuters. Apple have suffered their slowest ever increase in iPhone sales since they launched the original in 2007, so are looking into new avenues to increase sales and diversify their product line.


The new iPhone 5se will appeal to those iPhone users unwilling to upgrade to the 6 or 6 plus due to the larger form factor of these newer devices.

The post New iPhone and Mac due in March appeared first on Link Promotions Ltd Blog.

]]>