Audience measurement

Over the past couple of days, some SEOs have been sharing unusual-looking charts from Google Search Console’s Search Analytics report: negative numbers, lines that don’t connect, numbers that are off, and so on. John Mueller from Google even replied.



John Mueller from Google said this:

Forum discussion at Twitter.

Facebook

Ari Nahmani posted on Facebook (public URL) an unusual message that one of his clients received in Google Search Console. It came directly from John Mueller, the same John Mueller we cover here, about a critical and urgent issue with their website.

The email first explains who John is, then describes the site’s issue with search engines: its URLs are being returned with a 400 error, preventing Google and other search engines from crawling the site. It appears the site has something in place that blocks Google and other search engines from crawling it, and Google wanted to let the site owner know.

Here is a copy of the notification in Google Search Console provided by Ari (click to enlarge it):


So there you have it: we always knew Google sent personal messages every now and then, and here is a screenshot of one.

Forum discussion at Facebook.

Search engine

Yesterday at SMX West, Bing announced that it has increased the limit for submitting URLs to its search engine by 1,000x, from 10 URLs per day to 10,000 URLs per day. Bing also said this is a fundamental shift in how search engines discover content and that it reduces the need to crawl websites.

Bing wrote “We believe that enabling this change will trigger a fundamental shift in the way that search engines, such as Bing, retrieve and are notified of new and updated content across the web. Instead of Bing monitoring often RSS and similar feeds or frequently crawling websites to check for new pages, discover content changes and/or new outbound links, websites will notify the Bing directly about relevant URLs changing on their website. This means that eventually search engines can reduce crawling frequency of sites to detect changes and refresh the indexed content.”

What? Did you read that?

Bing removed the public URL submission tool last year and had issues with the private one, so it had to add rate limits there. Now it has expanded the limit from 10 URLs per day to 10,000 URLs per day. This is a big, big change.

But not everyone gets 10,000 URLs per day; it is based on the age of the verified site. Bing wrote “The daily quota per site will be determined based on the site verified age in Bing Webmaster tool, site impressions and other signals that are available to Bing. Today the logic is as follows, and we will tweak this logic as needed based on usage and behavior we observe.”


To submit URLs, go to Bing Webmaster Tools and submit them manually:


Or you can use the API:

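If you want to script your submissions, here is a minimal sketch in Python. The endpoint path and payload shape are assumptions based on the JSON URL Submission API documented in Bing Webmaster Tools; verify the exact path, casing, and your API key in your own account before relying on it.

```python
# Minimal sketch: batch-submit URLs through the Bing Webmaster Tools API.
# The endpoint path and payload keys are assumptions based on Bing's JSON API
# documentation; confirm them against the current docs.
import requests

API_KEY = "YOUR_BING_WEBMASTER_API_KEY"   # placeholder; generate in Bing Webmaster Tools
SITE_URL = "https://www.example.com"      # your verified site (placeholder)

endpoint = f"https://ssl.bing.com/webmaster/api.svc/json/SubmitUrlBatch?apikey={API_KEY}"

payload = {
    "siteUrl": SITE_URL,
    "urlList": [
        "https://www.example.com/new-page",
        "https://www.example.com/updated-page",
    ],
}

response = requests.post(endpoint, json=payload, timeout=30)
response.raise_for_status()
print(f"Submitted {len(payload['urlList'])} URLs, HTTP {response.status_code}")
```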

I still find it weird that Bing thinks it can slowly do away with crawling. Maybe I am reading too much into this?

Forum discussion at Twitter.

5 Easy SEO Wins with Powerful Results

TeamVFM Local SEO Valencia, CA



Search engine optimization, when done correctly, can take a lot of work. This is why so many people are so eager to take shortcuts.

Fortunately, there are some tasks that require much less effort than link building, yet still yield significant gains.

I’m a big fan of efficiency, so I love tactics that deliver a greater return on my investment of time and/or money.

In this article, I’m going to explain five of these tactics which are easy to execute successfully but can deliver powerful results.

These easy SEO wins will help you get more out of your efforts and sprint past your competitors. They will also help you get better results from your other SEO efforts, like link building and content development.

1. Prune Outdated/Low-Quality Content

You probably created all of the content on your website with the best of intentions, but still, it’s almost a certainty that some of it is garbage.

There are a variety of reasons for this, and it happens to the best of us. The solution in many cases is to prune this content. In fact, Danny Goodwin and Loren Baker recently hosted a webinar on exactly this topic.

Some people are hesitant to get rid of any content, no matter the reason. The thinking is generally that it can’t do any harm to leave it there. And Google has reinforced this thinking time and time again.

But the reality is that despite what Google’s representatives say, outdated and/or low-quality content can negatively impact your ranking and traffic.

It probably should impact your credit score too, but apparently, I don’t have the clout necessary to make that happen.

Identifying Content to Prune

Once you’ve worked up the courage to start pruning, the first step is to identify the content that should be deleted.

The easiest and most complete way to do this is to use software like Screaming Frog to crawl your website and generate a list of URLs. This helps to ensure you don’t miss anything.

Next, you’ll need to begin the tedious task of reviewing this list, URL by URL, to determine which content is outdated or low quality. This means you actually need to manually visit each page and review the content.

It may help to prioritize this list. Google Search Console gives you the ability to export a CSV file of the URLs Google has indexed for your website, which you can then sort by traffic.

From here, you’ll want to start evaluating the URLs with no traffic, working your way up.

sorted URLs
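If you prefer to do this sorting in code rather than a spreadsheet, here is a minimal sketch, assuming a Search Console performance export with “Top pages” and “Clicks” columns (column names vary by export, so adjust them to match your CSV):

```python
# Minimal sketch: sort a Search Console page export by traffic so zero-click
# pages surface first. Filename and column names are assumptions.
import pandas as pd

df = pd.read_csv("gsc_pages_export.csv")                 # hypothetical export file

prioritized = df.sort_values("Clicks", ascending=True)   # least traffic first
no_traffic = prioritized[prioritized["Clicks"] == 0]

print(f"{len(no_traffic)} indexed pages with zero clicks to review first")
print(prioritized[["Top pages", "Clicks"]].head(20))
```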

It’s important to point out that a lot of this content you’re deleting can and should be redirected to a stronger, high-quality page.

But don’t fall into the misguided approach of redirecting them to your homepage. If there is a legitimately relevant page on your website, redirect there; otherwise, just let it 404.

But what about the content that’s not a complete dumpster fire, and is still relevant?

2. Improve Quality Content

If you’ve been doing things right, a lot, if not most, of your content should survive the executioner’s blade.

This content should be improved based on your visitors’ needs.

The advantage here is that this content already exists, the URL has a history in Google, and it may even have some inbound links. Because of these factors, it makes a lot more sense to improve that content rather than starting over from scratch.

Depending on circumstances, this might include:

  • Editing your content to improve readability, increase engagement, and make it more comprehensive.
  • Adding relevant and useful media, including images, video, and PDFs.
  • Including original data, research, statistics, and case studies.

We’ll want to prioritize the content to improve based on quick and easy wins. This means we won’t be targeting topics we don’t already rank for, but we also won’t be focused on improving positions we already rank highly for.

So let’s go back to our Google Search Console export and sort the data based on URLs that rank anywhere from Position 5 to 30 in the search results.

sort URLs by position

We’ll then further sort this data by relevance and potential search volume. From here, we will compare these URLs to our competitors who outrank us to identify opportunities to improve.
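As a rough illustration, here is how that filtering might look against the same hypothetical export, using the “Position” and “Impressions” columns as stand-ins for ranking and potential search volume:

```python
# Minimal sketch: isolate pages ranking in positions 5-30 and order them by
# impressions. Filename and column names are assumptions; adjust to your CSV.
import pandas as pd

df = pd.read_csv("gsc_pages_export.csv")                 # hypothetical export file

striking_distance = df[df["Position"].between(5, 30)]
striking_distance = striking_distance.sort_values("Impressions", ascending=False)

print(striking_distance[["Top pages", "Position", "Impressions"]].head(20))
```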

Some of the things we’re looking for could include:

Word Count

Despite what you may have been told, size does matter.

While not a worthwhile metric on its own, it can help to determine how comprehensive several URLs are in comparison to each other.
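A quick way to get those comparative numbers is to pull each competing page and count the words in its visible text. The sketch below assumes the third-party requests and beautifulsoup4 packages and uses placeholder URLs; the counts include navigation and footer text, so treat them as directional only.

```python
# Minimal sketch: compare rough word counts across competing URLs.
import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/guide",        # your page (placeholder)
    "https://competitor.example/guide",     # a competitor (placeholder)
]

for url in urls:
    html = requests.get(url, timeout=15).text
    words = BeautifulSoup(html, "html.parser").get_text(separator=" ").split()
    print(f"{len(words):>6} words  {url}")
```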

Depth

Generally speaking, the top ranking pages across all industries tend to be more comprehensive than those that they outrank. This doesn’t mean that longer content will always win, but it can be a powerful factor.

Does your content effectively and completely answer not only the original query, but also any related questions that may come up as a result?

You need to think about not only the immediate topic, but everything related to the customer journey. This might include:

  • Related definitions
  • Frequently asked questions
  • A summary of relevant laws and regulations
  • Explanation of a process
  • Technical specifications
  • Statistical data
  • Case studies

Readability

How well-written is your content?

This is not something you want to evaluate by gut feel – you need an objective measurement.

  • Yoast gives you a readability score while editing content right in WordPress.
  • SEMrush enables you to test readability both within their platform and with a Chrome add-on that integrates with Google Docs.
  • There are countless other tools available as standalone websites, apps, and add-ons/plugins.
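If you would rather score copy in a script, here is a minimal sketch using the third-party textstat package (not one of the tools above, just an assumption for illustration):

```python
# Minimal sketch: objective readability scores for a block of page copy.
# Requires `pip install textstat`; the filename is a placeholder.
import textstat

text = open("page-copy.txt", encoding="utf-8").read()

print("Flesch Reading Ease:", textstat.flesch_reading_ease(text))    # higher = easier
print("Flesch-Kincaid Grade:", textstat.flesch_kincaid_grade(text))  # approx. US grade level
```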

Your immediate goal is to make your content easier to read than the content that’s outranking you, but that’s just a starting point.

If your competitors’ content reads like someone spilled a bowl of alphabet soup, don’t set out to simply be a little better than them. Your goal should be to blow them away.

Media

Are original and useful images included within your content? How about video and/or audio files?

Images can provide additional context that helps search engines understand what your content is about. So can video, provided that schema is properly used.

But both serve another more important role, and that is to improve the user experience.

Look for opportunities to use media to provide additional information that’s not included in the text.

Both images and video are great at making complex topics easier to understand, but video is particularly effective at keeping visitors on your website longer, which is always a good thing.

It’s always a wise idea to include a watermark on your images to prevent competitors from stealing them.

Sure, you could file a DMCA complaint after the fact, but it’s always easier to avoid the problem in the first place.

3. Update Internal Links

Internal links can be a powerful tactic in your SEO toolbox, but it’s important to review them from time to time.

Your internal links should point to any pages that you want to rank well, and they should be placed on any pages with content relevant to the link destination. Equally important, these links should be direct.

redirects

This is a pretty common problem in websites where content is frequently published, moved, or deleted. The solution is to use a tool like Screaming Frog or SEMrush to crawl your site and identify any redirect chains.
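For a handful of URLs, you can also spot-check redirect chains with a few lines of Python; this sketch uses the requests library and placeholder URLs, and it is no substitute for a full crawl:

```python
# Minimal sketch: report the redirect chain (if any) for each internal link target.
import requests

urls = [
    "https://www.example.com/old-post",     # placeholder URLs to check
    "https://www.example.com/services",
]

for url in urls:
    response = requests.get(url, allow_redirects=True, timeout=15)
    if response.history:                     # each hop before the final response
        chain = [hop.url for hop in response.history] + [response.url]
        print(f"{len(response.history)} redirect(s): " + " -> ".join(chain))
    else:
        print(f"Direct: {url}")
```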

As for managing these internal links, I’m a big fan of automating this task, and this is easy for WordPress websites.

There are several plugins available that enable you to specify certain words/phrases to automatically link to specific URLs.

This allows you to instantly create, edit, and delete links across your entire website, whether you have a few pages or a few million pages.

4. Improve Page Load Speed

The longer a webpage takes to load, the fewer leads and sales you’ll generate. To compound this problem, slower websites also tend not to rank as well compared to faster websites.

This makes page speed monumentally important.

Most websites are painfully slow, but the good news is that it’s relatively easy to improve.

While improving page speed requires a moderate level of technical expertise, I still consider this to be an easy win because the improvements you make will have an immediate and sitewide effect.

I’ll briefly share a few tactics here, but I encourage you to check out another article I wrote explaining how to improve page speed in greater detail.

Dump the Budget Web Hosting

The cheaper web hosts tend to oversell their services, so your website is crammed onto a server with hundreds or even thousands of other websites.

Because these servers often lack the horsepower necessary, the websites they host often suffer in terms of page speed.

Reduce HTTP Calls

Every part of your website (each HTML, CSS, JavaScript, image, video, and any other type of file) requires a separate HTTP request.

Fewer HTTP requests typically means a faster website.

So how do we get there?

The first step is to remove any unnecessary plugins. Then you’ll merge multiple CSS and JavaScript files into a single CSS file and a single JavaScript file.

You should also minimize the number of image files by using CSS to create the desired design effect and/or using sprites to merge multiple frequently used images.
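To make the file-merging step concrete, here is a minimal sketch that concatenates several stylesheets into one; the filenames are placeholders, and in practice a build tool or plugin would also minify the result:

```python
# Minimal sketch: merge several CSS files into one to cut HTTP requests.
# Minification is a separate step, handled by a build tool or plugin.
from pathlib import Path

css_files = ["reset.css", "layout.css", "typography.css"]    # placeholder source files

combined = "\n".join(Path(name).read_text(encoding="utf-8") for name in css_files)
Path("combined.css").write_text(combined, encoding="utf-8")  # one file, one request

print(f"Merged {len(css_files)} stylesheets into combined.css ({len(combined)} bytes)")
```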

Optimize Media Files

Images and videos on many websites tend to be larger than they need to be.

The first step is to make sure your media files are in the ideal format. For example, JPG is best for photographic images, while GIF or PNG are better for images with large areas of solid color.

Then, you’ll need to ensure your media files are properly sized. Images and video should be no larger than they will be displayed.

For example, if a particular image on your website will never be displayed at more than 800px wide, the image file should be 800px wide.

Finally, you’ll need to compress your media files. There are a number of free tools available online for compressing various file types. There are also WordPress plugins that can compress all of the images already on your website.
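As an example of the sizing and compression steps, here is a minimal sketch using the Pillow imaging library; the filenames, the 800px width, and the quality setting are placeholders to adjust for your site:

```python
# Minimal sketch: resize an image to its maximum display width and re-save it
# as an optimized JPEG. Requires `pip install Pillow`.
from PIL import Image

img = Image.open("hero-original.jpg")   # placeholder source image

# Never serve an image larger than it will be displayed (800px wide here).
img.thumbnail((800, 800))               # shrinks in place, preserving aspect ratio

img.save("hero-800w.jpg", "JPEG", quality=80, optimize=True)
```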

These three steps are a good start, but as I mentioned earlier, I highly encourage you to check out my previous article on improving page speed for more tactics and greater detail.

5. Implement Schema Markup

There is no definitive evidence that schema markup has any direct impact on ranking; however, it’s still critical to SEO.

That’s because it has the potential to increase your website’s visibility in the search results, which results in higher click-through rates.

Since most websites today still don’t use schema, this creates a tremendous opportunity for those that do. Take a look at this example and tell me: which result caught your eye first?

schema in SERPs

Fortunately, implementing schema is relatively simple. There are three types, and they are used in different scenarios.

  • Standard schema microdata, which is marked up directly in HTML.
  • JSON-LD, which is added as a JavaScript script block and is the format Google recommends.
  • RDFa, which is used in a variety of different document types including XML, HTML 4, SVG, and many others.

In some cases, you’ll use JSON-LD and add it to your website just like you would any other script; in others, you’ll add microdata to specific elements on your website; and in still others, you might add RDFa to a different document type.
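As a small illustration of the JSON-LD route, the sketch below builds an Article block and wraps it in the script tag you would place in the page’s HTML; the property values are placeholders, and schema.org documents the full vocabulary.

```python
# Minimal sketch: generate a JSON-LD Article block ready to paste into a page.
import json

article_schema = {
    "@context": "https://schema.org",
    "@type": "Article",
    "headline": "5 Easy SEO Wins with Powerful Results",
    "author": {"@type": "Person", "name": "Jane Author"},   # placeholder author
    "datePublished": "2019-02-01",                          # placeholder date
}

script_tag = (
    '<script type="application/ld+json">\n'
    + json.dumps(article_schema, indent=2)
    + "\n</script>"
)
print(script_tag)
```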

Roger Montti wrote a great, in-depth post on schema, so rather than reinventing the wheel here, I’ll just direct you to his article.

But schema goes a lot deeper than where it is today, and I anticipate that it will play a much larger and more direct role in search algorithms, especially as voice search begins to gain traction.

Montti explains in another article how Google is currently using speakable markup, which I believe will become a more prominent factor in search in the coming years.

via: Search Engine Journal: https://www.searchenginejournal.com/easy-seo-wins/303251/?utm_source=email&utm_medium=daily-newsletter&utm_campaign=daily-newsletter#close

 

TeamVFM Local SEO Valencia, CA

Study Finds 96% of Business Locations Aren’t ‘Voice Search Ready’

The rise of virtual assistants and voice search is changing the way consumers seek local business information. Voice search has led to queries that are longer, more conversational and often more specific (e.g. Where is the nearest grocery store?).

Three years ago Google said that 20% of all searches were initiated by voice. Since that time the company has not updated the statistic, though in the interim more than 100 million smart speakers have been sold in the US. There are also more than a billion devices globally that feature the Google Assistant (mostly smartphones).

All of this is driving more voice search volume. But are businesses “voice search ready”?

A new study from Uberall analyzed the voice search readiness (VSR) of 73,000 businesses in the Boston Metro area, ranging from SMBs to large enterprises, all of which had a location. The study utilized “a percentage-based grading system that analyzed a business’s optimized online presence” and data.  Specifically, it took into account:

  • Which directories are most important for feeding voice search platforms (37 of them)
  • The accuracy of multiple categories of business information

By assigning a value to the top pieces of business information, including address, hours of operation, phone number, business name, website and zip code, the study calculated a VSR score from 1-100%.
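Uberall has not published its exact weighting, but a weighted completeness score like the one it describes could look something like the sketch below; the fields and weights here are entirely hypothetical and only illustrate how a 1-100% readiness score can be assembled.

```python
# Illustrative sketch (not the study's actual method): a weighted completeness
# score for a business listing. Fields and weights are hypothetical.
listing_accuracy = {
    "business_name": True,
    "address": True,
    "phone": True,
    "hours": False,       # missing or inaccurate
    "website": True,
    "zip_code": True,
}

weights = {
    "business_name": 25, "address": 20, "phone": 20,
    "hours": 15, "website": 10, "zip_code": 10,
}

score = sum(weights[field] for field, accurate in listing_accuracy.items() if accurate)
print(f"Voice search readiness score: {score}%")   # 85% in this example
```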

The study discovered that an overwhelming 96% of businesses are not voice search ready, and the average VSR score was 44%. Dentists fared best, with an average VSR score of 96.82%, followed by health food (96.6%), home improvement (96.5%), criminal attorneys (91.5%) and dollar stores (90.1%). The bottom five categories averaged a VSR score of less than 2%.

The major contributing factor to low scores is inaccurate business information across online directories. According to the study, a mere 4% of businesses had correct information on the most significant directories (Google, Yelp and Bing).

Out of a possible 2.1 million listings (across directories) for the 73,000 businesses analyzed, there were an overwhelming number of errors, including:

  • 978,305 for hours of operation
  • 710,113 for websites
  • 510,010 for location name
  • 421,048 for addresses

The study concluded with recommendations on how to improve VSR scores. Those included having accurate and complete listings on Google My Business, as well as on other key search and directory sites such as Bing and Yelp. It also recommends getting help from a service provider, making sure listings are correct across channels and using voice-friendly content on sites and in listings.

via: https://www.lsainsider.com/study-finds-96-of-business-locations-arent-voice-search-ready/archives

Contributed by: 

Courtney is the content strategist for the Local Search Association