Google's May 2020 Core Update: Winners, Winnerers, Winlosers, and Why It's All Probably Crap

Posted by Dr-Pete

On May 4, Google announced that they were rolling out a new Core Update. By May 7, it appeared that the dust had mostly settled. Here’s an 11-day view from MozCast: We measured relatively high volatility from May 4-6, with a peak of 112.6° on May 5. Note that the 30-day average temperature prior to May 4 was historically very high (89.3°).

How does this compare to previous Core Updates? With the caveat that recent temperatures have been well above historical averages, the May 2020 Core Update was our second-hottest Core Update so far, coming in just below the August 2018 “Medic” update.

Who “won” the May Core Update? It’s common to report winners and losers after a major update (and I’ve done it myself), but for a while now I’ve been concerned that these analyses only capture a small window of time. Whenever we compare two fixed points in time, we’re ignoring the natural volatility of search rankings and the inherent differences between keywords. This time around, I’d like to take a hard look at the pitfalls. I’m going to focus on winners. The table below shows the 1-day winners (May 5) by total rankings in the 10,000-keyword MozCast tracking set. I’ve only included subdomains with at least 25 rankings on May 4:

Putting aside the usual statistical suspects (small sample sizes for some keywords, the unique pros and cons of our data set, etc.), what’s the problem with this analysis? Sure, there are different ways to report the “% Gain” (such as absolute change vs. relative percentage), but I’ve reported the absolute numbers honestly and the relative change is accurate. The problem is that, in rushing to run the numbers after one day, we’ve ignored the reality that most core updates are multi-day (a trend that seemed to continue for the May Core Update, as evidenced by our initial graph). We’ve also failed to account for domains whose rankings might be historically volatile (but more on that in a bit).

What if we compare the 1-day and 2-day data? Which story do we tell? The table below adds in the 2-day relative percentage gained. I’ve kept the same 25 subdomains and will continue to sort them by the 1-day percentage gained, for consistency: Even just comparing the first two days of the roll-out, we can see that the story is shifting considerably. The problem is: …
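
To make the pitfall concrete, here is a minimal sketch of how a 1-day vs. 2-day relative “% Gain” comparison can be computed from daily ranking counts. It is not MozCast’s actual pipeline, and the subdomains and counts are hypothetical; it only illustrates why the winner list reshuffles as more days of data arrive.

```python
# Minimal sketch (not MozCast's actual pipeline): compare 1-day vs. 2-day
# "winners" from daily ranking counts per subdomain. All counts are hypothetical.

baseline = {"example-a.com": 40, "example-b.com": 25, "example-c.com": 60}  # May 4
day1 = {"example-a.com": 62, "example-b.com": 33, "example-c.com": 66}      # May 5
day2 = {"example-a.com": 48, "example-b.com": 41, "example-c.com": 65}      # May 6

MIN_RANKINGS = 25  # skip subdomains with a thin baseline, as in the table above


def relative_gain(before, after):
    """Relative % gain in ranking counts between two snapshots."""
    return (after - before) / before * 100


rows = [
    {
        "subdomain": sub,
        "gain_1d_pct": relative_gain(count, day1[sub]),
        "gain_2d_pct": relative_gain(count, day2[sub]),
    }
    for sub, count in baseline.items()
    if count >= MIN_RANKINGS
]

# Sort by the 1-day gain, then compare how the 2-day column reshuffles the story.
for row in sorted(rows, key=lambda r: r["gain_1d_pct"], reverse=True):
    print(f"{row['subdomain']:<16} 1-day: {row['gain_1d_pct']:+6.1f}%  2-day: {row['gain_2d_pct']:+6.1f}%")
```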

Read Full Article on Moz

How to Get Quick Results With SEO Sprints: The DriveSafe Case Study

Posted by ChristopherHofman

Currently, many businesses face challenging times and are moving their SEO budgets to disciplines which offer quicker wins. But you can also create instant results with SEO, and it can be done on a small budget, even when you are up against bigger players in your industry. In this blog post I will show you my framework for SEO sprints and how you can use Google’s ability to index and rank faster to your advantage. Later, you will be presented with a case study in which we used SEO sprints for a chain of opticians. The result: a 73% increase in vision test bookings. But first, let's have a look at the layout on page one of Google (for most queries). Google never took SEOs into account when designing for the user. As a result, their transformation over the last few years …

More

Identifying Advanced GSC Search Performance Patterns (and What to Do About Them)

Posted by izzismith

Google Search Console is by far the most used tool in the SEO’s toolkit. Not only does it provide us with the closest understanding we can have of Googlebot’s behavior and perception of our domain properties (in terms of indexability, site usability, and more), but it also allows us to assess the search KPIs that we work so rigorously to improve. GSC is free, secure, easy to implement, and it’s home to the purest form of your search performance KPI data. Sounds perfect, right? However, the lack of capability for analyzing those KPIs at larger scales means we can often miss crucial points that indicate our pages’ true performance. Being limited to 1,000 rows of data per request and restricted filtering makes data refinement and growth discovery tedious (or close to impossible). SEOs love Google Search Console — it has the perfect data — but sadly, it’s …
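
As a hedged sketch of working around that 1,000-row ceiling: the Search Analytics API accepts a rowLimit of up to 25,000 plus a startRow offset, so results can be paged until a short batch comes back. The snippet below assumes the google-api-python-client package and an existing OAuth `creds` object for a verified property; older client versions expose the same query method under the "webmasters"/"v3" service name.

```python
# Hedged sketch: page past the 1,000-row UI export limit with the Search
# Analytics API, which accepts rowLimit (up to 25,000) and a startRow offset.
# Assumes google-api-python-client is installed and `creds` holds valid OAuth
# credentials for a verified property.
from googleapiclient.discovery import build


def fetch_all_rows(creds, site_url, start_date, end_date, dimensions=("query",)):
    service = build("searchconsole", "v1", credentials=creds)
    rows, start_row, page_size = [], 0, 25_000
    while True:
        body = {
            "startDate": start_date,
            "endDate": end_date,
            "dimensions": list(dimensions),
            "rowLimit": page_size,
            "startRow": start_row,
        }
        resp = service.searchanalytics().query(siteUrl=site_url, body=body).execute()
        batch = resp.get("rows", [])
        rows.extend(batch)
        if len(batch) < page_size:  # a short batch means we've reached the last page
            return rows
        start_row += page_size


# Example call (hypothetical property and dates):
# all_rows = fetch_all_rows(creds, "https://www.example.com/", "2020-04-01", "2020-04-30")
```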

More

Holiday marketing: Get the data that puts you ahead of the competition

What you will read in this post:

- Understand holiday season traffic trends
- Optimize for strong SEO and PPC keywords
- Analyzing keyword-driven traffic for seasonal marketing
- Which sites won the most keyword traffic?
- Black Friday marketing: November 2019
- Christmas marketing strategy: December 2019
- Build strong display and referral partnerships
- Analyze historic conversion data
- The biggest display and referral sites (and the brands winning traffic)
- Black Friday marketing strategy: November 2019
- Christmas marketing: December 2019

Already imagining the taste of the delicious holiday meals and the laughter of your kids when the entire family comes together? Sorry, we know you’re a marketer; you don’t have time for that. You’re busy worrying whether you have everything you need so your marketing strategy can ensure the biggest possible chunk of holiday traffic and generate maximum sales. This post investigates seasonal marketing statistics of the past few years and provides some eye-opening insights from SimilarWeb’s …

More

Do Headings Really Impact Rankings?

They say in SEO you need to use headings. Those can be H1, H2, or even H3 tags. But do they really impact your rankings? Sure, a lot of CMS systems put headings on each of your web pages by default. They do this with the title of the page (or blog post) and sometimes with sections within a page. But again, the real question is: do they help with rankings? I decided to run a fun experiment to find out if they really help. How the experiment worked: Similar to past experiments I ran, I reached out to a portion of my email list to ask if they would like to participate, just like I did with the one on blog comment links and this one on link building. 4,104 of you responded wanting to participate. But unlike previous experiments, we only ran this one on websites …
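
The excerpt doesn’t spell out the statistics, so purely as an illustration (not the author’s actual methodology), here is one way such an experiment could be evaluated: compare the change in organic clicks for pages that kept their headings against pages that had them stripped, using a two-sample test. All numbers below are made up.

```python
# Illustrative only (not the article's actual methodology): compare the change
# in organic clicks for pages that kept heading tags vs. pages that had them
# removed. All numbers are made up.
from scipy import stats

# % change in organic clicks over the test window, one value per page
kept_headings = [4.2, -1.1, 6.5, 2.3, 0.8, 5.1, -0.4, 3.9]
removed_headings = [1.8, -2.5, 0.9, -1.2, 2.1, -0.7, 1.4, 0.2]

t_stat, p_value = stats.ttest_ind(kept_headings, removed_headings, equal_var=False)
print(f"Welch's t = {t_stat:.2f}, p = {p_value:.3f}")
# A large p-value would suggest the heading change made no measurable
# difference -- which is exactly the question the experiment asks.
```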

More

Google’s May 2020 Core Update: What You Need to Know

On May 4th, Google started to roll out a major update to its algorithm. They call it a “core” update because it’s a large change to their algorithm, which means it impacts a lot of sites. To give you an idea of how big the update is, just look at the image above. It’s from SEMrush Sensor, which monitors the movement of results on Google. The chart tracks Google on a daily basis, and when it shows green or blue for the day, it means there isn’t much movement going on. But when things turn red, it means there is volatility in the rankings. Now the real question is: what happened to your traffic? If you haven’t already, you should go and check your rankings to see if they have gone up or down. If you aren’t tracking your rankings, you can set up a project on Ubersuggest for …
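
SEMrush Sensor’s exact scoring isn’t public, so the sketch below only illustrates the general idea behind a daily volatility score: track the top results for a set of keywords each day and measure how much of the top list changed since yesterday. The keywords and URLs are hypothetical.

```python
# Hedged sketch: SEMrush Sensor's scoring isn't public, so this only illustrates
# the general idea -- average, across tracked keywords, the share of top results
# that changed since yesterday. Keywords and URLs are hypothetical.

def serp_churn(yesterday, today):
    """Fraction of today's top results that weren't in yesterday's top results."""
    return len(set(today) - set(yesterday)) / len(today)

tracked = {
    "running shoes": (["a.com", "b.com", "c.com"], ["a.com", "d.com", "e.com"]),
    "coffee maker": (["x.com", "y.com", "z.com"], ["x.com", "y.com", "z.com"]),
}

daily_score = sum(serp_churn(prev, curr) for prev, curr in tracked.values()) / len(tracked)
print(f"volatility score: {daily_score:.2f}")  # higher = more movement ("red" days)
```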

More

Should You Test That? When to Engage in SEO Split Tests

Posted by Portent

This blog was written by Tim Mehta, a former Conversion Rate Optimization Strategist with Portent, Inc. Running A/B/n experiments (aka “Split Tests”) to improve your search engine rankings has been in the SEO toolkit for longer than many would think. Moz actually published an article back in 2015 broaching the subject, which is a great summary of how you can run these tests. What I want to cover here is understanding the right times to run SEO split-tests, not how you should run them. I run a CRO program at an agency that’s well-known for SEO. The SEO team brings me in when they are preparing to run an SEO split-test to ensure we are following best practices when it comes to experimentation. This has given me the chance to see how SEOs are currently approaching split-testing, and where we can improve upon the …
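
For a concrete picture of the mechanics (a sketch, not Portent’s process): because you can’t split searchers into cohorts, SEO split tests split similar pages into control and variant buckets and compare organic performance between the buckets. A stable hash keeps the assignment deterministic; the URLs below are hypothetical.

```python
# Sketch of the mechanics (not Portent's process): SEO split tests bucket
# *pages* rather than users, then compare organic performance between buckets.
# A stable hash of the URL keeps assignment deterministic across runs.
import hashlib


def bucket(url, variant_share=0.5):
    digest = int(hashlib.sha256(url.encode()).hexdigest(), 16)
    return "variant" if (digest % 100) / 100 < variant_share else "control"


product_pages = [
    "https://www.example.com/products/widget-1",  # hypothetical URLs
    "https://www.example.com/products/widget-2",
    "https://www.example.com/products/widget-3",
]

for url in product_pages:
    print(bucket(url), url)
# "variant" pages receive the on-page change (e.g., a new title template);
# "control" pages stay untouched and serve as the baseline.
```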

More

Top 10 Changes That Impacted Google My Business in 2020

Posted by ColanNielsen

2020 has been a busy year for Google My Business (GMB). Since January, Google has launched new features, fixed bugs, and had to adapt to the global pandemic. At Sterling Sky, we think it’s important to keep track of all the changes that happen in the local search space in general, and that impact GMB specifically. So far in 2020 we are up to 54 changes. As you can tell, changes that impact Google My Business came at a fast pace — and at high volume — in 2020. In this post, I highlight the changes I think were most important in each month of this year, so far. For an exhaustive list of all the updates that have been made, check out this timeline . January: Google posts borked — hello, 2020! Foreshadowing things to come, GMB started off the year with a major issue in …

More

We Analyzed 11.8 Million Google Search Results. Here’s What We Learned About SEO

We recently analyzed 11.8 million Google search results to answer the question: Which factors correlate with first page search engine rankings? We looked at content. We looked at backlinks. We even looked at page speed. With the help of our data partner Ahrefs, we uncovered some interesting findings. And today I’m going to share what we found with you.

Here is a Summary of Our Key Findings:

1. Our data shows that a site’s overall link authority (as measured by Ahrefs Domain Rating) strongly correlates with higher rankings.
2. Pages with lots of backlinks rank above pages that don’t have as many backlinks. In fact, the #1 result in Google has an average of 3.8x more backlinks than positions #2-#10.
3. Comprehensive content with a high “Content Grade” (via Clearscope) significantly outperformed content that didn’t cover a topic in-depth.
4. We found no correlation between page loading speed …
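
As a hedged aside on how such correlations are typically computed (illustrative, not Backlinko’s actual pipeline): ranking-factor studies usually report a rank correlation, such as Spearman’s rho, between a page-level metric and the Google position it ranks at. The sample values below are invented.

```python
# Illustrative only (not Backlinko's actual pipeline): correlation studies
# usually report a rank correlation between a metric and Google position.
# The sample values below are invented.
from scipy.stats import spearmanr

positions = [1, 2, 3, 4, 5, 6, 7, 8, 9, 10]
referring_domains = [310, 190, 240, 120, 95, 130, 60, 45, 70, 30]

rho, p_value = spearmanr(positions, referring_domains)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
# A strongly negative rho means more referring domains tend to go with better
# (numerically lower) positions -- correlation, not causation.
```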

More

Using the Flowchart Method for Diagnosing Ranking Drops — Best of Whiteboard Friday

Posted by KameronJenkins

Being able to pinpoint the reason for a ranking drop is one of our most perennial and potentially frustrating tasks as SEOs, especially in 2020. There is an unknowable number of factors that go into ranking these days, but luckily the methodology for diagnosing those fluctuations is readily at hand. In this popular Whiteboard Friday, the wonderful Kameron Jenkins shows us a structured way to diagnose ranking drops using a flowchart method and critical thinking. Click on the whiteboard image above to open a high-resolution version in a new tab!

Video Transcription

Hey, everyone. Welcome to this week's edition of Whiteboard Friday. My name is Kameron Jenkins. I am the new SEO Wordsmith here at Moz, and I'm so excited to be here. Before this, I worked at an agency for about six and a half years. I worked in the SEO department, and really a …

More

Subscribe to our newsletter

Join our newsletter and never miss out on trending marketing news.
