Bill Hartzer Wed, 29 Mar 2017 05:45:33 +0000

33 Speakers to Meet and See at SMX West 2017 San Jose Mon, 20 Mar 2017 22:38:02 +0000

Attending SMX West 2017 in San Jose this week? Whenever I attend (and speak) at conferences, I like to take the time before the conference to look at the list of speakers. I’ve been speaking at and attending internet marketing and search marketing conferences for over 16 years now.

I’ve seen a lot of presentations. Some good, some bad. I’ve seen some memorable ones, like the time the speaker hired a professional magician to help with his presentation. Then there was the time that a speaker was going long (he kept talking) and was told to leave and stop speaking when the speakers for the next session walked into the room.

Not in any order of importance, here’s my personal, hand-picked list of speakers to meet and see at SMX West 2017 being held this week in San Jose, California.

Andy Atkins-Kruger, Group CEO WebCertain
Thomas Ballantyne, Director of Marketing Bulwark Exterminating
Ashley Berman Hale, Technical SEO Specialist MobileMoxie
Aaron Bradley, Manager, Web Channel Strategy Electronic Arts
Maddie Cary, Director of Paid Search Point It
Christine Churchill, President KeyRelevance
Janet Driscoll Miller, President and CEO Marketing Mojo
Mona Elesseily, VP Online Marketing Strategy Page Zero Media
Eric Enge, CEO Stone Temple Consulting
Duane Forrester, VP, Industry Insights Yext
Glenn Gabe, President G-Squared Interactive
Brad Geddes, Founder AdAlysis
Greg Gifford, Director of Search & Social DealerOn
Casie Gillette, Director of Online Marketing KoMarketing
Emily Grossman, Mobile Marketing Specialist MobileMoxie
Steven Hammer, President RankHammer
Bill Hunt, President Back Azimuth Consulting
Gary Illyes, Webmaster Trends Analyst Google
Cindy Krum, CEO MobileMoxie
Adam Proehl, Partner NordicClick Interactive
Kristine Schachinger, Owner
Paul Shapiro, Director of Strategy & Innovation Catalyst
Andrew Shotland, President Local SEO Guide
Chris Silver Smith, Founder & CEO Argent Media
Aleyda Solis, International SEO Consultant Orainti
David Szetela, President FMB Media
Shari Thurow, Founder and SEO Director Omni Marketing Interactive
Marcus Tober, Founder/CTO Searchmetrics Inc.
Mark Traphagen, Senior Director of Brand Evangelism Stone Temple Consulting
Matt Van Wagner, President Find Me Faster
Purna Virji, Senior Client Development and Training Specialist, Bing Ads Microsoft
Marty Weintraub, Founder aimClear
Susan Wenograd, Partner & SEM Manager Five Mill, Inc.

Are you a speaker and didn’t make my list this time? Come find me and convince me to attend your session. Give me your business card (yeah, that!). Tell me a funny joke or make me laugh. Introduce yourself. I’ll be hanging out around the Majestic booth, booth 418, at SMX West 2017 San Jose. Oh, and how did I pick this list of speakers? A few hints: I’ve heard them speak (most of them multiple times), I’ve learned from them, and I’ve seen their work. And, again, I was impressed.

Are you attending the conference? Making up your mind on which sessions to see? First, look for the topic of choice–then pick a speaker from my personal, hand-picked list. Also, be sure to drop by the Majestic booth 418 this week for a personal demo.

If you’re looking for an SEO consultant who has over 20 years of organic SEO experience, I’m your guy. I do highly technical SEO audits and am a domain name expert as well. If you’re an SEO agency that’s looking to possibly outsource some work, I do that, too.

New gTLD Domains Win The SEO Hero Contest Wed, 15 Mar 2017 02:01:58 +0000

The results are in for the SEO Hero contest by Wix, and the new gTLD domain names had a strong showing. Wix ran a good old-fashioned SEO contest, where the winner would receive $50,000: on a certain date, after four months’ time, whoever ranked #1 for the phrase SEO Hero would win the prize.

The new gTLD domain names were the real winners here: the top spot clearly goes to SEO-Hero.Tech. Who said that new gTLD domain names can’t rank well in Google?!?

Here’s what Wix said about the contest:

“We’re going to create a new Wix website optimized for the keyword ‘SEO Hero’. Everyone is invited to create their own website (on any platform) and if it ranks the highest for that search term in 4 months’ time, we’ll give them $50,000.”

Just over 3,900 backlinks from 107 domain names are being reported for the winner, SEO-Hero.Tech.


Patrick Stox, “Bean SEO Hero”, posted a chart showing the winners. At this point, I don’t know how it’s going to play out officially, as there is still a chance that someone could be disqualified. Some of the sites that are ranking are, in fact, already disqualified (one because it’s not a new domain name). But here’s what the winners look like:

Here’s what I’m seeing as of this post, from Arkansas:


I asked the winner why a new gTLD domain name was chosen over a .COM domain name:

“I needed SEO Hero as a brand name, not a keyword. .tech available and sounded pretty good for an SEO Tool” — @lightonseo

Google Thinks I Posted Every Post 7 Hours Prior Mon, 13 Mar 2017 14:44:32 +0000

I’ve been watching this interesting Google bug for over a year now. Turns out that Google thinks that I posted every single post 7 hours prior to when I post. Whenever I make a new post here on my WordPress blog, and go find the post in Google’s search results, Google ALWAYS shows that I posted it 7 hours prior–even if I made the post 3 minutes ago.

Let’s look at a specific example: my last post, the presentation I made on technical SEO and SEO audits at the Engage 2017 conference in Portland last week. As I do with a lot of posts, after publishing I used the Google Fetch and Render tool to ask Google to crawl the post, and I requested indexing. See the screen capture below:

Note the time stamp in Google’s Fetch and Render tool, though. It’s not quite right either: it shows Pacific time, but I’m in Central time here in Dallas. I made the post at 9:15am Central, not the Pacific time it is showing.

Here’s where it gets even more bizarre.

Then, I check to see if it’s indexed by searching for the URL in Google, as seen below. This is immediately after requesting indexing:


Even though I *JUST POSTED* the blog post, Google thinks I made the post 7 hours prior to when I posted it. I didn’t know that I could actually predict things now–if something happened and I posted about it, I apparently knew about it 7 hours before the event happened, according to Google.

I’ve looked at many reasons why Google might think that I posted 7 hours prior–even looking at the time stamp in the HTML code of the page, as well as the server time. In fact, I’ve even had my web host check the server date–which they say is correct. But, again, Google still thinks, after all this time, that I posted something 7 hours prior. Is this a Google bug? Or am I missing something, like a time/date setting somewhere that’s wrong?

Even if it’s a time/date stamp that’s on my server or on my WordPress blog, that would technically be something that I can manipulate. I don’t think Google should be relying on what time I say I posted something. Google should use their own date/time of when they crawled the post and make that determination.

Is this a Google bug? If so, it’s been going on for over a year now. If it’s not, and Google’s relying on my site or server to tell them when something was posted, then that’s not right, either, as it can be manipulated.
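One way to narrow down where a wrong timestamp originates is to compare the publish time the page itself declares (WordPress themes and SEO plugins commonly emit an `article:published_time` meta tag in ISO 8601 format) against the clock the web server reports in its HTTP Date header. Here is a minimal sketch of that comparison; the timestamps are made-up examples, not values taken from this site:

```python
from datetime import datetime, timezone
from email.utils import parsedate_to_datetime

def clock_skew_hours(published_iso: str, http_date: str) -> float:
    """Compare a page's declared publish time (ISO 8601, e.g. from an
    article:published_time meta tag) against the server's HTTP Date
    header (RFC 7231 format). Returns the skew in hours; a consistent
    non-zero value suggests a timezone misconfiguration somewhere."""
    published = datetime.fromisoformat(published_iso)
    served = parsedate_to_datetime(http_date)
    # Normalize both to UTC before comparing.
    delta = served.astimezone(timezone.utc) - published.astimezone(timezone.utc)
    return delta.total_seconds() / 3600

# A post stamped 09:15 Central (UTC-5 in mid-March) and served at the
# matching UTC moment shows zero skew:
print(clock_skew_hours("2017-03-13T09:15:00-05:00",
                       "Mon, 13 Mar 2017 14:15:00 GMT"))  # → 0.0
```

If the skew matches a timezone offset (5 or 7 hours, say), the page markup or CMS timezone setting is the likely culprit; if the Date header itself is off, it’s the server clock.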

Update after posting this
So, right after I made THIS post, it turns out that Google thinks that I posted this post 5 hours prior. Take a look:

Between the last post and this post, I changed the date/time on my laptop from PST to CST, as it was set for Pacific time from last week’s visit to Portland. However, I don’t think that Google would be using your own laptop’s time to show when something was posted.

Technical SEO and SEO Audits – Engage 2017 Portland Presentation Mon, 13 Mar 2017 14:11:50 +0000

At the Engage 2017 Portland conference on March 9th, 2017, I had the pleasure of presenting a technical SEO session with Jon Henshaw from Raven Internet Marketing Tools titled “Improving SEO and User Experience – The Technical Side”. My presentation is embedded below, along with transcribed notes, courtesy of Slideshare.

1. Improving SEO and User Experience
The Technical SEO Side: Improving SEO and User Experience

2. About Me
— Senior SEO Consultant,
— Founder, DFWSEM Association (2004)
— US Brand Ambassador,
— Practicing Organic/Natural SEO since 1996
— Formerly Senior VP, Advice Interactive (Advice Local)
— Senior SEO Strategist, Globe Runner
— Director of SEO, Standing Dog

3. Overview
— Before You Begin
— Gather Data
— Analyze
— Present Results

4. Before You Begin – Info to Gather
— Access to site (if possible), log file data
— Google Analytics, Search Console access
— Bing Webmaster Tools access
— Prior history: What SEO was done in past?
— Prior history: Domain Names used
— List of Domain Names owned, redirected
— List of Competitors
— Ask: anything else we need to know?

5. Content – Optimize Content on Site
— Google Search Console – Search Analytics
— Keyword Research
— Current Rankings

6. GSC Search Analytics Pages Trick

7. Featured Snippets Keyword Research

8. Current rankings, Focus on Pages Ranking 5th to 20th

9. Gathering Phase
— Gather the Data
— Save the Data (MS Excel, MS Word)
— Start making notes (Notepad)

10. Google Searches
— search
— Click last page of SERPs to get page count

11. Screaming Frog
— Adjust memory to crawl large sites (over 100k pages)

12. Integrity (aka Scrutiny) – What it does:
— Crawls site, reports redirects, 404 errors
— Mac only, finds more errors


14. Bing Webmaster Tools

15. Bing Webmaster Tools
— Spot issues to fix

16. Google Search Console

17. Google Search Console

18. SEMRush Site Audit

19. Website Log Files
— Weblog Storming, AwStats

20. Website Log Files, Crawl & Analytics
— OnCrawl combines all three

21. Audit Site Structure
— Internal Link Structure
— Manual Review of Site
— URL Hierarchy
— Grouping of Topics
— GA: In-Page Analytics

22. Gather Off-Site Data
— It’s not all about on-site and on-page.

23. Google Search Console Links

24. Ahrefs Links

25. Links

26. Topic Review

27. Majestic Trust Flow, Citation Flow
— Trust Flow = Number of clicks from a seed set of trusted sites to a given URL, or Domain
— Citation Flow = Number of citations to a given URL, or Domain

28. Majestic Anchor Text Review

29. Check Server Headers

30. Page Performance
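The server-header check above can be scripted in a few lines. This is an illustrative sketch only: the `header_warnings` rules are my own assumptions about what an audit might flag, not an exhaustive checklist.

```python
from urllib.request import Request, urlopen

def fetch_headers(url: str) -> dict:
    """Fetch just the response headers for a URL via a HEAD request.
    Usually enough to spot error statuses, caching problems, and
    wrong content types without downloading the page body."""
    req = Request(url, method="HEAD", headers={"User-Agent": "audit-sketch/0.1"})
    with urlopen(req) as resp:
        return {"status": resp.status, **dict(resp.headers)}

def header_warnings(headers: dict) -> list:
    """Flag a few common issues. These checks are illustrative,
    not a complete audit."""
    warnings = []
    if headers.get("status", 200) >= 400:
        warnings.append("error status")
    if "Content-Type" not in headers:
        warnings.append("missing Content-Type")
    if "Cache-Control" not in headers and "Expires" not in headers:
        warnings.append("no caching headers")
    return warnings

# Offline demo with a made-up header set:
print(header_warnings({"status": 200, "Content-Type": "text/html"}))
# → ['no caching headers']
```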

31. Miscellaneous Data to Review
— All sorts of extras

32. Site Speed, DNS Health

33. Site Speed Review – Google Analytics

34. Robots.txt Issues
— Default CMS file?
— Disallowing wrong?
— Using “allow”?
— Not specifying sitemap.xml URL
— Directives conflict with other signals

35. Sitemap Issues
— Google Search Console sitemaps
— New pages not updated on sitemap
— Issues with multiple sitemaps
— No sitemap file?

36. Subdomain Issues
— Duplicate content?
— Using subdomain rather than directory
— Wildcard subdomains turned on?
— www versus non-www issues

37. Canonical Issues
— Review canonical tags
— Not using canonical correctly
— Conflicting signals with canonical, others
— Canonical tags to help with dupe content

38. Use of Structured Data
— Taking advantage of
— Not just for local addresses
— Reviews and Events
— Person, Place, Organization
— Products and Offers

39. Local Listings – Audit the listings
— Google My Business
— Bing Places for Business
— Yahoo! Local

40. Local Citations – Separate Audit
— Are citations correct?
— Same address, suite #, phone number?

41. Analysis Phase
— Analyze the Data
— Make Assumptions & Recommendations
— You’re Fat!
— You need to exercise!
— OK, well maybe not like that…

42. Analysis Phase
— Analyze gathered data
— Look for Obvious Issues
— Look for Odd Data Points
— More notes!

43. Analyze Google Analytics
— Changes over time, year over year data
— Conversion Data
— Bounce Rates
— Pages Per Session
— Drops in Traffic: Panda or Penguin in past few years?

44. Panda or Penguin Issues From Past?
— Verify with SEMRush, Google Analytics
— Moz Google Algorithm Change History
— Panguin Tool

45. Analyze Keyword Data, Rankings
— Look for keyword opportunities Anything ranking on 2nd page?
— Review SEMRush Keyword Data
— Review GSC Search Analytics

46. Analyze On-Site Data
— Title Tags
— Meta Descriptions
— Headings H1, H2, etc.
— Internal Anchor Text
— External Links (outgoing links)
— Google Pages Indexed vs. Crawled

47. Analyze On-Site Data
— Conflicting signals? Overall Topic focus
— Conflicting signals? Robots.txt vs. Canonical vs. Meta Robots vs. Nofollow
— Internal Duplicate Content
— Not Enough Unique Content
— Internal links within content present?
— Review Navigation
— Review Footer, footer links, copyright
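A first pass over the on-site elements in the two slides above (title, meta description, canonical, headings) can be scripted with the standard-library HTML parser. This is a bare-bones sketch on a hypothetical page, not a replacement for a real crawler:

```python
from html.parser import HTMLParser

class OnPageAudit(HTMLParser):
    """Collects the on-page elements reviewed above: title, meta
    description, canonical URL, and headings."""
    def __init__(self):
        super().__init__()
        self.data = {"title": "", "meta_description": "",
                     "canonical": "", "headings": []}
        self._in = None  # tag currently being captured

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1", "h2", "h3"):
            self._in = tag
        elif tag == "meta" and attrs.get("name") == "description":
            self.data["meta_description"] = attrs.get("content", "")
        elif tag == "link" and attrs.get("rel") == "canonical":
            self.data["canonical"] = attrs.get("href", "")

    def handle_data(self, text):
        if self._in == "title":
            self.data["title"] += text
        elif self._in in ("h1", "h2", "h3"):
            self.data["headings"].append((self._in, text.strip()))

    def handle_endtag(self, tag):
        if tag == self._in:
            self._in = None

# Hypothetical page for illustration:
audit = OnPageAudit()
audit.feed('<html><head><title>Widgets</title>'
           '<meta name="description" content="All about widgets.">'
           '<link rel="canonical" href="">'
           '</head><body><h1>Widgets</h1></body></html>')
print(audit.data["headings"])  # → [('h1', 'Widgets')]
```

From here, checking for missing descriptions, multiple H1s, or a canonical pointing at a different URL is a matter of comparing the collected fields.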

48. Analyze Off-Site Data
— Link Velocity Matters (getting new links)
— Review anchor text (over optimized?)
— Diversity of anchor text
— Look for Toxic, Low Quality Links
— Run Link Risk, Link Detox report
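The anchor-text review above boils down to a frequency distribution: what share of the backlink profile uses each anchor? A minimal sketch, with invented anchor texts:

```python
from collections import Counter

def anchor_text_profile(anchors):
    """Share of each anchor text in a backlink profile. A profile
    dominated by one exact-match commercial phrase is the kind of
    over-optimization the audit step above looks for."""
    counts = Counter(a.strip().lower() for a in anchors)
    total = sum(counts.values())
    return {text: round(n / total, 2) for text, n in counts.most_common()}

# Made-up anchor texts for illustration:
profile = anchor_text_profile(
    ["cheap widgets", "cheap widgets", "cheap widgets", "Acme Inc", ""]
)
print(profile)  # → {'cheap widgets': 0.6, 'acme inc': 0.2, '': 0.2}
```

A healthy profile is usually dominated by branded and URL anchors, not one commercial phrase.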

49. Presentation Phase
— Present the Data
— Show the Results
— Make it look great!
— Action Plan
— Implementation Plan

50. Present The Results
— Internal or formal doc needed?
— Who will implement?
— Implement changes in-house? Outsource?
— MS Word doc, spreadsheets with data
— PowerPoint needed for presentation?

51. Present The Results – Document
— Cover Page
— Table of Contents
— Overview (summary, positives, negatives)
— Website Crawlability (404s, 301s, One Page issues)
— Content Optimization (site arch., keyword issues)
— Linking (Internal links, externals links, link profile)
— Overall Recommendations

52. Finally – Additional Lists
— List of Priority Issues
— Present Action Plan for Implementing

53. SEO Audit Toolset
— Google Page Speed Test
— Mobile Friendly Test
— Google Search Console – time downloading a page (if slower than 2 seconds to get the code, they’ll abandon the crawl)
— (diversity of testing environments is helpful… browser types, locations, connection speeds)

54. SEO Audit Toolset
— WebPageTest
— Google Search Console
— Google Analytics
— GPSi page speed insights
— G Structured Data Test
— GSC Disavow
— GSC Fetch & Render
— Marie Haynes’ Disavow Blacklist

55. SEO Audit Toolset
— Google Mobile Test
— Google AMP Test
— Bing Mobile Test
— W3c mobileOK Checker
— Deep Crawl
— Google AW Keyword Planner
— Moz Keyword Explorer
— What is my IP

56. SEO Audit Toolset
— Builtwith
— Server Software Detection
— SEO Server Header Checker Tool
— SSL Server Test powered by Qualys SSL Labs
— Redirect Checker
— Serpstat
— semrush
— GA Code Checker

57. SEO Audit Toolset
— W3c CSS Validator
— W3c Markup Validator
— hreflang testing tool
— tracker map
— OSE, Majestic, Ahrefs, Link Research Tools
— Bing Webmaster Tools
— Kerboo
— Panguin tool
— OnCrawl

18 Speakers to See at Engage 2017 Portland Tue, 07 Mar 2017 17:18:29 +0000

The SEMPDX conference, Engage 2017 Portland, formerly known as SearchFest, is being held on Thursday, March 9, 2017 at the Sentinel Hotel in Portland, Oregon. I’m honored to be speaking again this year at the conference, where I’m presenting on technical SEO and performing a technical SEO audit of your website.

In this session, titled “Improving SEO and User Experience – The Technical Side”, Jon Henshaw from Raven Internet Marketing Tools will show you how to make your pages load like AMP pages without the AMP. Then, I’ll go into technical SEO and auditing a website.

But, enough about our session. When I’m not speaking, I’ll either be hanging out at the Majestic booth (I’m one of their US Brand Ambassadors), or I’ll be attending one of the other sessions. Here are my picks for which speakers you absolutely must see at the Engage conference in Portland this week. (Not in order of importance, and I’ve already mentioned myself and Jon Henshaw above, so I’m not including “us” on the list below.)

So, without further delay, here’s my list of 18 speakers to meet and see at the Engage 2017 Portland this year.

Cindy Krum – MobileMoxie
Dana DiTomaso – Kick Point – @danaditomaso
Mona Elesseily – Page Zero Media – @webmona
Emily Grossman – MobileMoxie – @goutaste
Ross Hudgens – Siege Media – @RossHudgens
AJ Kohn – Blind Five Year Old – @ajkohn
Jennifer Lopez – Welocalize – @jennita
Ian Lurie – Portent – @portentint
Dr. Pete Meyers – Moz – @dr_pete
Manny Rivas – Aimclear – @mannyrivas
Rob Ousbey – Distilled – @RobOusbey
Carolyn Shelby – tronc – @cshel
Cyrus Shepard – Fazillion – @CyrusShepard
Matt Siltala – Avalaunch Media – @Matt_Siltala
Mark Traphagen – Stone Temple Consulting – @marktraphagen
Purna Virji – Microsoft – @purnavirji
Marty Weintraub – Aimclear – @martyweintraub
David Mihm – Tidings – @davidmihm

I’m really looking forward to this year’s Engage 2017 Portland conference. It will be my second trip to Portland.

Google’s List of SEO Bloggers in the Search Results Fri, 03 Mar 2017 04:17:21 +0000

Apparently I’m one of the top SEO bloggers, according to Google. If you search Google for this: list of SEO bloggers, you will see that Google is showing a list carousel of, well, SEO bloggers. And, according to Google, I’m number nine on the list (which puts me in the top 10). Take a look at the screen shot below, which shows the list of SEO bloggers in the search results:


The list is made up of several (actual) news sites, and not what I would call “SEO Bloggers”. So, if you essentially remove all of those news sites from the list, then I am number 2 on the list, behind Brian Solis.

1. Search Engine Land
2. Search Engine Journal
3. Search Engine Roundtable
4. Moz
5. Search Engine Watch
6. TechCrunch
7. Online Newspaper
8. Brian Solis
9. Bill Hartzer
10. SEO By the Sea

So, sure, I guess I could go around now and say that I’m one of the top 10 SEO bloggers on the web according to Google. If I were someone who did all of that self-promotional junk (spam?) that I see so many do, then I guess I would do that. But I’m not going to; I’m not like that. I did, though, just write this blog post, for a few reasons:

– I wanted to point out that I’m on the list. Who wouldn’t?
– Google’s algorithms are not great here, especially if they are going to create a carousel of “bloggers” out of news sites that are obviously not blogs.

Can’t Google distinguish between a blog and a news site? If someone searches Google for “bloggers” it makes sense to me to return a list of blogs, no?

I still like how this sounds:

“Bill Hartzer is a Top 10 SEO Blogger according to Google.”

“Google Names Bill Hartzer to List of Top 10 SEO Bloggers on the Web.”

DMOZ Shutting Down Tue, 28 Feb 2017 23:33:43 +0000

As of March 14, 2017, DMOZ, the Open Directory Project, will no longer be available. The word on the street is: DMOZ is shutting down. If you go to the site’s home page, you’ll see the message from the editors:


Important Notice
As of Mar 14, 2017 will no longer be available.

This truly is the end of an era, where human editors were heavily involved in the editing of the web. If your website didn’t have a listing, approved by a human editor, then it was VERY difficult to get a top search engine ranking at several of the major search engines, including Google. In fact, Google used to put a lot of weight into the human-edited directory DMOZ, so much so that you essentially got the “green light” to rank well if you had a listing.


As a result of Google liking DMOZ so much, they even had their own copy of the site, the Google Directory (which was itself taken down a few years ago). We also had the Yahoo! Directory, which again was highly sought after by SEOs for listings. So much so, that there were several devious things that SEOs would do in order to get a better search engine ranking (and a listing):

– Crawl DMOZ and look for broken links. If you found a link or a listing, sometimes the domain name would be available for registration–and you could purchase the domain name. I remember a time when I would crawl DMOZ and then purchase domain names to either redirect to a current site for the link and trust value, or just to build a site on the domain.

– There were services you could pay for (SEO tools) that would tell you about domain names that were listed in DMOZ. You could then pick up those domain names or sites and redirect them for links and trust value.

– DMOZ finally caught onto the fact that SEOs were gaming their directory for links, so they got increasingly good at making sure each site listed in the directory went to a real live website.

– You could pay DMOZ editors to get a listing.

– One directory editor told me how to get a listing fast: submit it to his category and then he would recategorize it into the right category. That was an SEO guy who was a DMOZ editor.

– The fastest way to get a listing would be to submit to the local category for your city. So, if you were in Dallas, then submit to the Dallas real estate section if the site is a real estate site.

What’s sad here is that we’re now in an era where AI and other computers are doing the categorizing and approving of websites. It’s no longer a volunteer-editor-driven web. Now, only a search engine algorithm decides whether or not a site is trusted. It used to be that a human approved a site, and a search engine trusted that human’s opinion. No longer.

DMOZ is Shutting Down.

On March 13, 2017 DMOZ should just approve all submissions. Especially those that have been waiting 7+ years to get their site approved.

Domain Notification – Domain SEO Service Spam Email Mon, 27 Feb 2017 18:54:59 +0000

I’ve written several times about various domain name email spam (scams) out there. There is one in particular, now, that seems to be very persistent with its sending of emails. The subject of the email that you receive is usually “[your domain] Notification”, where the domain is one that you own. The scam is essentially the same as the others–they’re trying to get you to worry that you have not paid for the “search engine optimization submission expiration of your domain”.

You do NOT need to pay for search engine submission services for your domain name. The search engines started crawling your domain name shortly after you registered it. So, as a result, you do NOT need to pay anything to get your domain name and website into the search engines. However, you *DO* need to pay someone, such as a search engine optimization consultant, to help get your website to show up towards the ‘top of the list’ for your particular targeted keywords.

Paying $64 a year, like this Domain Notification email suggests, is NOT going to get you to the top of the search results. And, it’s not going to allow you to submit your domain name to the search engines. In fact, after reading the email below, I am not sure WHAT service they are actually going to provide for $64.


Here’s what the text of the email says:

“This purchase expiration notification offer advises you about the search engine optimization submission expiration of your domain The information in this purchase expiration offer may contain confidential and/or legally privileged information from the processing department of Domain SEO Service to purchase our search engine traffic generator. We do not register or renew domain names. We sell traffic generator software. We offer a high quality search engine optimization service that keeps your site ranking high. This information is intended only for the use of the individual(s) named above.

Non-completion of your domain name search engine optimization service submission by given expiration date for, may result in the cancelling of this search engine optimization notification offer…”

This was sent to me, the owner of the domain name in question. So, apparently this service does not register or renew domain names; they sell “traffic generator software”.

Did you know that using automated methods of any kind to generate search engine traffic to your website is SPAM? And, if you use this service, or pay for it, the search engines, such as Google, could in fact, BAN your website?

In either case, don’t fall for this scam–don’t pay for traffic generating software, or search engine submission.

The company selling this domain notification scam is, the email associated with that domain name is, and here is the owner, apparently from China:

Registrant Name: JunZhang
Registrant Organization: Zhang Jun
Registrant Street: Xiang Zhou Qu Cui Xiang Jie Dao 142Hao
Registrant City: Zhu Hai
Registrant State/Province: Guang Dong
Registrant Postal Code: 519000
Registrant Country: CN
Registrant Phone: +86.075630651127
Registrant Phone Ext:
Registrant Fax: +86.075630651127

Not All Google Featured Snippets are Informational Fri, 24 Feb 2017 21:30:35 +0000

Ever since Google started introducing Featured Snippets into their search results, I have thought of Google Featured Snippets as being informational and helpful. And non-commercial. Well, today my opinion of featured snippets quickly changed when I saw Google showing a featured snippet for this search query: banking equipment.

For that search query, Google displays this featured snippet:


That “featured snippet” certainly reads more like an ad for SEICO’s banking equipment products than something I would expect: a history of banking equipment, a list of various banking equipment machines (coin sorters, paper currency sorters, ATMs, etc.). But no, somehow Google thinks that it’s more appropriate to show ad-like text than anything informational.

Take a look at what I’m seeing in the search results for “banking equipment”:


I see:
– Google AdWords Ad
– Google AdWords Ad
– Google AdWords Ad
– Google AdWords Ad
– Google Featured Snippet (ad)
– Google Organic Search Result (URL same as featured snippet)
– Wikipedia page

I don’t know if an SEO is behind this particular featured snippet (yes, it’s possible to get Google to show your page as a featured snippet by optimizing the content on the page). But seriously, Google? This featured snippet is far from informational, and it’s a good example of Google Featured Snippets gone bad.

Maybe my mistake here is that I assume that Google Featured Snippets should be informational and non-commercial?

Google says this about featured snippets: “What’s different with a featured snippet is that it is enhanced to draw user attention on the results page. When we recognize that a query asks a question, we programmatically detect pages that answer the user’s question, and display a top result as a featured snippet in the search results.”

In this case, the search query does not ask a question. My question is, though, should we assume that Google featured snippets are non-commercial in nature and answer a question in a certain way, without promoting a certain company and their products or services?

In this particular case, Google has no question answered for the search query (the search query is not a question), and Google promotes one particular company in the featured snippet.

Google: There is No Duplicate Content Penalty Mon, 13 Feb 2017 16:09:19 +0000

According to Gary Illyes, a Webmaster Trends Analyst at Google, the Google search engine has no duplicate content penalty. In a Tweet on February 13, 2017, Gary said the following:

“DYK Google doesn’t have a duplicate content penalty, but having many URLs serving the same content burns crawl budget and may dilute signals.” – Gary Illyes, Google

He also confirmed, during replies to his Tweet, that his statement only refers to internal pages on a website. So, for internal pages on your website, there is no duplicate content penalty. However, if you do have lots of duplicate content on your website, you’re “shooting yourself in the foot” if you don’t deal with that content properly.

Every website, according to Google, has a crawl budget: Google will only crawl a certain number of pages on your website. As a result, if you have too much duplicate content, Googlebot may not be able to crawl all of your pages. And if the duplicates burn through your crawl budget, Google may not see the important pages. That’s just how I interpret this, but it seems logical to me, as that’s how they’re dealing with duplicate content.

In other words, Google is not penalizing your site for this. You’re penalizing yourself. So, Google is “passing the buck”, so to speak, saying that it’s your fault, not theirs.

So how should you deal with duplicate content on your website? How do you make sure that you’re not using up all of your crawl budget? Here is what I recommend:

– First, check for duplicate content. Use a duplicate content checker to scan your site.
– Decide if that dupe content on your site is needed for visitors or if it can be deleted.
– If you can simply remove the dupe content, then go for it. That’s the best option. If the content is not there, then it’s not duplicate.
– See if you can remove some internal links to the duplicate content. The content might still be there, but if it’s not crawlable or accessible via links then that can help. For example, remove the tag pages on a blog. Keep the categories, but ditch the tag pages. Or the archives by date. Those are notorious for creating these issues on your site.
– Use the robots.txt file to stop the search engines from crawling certain pages that are creating duplicate content.
– Use the canonical tag, if necessary, to help with duplicate content. This can be helpful for ecommerce sites, especially if you have products with more than one color, size, or features.
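Whatever tool you use to check for duplicates, the underlying idea is simple: compare pages as sets of overlapping word shingles and measure how much they share. A minimal sketch of that idea, with made-up page text:

```python
def shingles(text, k=4):
    """k-word shingles of a normalized text."""
    words = text.lower().split()
    return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

def similarity(a, b, k=4):
    """Jaccard similarity of two pages' shingle sets; values near
    1.0 indicate near-duplicate content."""
    sa, sb = shingles(a, k), shingles(b, k)
    if not sa or not sb:
        return 0.0
    return len(sa & sb) / len(sa | sb)

# Two near-identical pages, invented for illustration:
page_a = "our widgets are the best widgets money can buy today"
page_b = "our widgets are the best widgets money can buy online"
print(round(similarity(page_a, page_b), 2))  # → 0.75
```

Running a check like this across every pair of pages on a site quickly surfaces the tag pages, date archives, and product-variant pages mentioned above.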

While duplicate content isn’t officially something that Google penalizes, it’s one of the top 5 issues I see whenever I do a technical SEO audit of a website. It could quite possibly be in the top 3, and sometimes it’s the most important issue to take care of.

And while Google says that it’s not a penalty per se–I can tell you that the results of having duplicate content issues on your site act, look, and feel as if it’s a penalty. So, if it looks like it, acts like it, and feels like it, it might as well be a penalty.