To skip this post, and simply download the 2014 SEO checklist as a FREE PDF – click here.
SEO isn’t dying – nor has its evolution been a surprise, if you listened to the signals.
Penguin and Panda gave you many clues as to where this was going – and the semantic search goal mentioned across the web in 2013 by Google leadership was even more obvious.
So who benefits from the new algorithm?
Content strategists, digital strategy groups, agile marketing minds and lean UX teams – all of the tribes that saw the combination of inbound marketing and UX being the next generation of Google Search … and pivoted early.
Blogging was a growing trend, social shares continued to build referral traffic, and the more channels and tactics marketers used to distribute content – and the more stories they told – the bigger the reward. As minimalist design and site speed grew in importance, the design minds shared notes with the content experts – and modern SEO was discovered (even before Hummingbird, a change that only amplified this effect).
So what rules were added? Which ones are still around?
Here’s a basic list of known ranking factors that we’ve put together for the modern SEO, web designer, digital marketer, analyst or tech startup who pays attention…
Page access. If a search engine can’t reach a page, it’s impossible to have it indexed. Make sure that your robots.txt file isn’t accidentally blocking important web pages.
Block inappropriate pages. On the other hand, some pages should not be indexed. For example, incomplete pages, working split-test pages or confidential pages. Block them via the robots.txt file or robots meta tag.
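A minimal robots.txt illustrating both points – the paths here are hypothetical, so swap in your own:

```text
# Allow all crawlers by default
User-agent: *
# Block pages that shouldn't be indexed (hypothetical paths)
Disallow: /drafts/
Disallow: /split-tests/
Disallow: /private/
# Make sure you are NOT accidentally blocking important sections -
# a stray "Disallow: /" would block the entire site
Sitemap: http://www.example.com/sitemap.xml
```

To block a single page without touching robots.txt, use the robots meta tag instead: `<meta name="robots" content="noindex, nofollow">`.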
Pagination. Help search engines handle pagination by implementing the rel=”next” and rel=”prev” tags.
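In practice, each page in a paginated series points to its neighbors from the head of the document. A sketch for a hypothetical page 2 of a series:

```html
<!-- In the <head> of http://www.example.com/articles?page=2 -->
<link rel="prev" href="http://www.example.com/articles?page=1">
<link rel="next" href="http://www.example.com/articles?page=3">
<!-- The first page omits rel="prev"; the last page omits rel="next" -->
```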
Redirects. If your page or site has moved to a new location, use a 301 status code: it tells search engines the page has permanently moved to a new URL and redirects visitors automatically, preventing 404 errors. For permanent moves, always use a 301 redirect instead of a 302.
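On an Apache server, a 301 can be set up in .htaccess – these rules are a sketch using hypothetical paths and domains:

```apache
# Redirect a single moved page (301 = permanent)
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Redirect an entire site to a new domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^oldsite\.com$ [NC]
RewriteRule ^(.*)$ http://www.newsite.com/$1 [R=301,L]
```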
404 errors. Keep 404 errors to a minimum. They increase your bounce rate and damage your site’s reputation, both of which Google is now watching closely.
Site speed. Site speed has become a major ranking factor in SERPs. Faster is better. Use the Page Speed Tools in Google Developers to check the loading time of web pages.
Mobile accessibility. Is your website accessible for mobile devices? Is it responsive, adaptive or neither? Google recommends a responsive design, and this doesn’t simply mean code to resize the same content – it means a mobile first approach to digital design.
Duplicate content issues can be solved using many different methods via a 301-redirect, the rel=”canonical” tag, meta robots tag, URL rewriting, etc.
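The rel=canonical fix, for example, is a single tag in the head of each duplicate, pointing at the version you want indexed (URLs here are hypothetical):

```html
<!-- On http://www.example.com/dresses?sort=price and any other duplicate -->
<link rel="canonical" href="http://www.example.com/dresses">
```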
Set a preferred domain for your website (www OR non-www). Choose one and make sure the internal and inbound links use the same format. Use Google Webmaster Tools to set the preferred version…Automatic URL rewriting is also possible.
Domain extension. Local businesses – or even major product / service launches – might want to consider a country-specific domain extension for better local ranking.
Subdomain or subfolder. Subdomains are often seen by Google as separate domains, so it’s often best to use sub-folders when ranking for a finite list of offerings/keywords. If you have several, non-related products you can still opt for a sub-domain though.
A few scenarios where a subdomain makes sense:
- Unique languages for site content. Wikipedia adds two character subdomains, such as en.wikipedia.org or fr.wikipedia.org for English vs. French websites.
- Unique regions for products or services, looking to achieve a local brand. Franchises may follow this model, as it allows local or regional teams to employ core features on the primary domain while managing content that’s unique to their local customer base. Craigslist is an example of how to use this type of subdomain.
- Differing product lines for the same brand. This is more strategic, and where business expansion comes into play – best illustrated with a hypothetical. Say a video game website has built a core audience around discussion of role-playing games (RPGs). Now, with more resources available, the site owners want to become a prime resource for news across the entire video game industry. Since this could disrupt the flow of the primary domain, or alienate some hardcore RPG visitors, a subdomain along the lines of news.videogame.com allows for a strategic and significant distinction from videogame.com/RPG-forum.
- In scenarios 1 through 3, notice a common theme – all involve ample resources and staff to help manage website content, unlike most small businesses or tech startups. The subdirectory could be accused of being less secure, as an attack on a subfolder may expose vulnerabilities on the primary domain. The trade-off is that a subdomain likely needs more coding and duplicated online resources, including file and/or data security.
Descriptive URLs. Use a descriptive page name instead of a random string of numbers and letters – the same goes for subfolders. Oh, and don’t stack keywords into every URL name if they aren’t relevant…you’ll get caught. Instead, try creating a relevant website and products that people actually want.
Hyphens. Unless you want to rank for hashtags, project numbers or people who can’t use the space bar, try using keywords in the URL – splitting each word with hyphens.
- mysite.com/aboutus will rank for “Aboutus”
- mysite.com/about_us will rank for “About_Us”
- mysite.com/about-us will rank for “About Us”
- Example = http://unfunnel.com/wp-content/uploads/product-development-fail.gif = “Product Development Fail”
Parameters. You can help Google and other SERPs handle your site’s parameters via Google Webmaster tools. URL parameters fall into one of two categories:
- Parameters that don’t change page content: Parameters like these are often used to track visits and referrers. They have no effect on the web page’s actual content. For example…http://www.example.com/products/women/dresses?sessionid=12345
- Parameters that change or determine the content of a page: For example, this type of parameter can affect content as follows:
- Sorts (for example, sort=price-ascending)
- Narrows (for example, brand=acme)
- Specifies (for example, dress=1234)
- Translates (for example, lang=fr)
- Paginates (for example, page=2)
- Other changes in content
HTML sitemap. Not only useful for visitors, but also for search engines as it helps the crawlers of SERPs understand your website architecture.
XML sitemap. This sitemap is only visible to search engines. It allows you to add metadata providing additional information about the content of each page, and can contain a maximum of 50,000 URLs.
Image sitemap. This type of sitemap can help any or all of your website images show up in Google Image Search results and get you some extra visitors.
Video sitemap. A video sitemap can help with the creation of rich snippets for your page, index your visual content in Google Video search results, and boost existing SERP rankings for pages or YouTube videos.
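A bare-bones XML sitemap, including the optional metadata tags mentioned above (URLs and dates are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2014-01-15</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
  <url>
    <loc>http://www.example.com/blog/</loc>
    <lastmod>2014-01-10</lastmod>
  </url>
</urlset>
```

Submit the sitemap’s URL via Google Webmaster Tools, or point to it from robots.txt with a `Sitemap:` line.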
Single keyword per page. Make sure every page targets a single keyword – preferably a phrase in the keyword’s unique context (also a case for creating more pages: more pages means more chances to appear in the SERPs).
Keyword Research. Ranking for the correct keywords is very important…and the keyword tool isn’t gone – it simply moved to a new location and changed its name (it now lives on as Google’s Keyword Planner).
Do some research and see which keyword has the highest traffic and the least competition. Balance these two factors and you have a winning focus for any web page, blog post or other digital experience.
Keyword in URL. Use the primary keyword for each page in the URL (page name), within its parameters and the files uploaded to your website.
Keyword in title. Use the primary keyword in the title of the page, preferably in the beginning. This is an important indicator for search engines to categorize a page.
Keyword in headings. Use the primary keyword in one or more header tags (h1, h2, h3…). The H1 tag should only be used once on a page and should be used for its title.
Keyword in content. It’s impossible to write about a topic without using keywords – so don’t worry about density if your page or blog post has a realistic focus. Use the main offering / target / etc. as the primary keyword – including variations and synonyms – in your web copy (like you would in a real conversation about the subject). Try to use your highest-value keyphrase in the first paragraph.
Title. Maximum 70 characters long, with the primary NON-BRAND or NON-PRODUCT-NAME keyword leading the title. That leading keyword is what you’ll rank for most – think of it as roughly 80% of the title’s ranking power, with everything that follows acting as a tie-breaker.
Here’s an example of what every meta title should look like…
(no brand keywords leading – it’s your name, and likely your URL…so it’s the easiest keyword to rank for, and equally, the biggest waste of a meta title intro)
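A sketch of that pattern, with a hypothetical keyword and brand – primary keyword first, brand last:

```html
<title>Agile Marketing Checklist | 25 Steps to Faster Launches | BrandName</title>
```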
Unique value. Creating and distributing content that provides unique value attracts inbound links naturally. It also attracts social shares, which are now among the highest-value link signals.
Content type. Choosing a different content type, e.g. infographic, comic, quiz, etc., can help you not only attract links, but also influence the flow of user traffic, leading them to conversions at every touchpoint. It also can be specified in your code, which shows diversity (and authority) to search engines.
Be Crawlable. Search engines have difficulty crawling certain content types, such as images or Flash content. A combination of HTML, CSS and web fonts can be used to solve certain crawlability problems.
Language targeting: Inform search engines about translated pages using rel=”alternate” hreflang=”x” annotations, which aids your ranking across regions / countries and user-specified languages.
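Each language version links to its alternates from the head of the page; a sketch with hypothetical URLs for an English / French site:

```html
<!-- In the <head> of each language version -->
<link rel="alternate" hreflang="en" href="http://www.example.com/en/">
<link rel="alternate" hreflang="fr" href="http://www.example.com/fr/">
<!-- Fallback for users whose language isn't targeted -->
<link rel="alternate" hreflang="x-default" href="http://www.example.com/">
```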
Freshness. Nothing changed on this one – Google and all other search engines love new content. Regularly posting to your blog / forum / social media / communities, and updating web pages is highly recommended, as Hummingbird only amplified this factor.
Content Length. This correlates with rankings. Try to aim for at least 300 words when creating a web page, blog post, news article or other piece of unique digital content for SERP ranking.
File name. Choose a descriptive file name instead of random words and numbers…no one is searching for Project-57-gamma-alpha-project255 or whatever your media team names images.
Alt-tag. Don’t forget to add an alt-tag with a short description of the image. Remember, focus on the non-brand keyword before stuffing your brand name or product name into the end of this (if you absolutely have to keep it in there).
Size. Keep your images as small as possible to improve page load times. Content delivery networks are very handy (and competitively priced) for keeping load times down.
Display in SERP
Meta description. Change the description of a page in the SERPs by adding a meta description tag. Meta descriptions are NOT a direct ranking factor for Google, BUT A GOOD DESCRIPTION will convince users to click on the link. Don’t forget to use keywords, because matching search terms appear in bold.
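It’s one tag in the head of the page – the copy below is hypothetical, so write your own:

```html
<meta name="description" content="Download the free 2014 SEO checklist - 50+ ranking factors for agile marketers, covering content, links, local and social.">
```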
Structured data. Add structured data to your page that can be used to generate rich snippets. Rich snippets can vastly improve the CTR of your pages. Structured data can be added via microdata, RDFa or microformats.
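For example, marking up a product review with schema.org microdata can earn star-rating rich snippets – the product name and numbers here are hypothetical:

```html
<div itemscope itemtype="http://schema.org/Product">
  <span itemprop="name">Acme Running Shoe</span>
  <div itemprop="aggregateRating" itemscope
       itemtype="http://schema.org/AggregateRating">
    <span itemprop="ratingValue">4.5</span> stars,
    based on <span itemprop="reviewCount">89</span> reviews
  </div>
</div>
```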
Authorship. For all pages and every single blog post, be sure to add authorship information to your back end (and front end, ideally). AuthorRank has recently seen both positive and negative ranking changes, but it remains an increasingly powerful ranking factor nonetheless – thanks largely to Google+ and YouTube.
Thumbnail. Choose an eye-catching thumbnail image for social sharing links – and make sure that it’s at least 200 x 200 pixels.
Title. The social share icon’s title tag acts as both anchor text for your social snippet and the visible title of shared links – in the case of tweets or pins, it’s your only text unless specified otherwise.
URL. The URL of the underlying page. Don’t forget to tag your URL using Google’s URL builder for additional information in Google Analytics, and use bitly or another shortener so social networks can pass the URL through their APIs before the user is prompted to share.
Description. A short description of the content – think of it as social media’s type of meta description for links that are posted. You can use the meta description of the web page for this, if you’d like.
Twitter APIs / Cards. These use Open Graph tags, supplemented with several Twitter-specific tags.
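Putting the four elements above together – thumbnail, title, URL and description – a sketch of the Open Graph tags plus the Twitter-specific supplements (all values hypothetical):

```html
<!-- Open Graph tags -->
<meta property="og:title" content="Ultimate SEO Checklist for Agile Marketers">
<meta property="og:url" content="http://www.example.com/seo-checklist/">
<meta property="og:description" content="50+ ranking factors in one free PDF.">
<meta property="og:image" content="http://www.example.com/img/checklist-thumb.png">
<!-- Twitter-specific supplements -->
<meta name="twitter:card" content="summary">
<meta name="twitter:site" content="@yourhandle">
```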
Optimization of Pages for Local. Optimize your page as you would for any other keyword. Use the location (for example, Memphis) in the page title, URL and content.
Structured data. Location information can also be added to your site’s code – and seen by SERP crawlers using structured data.
Multiple locations. If you have a chain of businesses, create a unique page with a separate URL for each location.
Google+ Places. Create Google+ Places for your organization – merge each of them with your Google+ business page, which you should have synced with your website (and the authorship of the site linked as well).
Local listing. Get links from local listings (Foursquare, Yelp, etc.) to improve your visibility for local search queries – there are also resources like up-and-comer tech company YEXT that can help you with this one.
Inbound Links. Check the number of inbound links, social shares, bookmarks and other referral sources regularly – and optimize for them daily.
Authority. Links from sites with a high domain authority are more valuable, so when optimizing your link profile, these should be at the top of your list.
Anchor text. The anchor text is the text that is used to link to your site. Keywords in an anchor text are very useful, but don’t overdo this. Make sure your link profile looks natural.
Diversity. Try to gain inbound links from multiple, relevant domains instead of only a handful of sites – the more diverse your referring domains, the greater the effect of the link juice passed.
Nofollows. Links that use the rel=”nofollow” attribute pass no PageRank, and keep you from leaking juice through too many links per page (for example, add the nofollow tag to external links when blogging).
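The difference is a single attribute on the anchor tag (URLs here are hypothetical):

```html
<!-- A normal link passes PageRank -->
<a href="http://www.example.com/partner/">Trusted partner</a>
<!-- A nofollowed link passes none -->
<a href="http://cheap-widgets.example.net/" rel="nofollow">External resource</a>
```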
Bad Links. Links from spammy sites can hurt your rankings. Contact the webmaster to remove them, use the Disavow Tool, or use similar methods that will avoid the drop in your page or site ranking in the SERP.
Social Media. Now among the top of ranking factors in Google’s SERP results, there are a number of ways to capitalize on Social SEO growth. Social sharing icons across touchpoints, adding social bookmarks, and publishing visual content with linking are good ways to start. Beyond the obvious, here are the 7 most underrated social media sites for SEO.
Anything we Missed?
Please let us know your additions to this list in the comments below… To keep this list as a free PDF, click here to download the Ultimate SEO Checklist for Agile Marketers.