Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.
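To make this concrete, here is a minimal sketch of a custom 404 handler in a Python web app. It assumes Flask; the markup and link targets are placeholders for whatever navigation fits your site, not a prescription:

```python
from flask import Flask

app = Flask(__name__)

@app.errorhandler(404)
def page_not_found(error):
    # A friendly page that routes lost visitors back to working content.
    # Returning the 404 status code matters: it tells search engines the
    # URL genuinely does not exist, so the missing page is not indexed.
    body = (
        "<h1>Sorry, that page doesn't exist.</h1>"
        '<p>Try the <a href="/">homepage</a> or one of our '
        '<a href="/popular">most popular articles</a>.</p>'
    )
    return body, 404
```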
Hey Brian. Even though our own website has ranked consistently (for the last 3 years now) at Number 1 on Google for SEO Companies (obviously when searching from London, UK or nearby, that is), I still keep reading other people’s posts and sending my own out when I find a gold nugget. However, within your clearly written article I have noticed multiple golden nuggets, and was very impressed by your ‘thinking outside the box’ approach and the choices you made for this article. Anytime you want a job as head of R&D for SEO at KD Web, you just let me know 😉
We expect advertisements to be visible. However, you should not let advertisements distract users or prevent them from consuming the site content. For example, avoid advertisements, supplemental content, or interstitial pages (pages displayed before or after the content the user expects) that make it difficult to use the website. Learn more about this topic.
Whatever industry you’re in, chances are there are at least one or two major conventions and conferences that are relevant to your business. Attending these events is a good idea – speaking at them is even better. Even a halfway decent speaking engagement is an excellent way to establish yourself as a thought leader in your industry and gain significant exposure for your site.
By relying so much on factors such as keyword density, which were exclusively within a webmaster's control, early search engines suffered from abuse and ranking manipulation. To provide better results to their users, search engines had to adapt to ensure their results pages showed the most relevant search results, rather than unrelated pages stuffed with numerous keywords by unscrupulous webmasters. This meant moving away from heavy reliance on term density to a more holistic process for scoring semantic signals.[13] Since the success and popularity of a search engine is determined by its ability to produce the most relevant results to any given search, poor quality or irrelevant search results could lead users to find other search sources. Search engines responded by developing more complex ranking algorithms, taking into account additional factors that were more difficult for webmasters to manipulate. In 2005, an annual conference, AIRWeb (Adversarial Information Retrieval on the Web), was created to bring together practitioners and researchers concerned with search engine optimization and related topics.[14]

Another important question: would the Facebook audience be interested in your blog to begin with? Create specific Facebook content (video > text posts > images > other) that speaks “Facebook” – simple, informal, fun, controversial – and that speaks to the type of person you want reading your blog. Facebook is not a destination for your blog readers; it’s a funnel to get Facebook readers onto your blog.


You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest this Webmaster Help Center guide on using robots.txt files.
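As an illustration of what such a file does, the sketch below parses a hypothetical robots.txt with Python's standard-library robotparser and checks a couple of URLs against it. The disallowed paths are assumptions made up for the example:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt blocking internal search results and a private area.
ROBOTS_TXT = """\
User-agent: *
Disallow: /search/
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# Well-behaved crawlers check these rules before fetching a URL.
print(parser.can_fetch("*", "https://example.com/private/report.html"))  # False
print(parser.can_fetch("*", "https://example.com/products/widget.html"))  # True
```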
Search engines find and catalog web pages through spidering (also known as web crawling) software. Spidering software "crawls" through the internet and grabs information from websites, which is used to build search engine indexes. Unfortunately, not all search engine spidering software works the same way, so what gives a page a high ranking on one search engine may not necessarily give it a high ranking on another. Note that rather than waiting for a search engine to discover a newly created page, web designers can submit the page directly to search engines for cataloging.
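To illustrate the crawl-and-index loop described above, here is a toy spider using only the Python standard library. It is a sketch, not a real crawler: production spiders also need robots.txt checks, politeness delays, deduplication of near-identical pages, and far more robust parsing. The seed URL and page limit are arbitrary assumptions:

```python
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects href targets from anchor tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(seed, limit=10):
    index, queue, seen = {}, deque([seed]), {seed}
    while queue and len(index) < limit:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except (OSError, ValueError):
            continue  # unreachable or malformed URL; a real spider would log this
        index[url] = html  # a real engine would tokenize and score this text
        extractor = LinkExtractor()
        extractor.feed(html)
        for href in extractor.links:
            absolute = urljoin(url, href)
            if absolute.startswith("http") and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return index
```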

Optimizing a website may involve editing its content, adding content, and modifying HTML and associated coding to both increase its relevance to specific keywords and remove barriers to the indexing activities of search engines. Promoting a site to increase the number of backlinks, or inbound links, is another SEO tactic. By May 2015, mobile search had surpassed desktop search.[3]


When referring to the homepage, a trailing slash after the hostname is optional since it leads to the same content ("https://example.com/" is the same as "https://example.com"). For the path and filename, a trailing slash would be seen as a different URL (signaling either a file or a directory), for example, "https://example.com/fish" is not the same as "https://example.com/fish/".
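A short Python sketch of that distinction (the normalization rule here is an assumption about how you might canonicalize URLs on your own site, not a standard):

```python
from urllib.parse import urlsplit

def canonicalize(url):
    """Normalize only the hostname-root case: an empty path becomes "/".
    Paths such as /fish and /fish/ stay distinct; they are different URLs."""
    parts = urlsplit(url)
    return parts._replace(path=parts.path or "/").geturl()

assert canonicalize("https://example.com") == canonicalize("https://example.com/")
assert canonicalize("https://example.com/fish") != canonicalize("https://example.com/fish/")
```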
Product images. If you think images don't play a role, think again. When many consumers search for products in the search engines, they're looking not only at the "Web" results but also at the "Images" results. If you have quality images of a product on your site, and the file names contain relevant keywords, those images can rank well in search engines. This avenue can drive significant traffic to your site, as potential customers click on an image to find your store.
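One practical piece of this is generating descriptive, keyword-rich file names instead of camera defaults. A minimal sketch (the product name is made up for the example):

```python
import re

def image_filename(product_name, ext="jpg"):
    # Lowercase, replace runs of non-alphanumerics with hyphens, trim the ends.
    slug = re.sub(r"[^a-z0-9]+", "-", product_name.lower()).strip("-")
    return f"{slug}.{ext}"

print(image_filename("Stainless Steel French Press, 34 oz"))
# stainless-steel-french-press-34-oz.jpg (versus something like IMG_4072.jpg)
```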

As of 2009, there are only a few large markets where Google is not the leading search engine. In most cases, when Google is not leading in a given market, it is lagging behind a local player. The most notable example markets are China, Japan, South Korea, Russia and the Czech Republic where respectively Baidu, Yahoo! Japan, Naver, Yandex and Seznam are market leaders.
To give you an example, our domain authority is currently a mediocre 41 due to not putting a lot of emphasis on it in the past. For that reason, we want to (almost) automatically scratch off any keyword with a difficulty higher than 70%—we just can’t rank today. Even the 60% range as a starting point is gutsy, but it’s achievable if the content is good enough.
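In practice that cutoff is trivial to apply to an exported keyword list. A sketch, with made-up keywords and difficulty scores:

```python
# Hypothetical export from a keyword research tool: (keyword, difficulty %).
keywords = [
    ("seo tools", 85),
    ("seo tools for small agencies", 44),
    ("keyword difficulty explained", 58),
    ("link building services", 72),
]

MAX_DIFFICULTY = 70  # our cutoff, given a mediocre domain authority

targets = [(kw, kd) for kw, kd in keywords if kd <= MAX_DIFFICULTY]
for kw, kd in targets:
    print(f"{kd:>3}%  {kw}")
```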
incredible post and just what i needed! i’m actually kinda new to blogging (my first year coming around) and so far my expertise has been in copywriting/seo copywriting. however link building has become tedious for me. your talk about influencing influencers makes perfect sense, but i find it difficult for my niche. my blog is made for “gift ideas” and holiday shoppers, complete with social networks. i get shares and such from my target audience, but i find that my “influencers” (i.e. Etsy, Redbox, Vat19, etc.) don’t allow dofollow links, and i usually can’t find suitable sources. I guess my trouble is just prospecting in general.
Very in-depth information, Brian. I love the part about updating old content; I still find old articles in search results, sometimes from 3+ years ago, that are clearly out of date when it comes to marketing topics. I usually skip those results and wonder how that content is still ranking, but it would be great if everyone updated their content. This entire post is full of useful tips, as usual. I am bookmarking now, and sharing.
SEO is not an appropriate strategy for every website, and other Internet marketing strategies can be more effective, such as paid advertising through pay-per-click (PPC) campaigns, depending on the site operator's goals. Search engine marketing (SEM) is the practice of designing, running, and optimizing search engine ad campaigns.[56] Its difference from SEO is most simply depicted as the difference between paid and unpaid priority ranking in search results. SEM's purpose is prominence more than relevance; website developers should still treat SEM as highly important for visibility, because most users navigate to the primary listings of their search results.[57] A successful Internet marketing campaign may also depend upon building high-quality web pages to engage and persuade, setting up analytics programs to enable site owners to measure results, and improving a site's conversion rate.[58] In November 2015, Google released a full 160-page version of its Search Quality Rating Guidelines to the public,[59] which revealed a shift in its focus towards "usefulness" and mobile search. In recent years the mobile market has exploded, overtaking desktop use: in October 2016, StatCounter analyzed 2.5 million websites and found that 51.3% of page loads came from mobile devices.[60] Google has capitalized on this shift in mobile usage by encouraging websites to use its Search Console and Mobile-Friendly Test, which let site owners check how their pages appear in search results and how mobile-friendly they are.

The idea of “link bait” refers to creating content that is so extremely useful or entertaining it compels people to link to it. Put yourself in the shoes of your target demographic and figure out what they would enjoy or what would help them the most. Is there a tool you can make to automate some tedious process? Can you find enough data to make an interesting infographic? Is there a checklist or cheat sheet that would prove handy to your audience? The possibilities are nearly endless – survey your visitors and see what is missing or lacking in your industry and fill in the gaps.


Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]