People love to learn, and webinars are an excellent way to impart your wisdom to your eagerly waiting audience. Combined with an effective social promotion campaign, webinars are a great way to increase traffic to your website. Send out an email a week or so ahead of time, as well as a “last chance to register” reminder the day before the webinar. Make sure to archive the presentation for later viewing, and promote your webinars widely through social media. If you're wondering how to do a webinar, click the link for some tips.

The first relates to internal link structure. I’ve made the mistake you say you’ve seen so often. I have a primary keyword and have used that keyword in the main navigation, linked to a page optimized for that keyword. But I’ve also got a bunch of contextual links in posts pointing to that page, usually with the keyword in the anchor text. I now understand that those internal links aren’t helping much, at least from an SEO perspective. Am I better off removing that keyword and direct link from the menu and simply linking to the page from multiple posts and pages within the site? Or will I get better results leaving it in the main menu and changing the contextual links in the posts to point to a related page with a different keyword?
Sorry for the long comment, I just am really happy to see that after all those years of struggle you finally made a breakthrough, and you definitely deserve it bro. I’ve had my own struggles as well, and just reading this got me a little emotional, because I know what it feels like to never want to give up on your dreams and to always have faith that one day your time will come. It’s all a matter of patience and learning from failures until you get enough experience to become someone who can generate traffic and bring enough value to readers to sustain long-term relationships.

#6 Go on podcasts! In 13 years of SEO and digital marketing, I’ve never had as much bang for the buck. You go on for 20 minutes, get access to a new audience and great natural links on high dwell time sites (hosts do all the work!). Thanks for including this tip Brian, I still don’t think the SEO community has caught on to the benefits of podcast guesting campaigns for SEO and more…it’s changed my business for sure.


By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. In June 2007, The New York Times' Saul Hansell stated Google ranks sites using more than 200 different signals.[26] The leading search engines, Google, Bing, and Yahoo, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization, and have shared their personal opinions.[27] Patents related to search engines can provide information to better understand search engines.[28] In 2005, Google began personalizing search results for each user. Depending on their history of previous searches, Google crafted results for logged in users.[29]
Current search engine optimization focuses on techniques such as making sure that each web page has appropriate title tags and that the content is not "thin" or low-quality. High-quality content is original, authoritative, factual, grammatically correct, and engaging to users. Poorly edited articles with spelling and grammatical errors will be demoted by search engines. For more information on thin content see More Guidance on Building High-quality Sites.
A navigational page is a simple page on your site that displays the structure of your website, and usually consists of a hierarchical listing of the pages on your site. Visitors may visit this page if they are having problems finding pages on your site. While search engines will also visit this page, getting good crawl coverage of the pages on your site, it's mainly aimed at human visitors.
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
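To make that advisory nature concrete, here is a minimal, hedged sketch using Python's standard urllib.robotparser module (the domain, path, and user-agent string are hypothetical placeholders): a well-behaved crawler runs a check like this before fetching a page, while a rogue client simply skips the check and requests the URL anyway.

```python
# Minimal sketch: how a well-behaved crawler consults robots.txt.
# The domain, path, and user-agent below are hypothetical placeholders.
from urllib import robotparser

rp = robotparser.RobotFileParser()
rp.set_url("https://www.example.com/robots.txt")
rp.read()  # fetches and parses the robots.txt file

# A compliant crawler checks before fetching; a rogue one simply ignores this.
if rp.can_fetch("ExampleBot", "https://www.example.com/private/report.html"):
    print("Allowed to crawl")
else:
    print("Disallowed by robots.txt (but the server would still serve the page)")
```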
Brian hello! First off I want to THANK YOU for this fantastic post. I can’t emphasize that enough. I have this bookmarked and keep going through it to help boost our blog. I totally nerded out on this, especially the LSI keywords which made my day. I know, pathetic, right? But when so much changes in SEO all the time, these kinds of posts are so helpful. So thanks for this. So no question – just praise, hope that’s ok 😁
My company has been working on a large link building project. We’ve already performed extensive keyword research and link analysis and now we’re considering executing an email outreach campaign. However, all the content we’ve created up until this point is geared more towards our target audience as opposed to the key influencers of our target audience. Do you think it would be worth it to try to build backlinks to our existing content or are we better off creating new content that directly appeals to the influencers of our target audience?
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters only needed to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
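As a rough illustration of that crawl-and-index cycle, here is a hedged, toy Python sketch built only on the standard library (the seed URL and limits are placeholders, and a real spider would also honour robots.txt, rate limits, deduplication, and much more): it downloads a page, records the words it contains, and schedules the extracted links for a later crawl.

```python
# Toy sketch of the crawl-and-index cycle described above: download a page,
# extract its links and words, and queue the links for later crawling.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
        self.text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

    def handle_data(self, data):
        self.text.append(data)

def crawl(seed_url, max_pages=5):
    queue, index = deque([seed_url]), {}
    while queue and len(index) < max_pages:
        url = queue.popleft()
        if url in index:
            continue
        html = urlopen(url).read().decode("utf-8", errors="ignore")
        parser = LinkExtractor()
        parser.feed(html)
        # "Indexing": record which words appear on the page.
        index[url] = set(" ".join(parser.text).lower().split())
        # Schedule newly discovered links for a later crawl.
        queue.extend(urljoin(url, link) for link in parser.links)
    return index

# Example (placeholder URL): index = crawl("https://www.example.com/")
```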

Promoting your websites by publishing articles to various article directories is by no means a new idea, but it is still an extremely effective way to drive traffic. If you write content and publish it to websites like Article Base and Article Dashboard, website owners will pick it up and post it. This idea is similar to guest blogging, except that you only have to write one piece of content that can end up on hundreds or even thousands of blogs and websites. The same rule applies here: don’t be boring – be creative and interesting, and use common keywords in your article and title so website owners can find it!
Search engines may penalize sites they discover using black or grey hat methods, either by reducing their rankings or eliminating their listings from their databases altogether. Such penalties can be applied either automatically by the search engines' algorithms, or by a manual site review. One example was the February 2006 Google removal of both BMW Germany and Ricoh Germany for use of deceptive practices.[54] Both companies, however, quickly apologized, fixed the offending pages, and were restored to Google's search engine results page.[55]

Regarding internal linking, I believe that in the case of two links pointing to an internal page, with one of those links being in the group I mentioned above, they will consider only the one which feeds the algorithm with more information. In sites that have the menu before the content, it will be the second link. I think that’s the smart way for them to analyse all the links to better understand the destination page’s content. And they are smart 😉 .

I love your post. I keep coming back because you always have great content I can use in my business as well as share. Since I own my own Digital Marketing company, I guess you would be one of THE influencers in the Internet Marketing field. I just started my business, and because most influencers on Twitter are talking about Content Marketing, that is what I have been writing about. But my site is only about a month old, so I will just stay consistent in my writing. I’m also in the process of changing my navigation bar so people know how to get to what they want faster. Which would be “what is SEO”, etc. Thanks, and I would love any advice you can give me.
Write articles rich in content. Quality articles will get ranked better in search results. Make sure that your articles address the needs of your readers, and that they can find all of the information they need in one spot. This is the most effective means for increasing traffic to a website; offering people something that they cannot obtain elsewhere, or at least, not to the level of quality that you are offering it.[1]
Very good tips on traffic generation. However, for those with time constraints, back-links are a quick method, along with another easy method, blog commenting. I would second most of the people who commented in support of guest posting. Yahoo Answers may get mixed responses depending on the topic. If you have time, you can also post on related forums and try video marketing.

This toolbar is based on the LRT Power*Trust metric that we’ve been using to identify spammy and great links in LinkResearchTools and Link Detox since 2012, and the free browser extension was launched just recently. It helps you promptly evaluate the power and trustworthiness of a website or page as you browse, far more precisely than Google PageRank ever did.

Page and Brin founded Google in 1998.[23] Google attracted a loyal following among the growing number of Internet users, who liked its simple design.[24] Off-page factors (such as PageRank and hyperlink analysis) were considered as well as on-page factors (such as keyword frequency, meta tags, headings, links and site structure) to enable Google to avoid the kind of manipulation seen in search engines that only considered on-page factors for their rankings. Although PageRank was more difficult to game, webmasters had already developed link building tools and schemes to influence the Inktomi search engine, and these methods proved similarly applicable to gaming PageRank. Many sites focused on exchanging, buying, and selling links, often on a massive scale. Some of these schemes, or link farms, involved the creation of thousands of sites for the sole purpose of link spamming.[25]
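PageRank itself was described in the original Google paper, and a toy version helps show why link exchanges and link farms targeted it: every inbound link passes along a share of the linking page's own rank. The sketch below is a hedged, simplified power-iteration over a made-up four-page graph, not Google's production algorithm.

```python
# Toy power-iteration sketch of PageRank over a hypothetical four-page link graph.
# Damping factor 0.85 follows the original paper; the pages and links are made up.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],  # a page with no inbound links still passes rank along
}

def pagerank(links, damping=0.85, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

print(pagerank(links))  # "c", with the most inbound links, ends up with the highest rank
```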


Make data-driven decisions when optimizing your site - It is never smart to invest in optimizing your website based on hunches and guesses. Traffic insight is only a click away with powerful data sources like Google Analytics and visitor tracking software. However, don’t treat this information like an autopsy - use it to make changes and find the story behind your visitors’ experience on your site. Where are people jumping off the most? What do the trends say about your site’s usability? What hypotheses can be made based on average site visit times and heat maps? Test out your hypothesis with A/B testing first to see if you can convert those visitors, and then make the widespread changes based on what the testing showed works with your customers.
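For the A/B testing step, a simple way to check whether a variant's lift is more than noise is a two-proportion z-test. The sketch below is a minimal illustration with made-up visitor and conversion counts, not a substitute for the reporting built into a dedicated testing tool.

```python
# Hedged sketch of the A/B check mentioned above: compare the conversion
# rates of two page variants with a two-proportion z-test. Numbers are made up.
from math import sqrt
from statistics import NormalDist

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal distribution.
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Example: variant B converted 260 of 4,000 visitors vs. 200 of 4,000 for A.
z, p = two_proportion_z(200, 4000, 260, 4000)
print(f"z = {z:.2f}, p = {p:.4f}")  # roll out B only if p is convincingly small
```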
Thanks for the great post. I am confused about the #1 idea about Wikipedia dead links…it seems like you didn’t finish what you were supposed to do with the link once you found it. You indicated to put the dead link in Ahrefs, and you found a bunch of links for you to contact…but then what? What do you contact them about, and how do you get your page as the link? I’m obviously not getting something 🙁
Take the 10 pillar topics you came up with in Step 1 and create a web page for each one that outlines the topic at a high level -- using the long-tail keywords you came up with for each cluster in Step 2. A pillar page on SEO, for example, can describe SEO in brief sections that introduce keyword research, image optimization, SEO strategy, and other subtopics as they are identified. Think of each pillar page as a table of contents, where you're briefing your readers on subtopics you'll elaborate on in blog posts.

Search engines will usually crawl your articles automatically if they are high quality, but you should also try to submit your blog to search engines like Google, Bing, and Ask. Search engines like Google have already simplified the way of submitting your content. Google Webmaster Tools makes it easy for every webmaster to get their website crawled faster.
You can confer some of your site's reputation to another site when your site links to it. Sometimes users can take advantage of this by adding links to their own site in your comment sections or message boards. Or sometimes you might mention a site in a negative way and don't want to confer any of your reputation upon it. For example, imagine that you're writing a blog post on the topic of comment spamming and you want to call out a site that recently comment spammed your blog. You want to warn others of the site, so you include the link to it in your content; however, you certainly don't want to give the site some of your reputation from your link. This would be a good time to use nofollow.
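As a hedged illustration of that last point, the sketch below (which assumes the third-party beautifulsoup4 package and a hypothetical comment-rendering step) adds rel="nofollow" to every link found in user-submitted comment HTML, so the page stops passing reputation to sites it does not vouch for.

```python
# Minimal sketch (assumes the third-party beautifulsoup4 package): add
# rel="nofollow" to every link in a block of user-submitted comment HTML
# so the page does not pass reputation to sites it doesn't vouch for.
from bs4 import BeautifulSoup

def nofollow_user_links(comment_html: str) -> str:
    soup = BeautifulSoup(comment_html, "html.parser")
    for anchor in soup.find_all("a", href=True):
        anchor["rel"] = "nofollow"
    return str(soup)

print(nofollow_user_links('Great post! Visit <a href="https://example.com/spam">my site</a>'))
# -> Great post! Visit <a href="https://example.com/spam" rel="nofollow">my site</a>
```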
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results.[33] On June 8, 2010 a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts and other content much sooner after publishing than before, Google Caffeine was a change to the way Google updated its index in order to make things show up quicker on Google than before. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..."[34] Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.[35]
While short-tail keywords are often searched more frequently, it is more difficult to rank for them on search engines. Targeting long-tail keywords, on the other hand, gives you a better chance of ranking higher (even on the first page) for queries specific to your products and services—and higher ranking means more traffic. Plus, as search engines and voice-to-text capabilities advance, people are using more specific phrases to search online. There are many free tools available to help you find keywords to target, such as Answer the Public.
Brian, I recently found your blog by following OKDork.com. Just want to say you’re really amazing with the content you put out here. It’s so helpful, especially for someone like me who is just starting out. I’m currently writing posts for a blog I plan to launch later this year. I think my niche is a little too broad and I have to figure out how to narrow it down. I essentially want to write about my current journey of overcoming my fears to start accomplishing the dreams I have for blogging, business, and travel. In doing so, I will share the best tips, tools, and tactics I can find, as well as what worked, what didn’t and why.
It’s an awesome post, which I like the most, and I’m commenting here for the first time. I’m Abhishek, founder of CouponMaal, and I want to know more about the point you made above on relaunching your old posts. What I want to know is: is there any difference between changing the date, time and year when we’re relaunching an old post, OR should we relaunch the old post with the previous date, time and year? I mean, does it matter or not?
Text-based content is all well and good, but video can be a valuable asset in both attracting new visitors and making your site more engaging. Data shows that information retention is significantly higher for visual material than it is for text, meaning that video marketing is an excellent way to grab – and hold – your audience’s attention, and boost traffic to your website at the same time.

If you are using Responsive Web Design, use the meta name="viewport" tag to tell the browser how to adjust the content. If you use Dynamic Serving, use the Vary HTTP header to signal that your content changes depending on the user-agent. If you are using separate URLs, signal the relationship between the two URLs by adding link tags with rel="canonical" and rel="alternate" elements.
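For the Dynamic Serving case, here is a minimal sketch assuming a Flask application (the route and the user-agent check are simplified placeholders): the same URL returns different markup for mobile and desktop visitors, and the Vary: User-Agent header tells crawlers and caches that the response depends on the user-agent.

```python
# Minimal sketch of Dynamic Serving, assuming a Flask app (route and
# user-agent check are simplified placeholders): one URL, two bodies,
# plus a Vary header so crawlers and caches know the response differs.
from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/article")
def article():
    user_agent = request.headers.get("User-Agent", "")
    body = "<p>mobile markup</p>" if "Mobile" in user_agent else "<p>desktop markup</p>"
    resp = make_response(body)
    resp.headers["Vary"] = "User-Agent"  # content varies with the user-agent
    return resp
```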
Create shareable content. In the world of social media, shareable content is king. Your content should be easily share-able so that your readers can spread the word for you. This is a combination of a good headline and an interesting image, as well as a captivating lead-in. All of this creates a perfect bite-sized chunk of your article that others can share through Facebook, Twitter, and other networks.[2]
Expertise and authoritativeness of a site increases its quality. Be sure that content on your site is created or edited by people with expertise in the topic. For example, providing expert or experienced sources can help users understand articles’ expertise. Representing well-established consensus in pages on scientific topics is a good practice if such consensus exists.
Thank you so much for these great SEO techniques you posted on your blog. I also follow you on your youtube and listened to almost all of your videos and sometimes I re-listen just to refresh my mind. Because of your techniques, we managed to bring our website to the first pages within a month. Adding external links was something I never imagined that it would work. But it seems like it is working. Anyway, please accept my personal thank you for coming up with and sharing these techniques. I look forward to your new blog posts and youtube videos!

SEO may generate an adequate return on investment. However, search engines are not paid for organic search traffic, their algorithms change, and there are no guarantees of continued referrals. Due to this lack of guarantees and certainty, a business that relies heavily on search engine traffic can suffer major losses if the search engines stop sending visitors.[61] Search engines can change their algorithms, impacting a website's placement, possibly resulting in a serious loss of traffic. According to Google's CEO, Eric Schmidt, in 2010, Google made over 500 algorithm changes – almost 1.5 per day.[62] It is considered a wise business practice for website operators to liberate themselves from dependence on search engine traffic.[63] In addition to accessibility in terms of web crawlers (addressed above), user web accessibility has become increasingly important for SEO.

Hi Brian! I enjoy reading your posts and use as much info as I possibly can. I build and sell storage sheds and cabins. The problem I have is that there are no top bloggers in my market or wikipedia articles with deadlinks that have to do with my market. 95% of my traffic and sales are generated via Facebook paid advertising. Would love to get more organic traffic and would be interested in your thoughts concerning this.


To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.[47]
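As an illustrative example only, a robots.txt along the lines of the sketch below keeps compliant crawlers out of internal search results and cart pages, matching the cases mentioned above; the paths are hypothetical, and the file simply lives at the root of the domain.

```python
# Illustrative only: write a robots.txt that keeps compliant crawlers out of
# internal search results and session-specific pages, as discussed above.
# The Disallow paths are hypothetical; adjust them to your own URL structure.
ROBOTS_TXT = """\
User-agent: *
Disallow: /search    # internal site-search results
Disallow: /cart      # shopping-cart and other user-specific pages
"""

# robots.txt must sit at the root of the domain, e.g. https://example.com/robots.txt
with open("robots.txt", "w", encoding="utf-8") as fh:
    fh.write(ROBOTS_TXT)
```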