He started by finding an offer that resonated with his audience and was relevant to it. In his case, his blog was dedicated to teaching people how to use software called “Sublime Text,” so he simply offered a license to the software as the giveaway prize. By doing this, he not only increased the giveaway's chances of success, since the incentive was relevant, but also ensured the quality of subscribers, since they were people genuinely interested in his content. It's easy to give away an iPad or an iPhone, but how relevant will those subscribers be to you at the end of the day?
Very in-depth information, Brian. I love the part about updating old content. I still find old articles in search results, sometimes 3+ years old, that are clearly out of date when it comes to marketing topics. I usually skip those results and wonder how that content is still ranking; it would be great if everyone updated their content. This entire post is full of useful tips, as usual. I am bookmarking now, and sharing!
Make it as easy as possible for website visitors to connect with you by adding a live chat box to your homepage. Include a name and photo in the chat box so that users know they are talking to a real, live person and not just an automated bot. When there is nobody available to monitor the live chat, be sure to say so with something along the lines of, “Nobody is here right now, but feel free to leave a message and we will get back to you shortly!”
A user-feedback poll is one great, easy way to help you better understand your customers. Kline claims, “Done incorrectly, these can be annoying for a user. Done well, it’s an excellent opportunity to help the customer feel that their opinion matters, while also getting needed insights to better market the company. One poll we ran for an e-commerce client helped us learn that 80% of potential customers cared more about the performance of the product than the price. [So,] we added as much helpful performance information to the website as we could.”
Optimization techniques are highly tuned to the dominant search engines in the target market. The search engines' market shares vary from market to market, as does competition. In 2003, Danny Sullivan stated that Google represented about 75% of all searches. In markets outside the United States, Google's share is often larger, and Google remains the dominant search engine worldwide as of 2007. As of 2006, Google had an 85–90% market share in Germany. While there were hundreds of SEO firms in the US at that time, there were only about five in Germany. As of June 2008, the market share of Google in the UK was close to 90% according to Hitwise. Similar market shares are seen in a number of other countries.
Refreshing historical content is a good thing, especially if some of your content has expired. Note that this does not mean redoing your content, but simply bringing it up to date if it isn't already evergreen. Look for ways to update outdated content on your site to drive more traffic through visibility on search engines like Google.
Google Analytics is an invaluable source of data on just about every conceivable aspect of your site, from your most popular pages to visitor demographics. Keep a close eye on your Analytics data, and use this information to inform your promotional and content strategies. Pay attention to which posts and pages are proving the most popular, and inspect visitor data to see how, where, and when your site traffic arrives.
SEO is also about making your search engine result relevant to the user's search query so that more people click the result when it is shown in search. In this process, snippets of text and metadata are optimized to ensure your snippet of information is appealing in the context of the search query, with the goal of a high click-through rate (CTR) from search results.
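As a rough illustration (not from the original post), one common snippet check can be sketched in Python: pull the meta description out of a page's HTML and flag it when it falls outside the commonly cited 50–160 character range. The length bounds and function names here are illustrative assumptions, not official limits.

```python
from html.parser import HTMLParser

# Hypothetical helper: extract the meta description from a page's HTML.
class MetaDescriptionParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

def check_snippet(html: str) -> str:
    parser = MetaDescriptionParser()
    parser.feed(html)
    desc = parser.description
    if desc is None:
        return "missing"
    # 50-160 characters is a widely used rule of thumb, not an official limit.
    return "ok" if 50 <= len(desc) <= 160 else "check length"

page = ('<head><meta name="description" content="Learn step-by-step SEO '
        'techniques that help your pages earn a higher click-through rate '
        'from search results."></head>')
print(check_snippet(page))  # → ok
```

A check like this can run over every page on a site to catch missing or overlong descriptions before they produce weak snippets in search results.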
Consider outsourcing article writing. If you hate the thought of generating content yourself, or your team is not writing-savvy, consider outsourcing this part of the task. Depending on the length, content, specialization, and quality required, prices can start as low as US$5 per article. However, don't rule out writing your own material: who knows your business, hobby, or club better than you, and who else can express precisely what needs to be said?
Since heading tags typically make text contained in them larger than normal text on the page, this is a visual cue to users that this text is important and could help them understand something about the type of content underneath the heading text. Multiple heading sizes used in order create a hierarchical structure for your content, making it easier for users to navigate through your document.
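As a quick sketch (the function names and the regex-based extraction are illustrative, not from the original text), you can check that heading levels descend without skips, which preserves the hierarchy described above:

```python
import re

# Pull heading levels (h1-h6) out of an HTML string, in document order.
def heading_levels(html: str) -> list[int]:
    return [int(m) for m in re.findall(r"<h([1-6])[^>]*>", html, re.IGNORECASE)]

# A jump of more than one level down (e.g. h1 straight to h3) skips a level
# and weakens the hierarchical structure of the page.
def skipped_levels(levels: list[int]) -> list[tuple[int, int]]:
    return [(a, b) for a, b in zip(levels, levels[1:]) if b - a > 1]

page = "<h1>Guide</h1><h2>Basics</h2><h3>Details</h3><h2>Advanced</h2>"
print(skipped_levels(heading_levels(page)))  # → [] (no skipped levels)
```

Moving back up the hierarchy (h3 to h2, as in the example) is fine; only downward jumps that skip a level are flagged.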
Thank you so much for these great SEO techniques you posted on your blog. I also follow you on YouTube and have listened to almost all of your videos; sometimes I re-listen just to refresh my mind. Because of your techniques, we managed to bring our website to the first pages within a month. Adding external links was something I never imagined would work, but it seems like it is. Anyway, please accept my personal thank-you for coming up with and sharing these techniques. I look forward to your new blog posts and YouTube videos!
Squidoo is a website full of 100% user-generated content that allows you to create what's called a “lens”: a page about a specific topic that you choose to write about (usually something you're knowledgeable in). After creating your lens, other people can find it by searching for terms and keywords related to it. Let me start off by saying that Squidoo is an absolute powerhouse in the search engines. It's very easy to rank Squidoo lenses for competitive terms that would prove a challenge for websites with less authority. Creating a lens on Squidoo gives you two traffic opportunities:
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a robots meta tag (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed, and it instructs the robot as to which pages are not to be crawled. Because a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.
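The robots.txt mechanism described above can be exercised with Python's standard-library parser. The rules below are a hypothetical example blocking a shopping cart and internal search results, along the lines the passage suggests:

```python
from urllib import robotparser

# A hypothetical robots.txt disallowing cart pages and internal search results.
rules = """
User-agent: *
Disallow: /cart/
Disallow: /search
""".strip().splitlines()

# parse() accepts the file's lines directly, so no network access is needed.
rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "/products/widget"))  # → True
print(rp.can_fetch("*", "/cart/checkout"))    # → False
```

Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access control, which is why the meta robots noindex tag exists as a separate mechanism for keeping pages out of the index.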