Do not be fooled by those traffic sellers promising thousands of hits an hour. What they really do is load your URL into a program along with a list of proxies, then run the program for a few hours. It looks like people are on your site because your logs show visitors from thousands of different IPs. In reality, your website is just being pinged through the proxies; no one actually sees your site. It is a waste of money.
Content gaps – take an inventory of the site’s key content assets. Is the site missing any foundational or cornerstone content pieces, content types it doesn’t yet offer, or relevant topic areas that haven’t been covered? What topics or content do your competitors cover that you don’t? Can you beat your competitors’ information-rich content assets? A rough sketch of one way to surface gaps appears below, and there are many useful guides on content gap analysis worth consulting.
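As a minimal illustration of the idea, the Python sketch below compares the topics you cover against a competitor's and reports the difference. The keyword lists are hypothetical placeholders; in practice they would come from your keyword research or analytics tooling.

```python
# Minimal content-gap sketch: compare the topics we cover against a
# competitor's. The topic lists here are hypothetical placeholders.

our_topics = {"on-page seo", "robots.txt", "meta tags", "link building"}
competitor_topics = {"on-page seo", "meta tags", "content gap analysis",
                     "site speed", "structured data"}

# Topics the competitor covers that we do not: candidate content gaps.
gaps = competitor_topics - our_topics

# Topics only we cover: potential strengths to double down on.
unique_strengths = our_topics - competitor_topics

print("Possible content gaps:", sorted(gaps))
print("Topics only we cover:", sorted(unique_strengths))
```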
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
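To make that concrete, here is a small Python sketch (the domain and path are hypothetical) showing that robots.txt is only a polite request: a client can check the rules with the standard-library parser and still fetch the "blocked" URL directly, because nothing on the server side enforces the file.

```python
# Sketch: robots.txt is advisory, not access control. A well-behaved client
# checks it first, but nothing stops a direct request to a disallowed URL.
# The domain and path below are hypothetical.

import urllib.robotparser
import urllib.request

rp = urllib.robotparser.RobotFileParser()
rp.set_url("https://example.com/robots.txt")
rp.read()  # fetch and parse the robots.txt file

url = "https://example.com/private/report.html"

if rp.can_fetch("*", url):
    print("robots.txt allows crawling this URL")
else:
    print("robots.txt asks crawlers to stay away from this URL")

# A rogue crawler (or any browser) can simply ignore that answer:
# the server will still return the page unless it is protected by
# authentication or other server-side access control.
response = urllib.request.urlopen(url)  # succeeds regardless of robots.txt
print(response.status)
```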

Whatever industry you’re in, chances are there are at least one or two major conventions and conferences that are relevant to your business. Attending these events is a good idea – speaking at them is even better. Even a halfway decent speaking engagement is an excellent way to establish yourself as a thought leader in your industry and gain significant exposure for your site.
Don’t overlook opportunities for SEO - Being visible online doesn't happen by chance. Building a website on an SEO-friendly framework and staying up to date with search trends and algorithms takes strategy and time. Make sure that pages have accurate titles, proper meta tags and relevant keywords. Don’t be fooled into thinking this is a one-time deal, either; SEO takes constant effort to stay competitive and relevant. A simple sketch of a basic on-page check follows below.
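As one hedged example of what "make sure pages have accurate titles and proper meta tags" can look like in practice, this Python sketch fetches a page and reports its title and meta description. The URL is a hypothetical placeholder, and a real audit would also check headings, canonical tags, structured data, and more.

```python
# Minimal on-page check sketch: fetch a page and report its <title> and
# meta description. The URL is a hypothetical placeholder.

from html.parser import HTMLParser
import urllib.request


class HeadChecker(HTMLParser):
    """Collects the <title> text and the meta description of a page."""

    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta_description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.meta_description = attrs.get("content")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data


html = urllib.request.urlopen("https://example.com/").read().decode("utf-8", "replace")
checker = HeadChecker()
checker.feed(html)

print("Title:", checker.title.strip() or "MISSING")
print("Meta description:", checker.meta_description or "MISSING")
```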
Think interviews are only for the big leaguers? You’d be amazed how many people will be willing to talk to you if you just ask them. Send out emails requesting an interview to thought leaders in your industry, and publish the interviews on your blog. Not only will the name recognition boost your credibility and increase traffic to your website, but the interviewee will probably share the content too, further expanding its reach.
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]