Nothing looks sloppier than a website that doesn't follow any sort of style guide. Does your blog section deviate completely from the rest of your website? If so, this could easily throw off your visitors and decrease engagement. Instead, make sure that all of your web pages are consistent in design, font, and even voice; the stylesheet sketch below shows one way to lock down the visual side. For instance, if you use a very formal tone on your homepage but a super casual tone in your blog posts, that inconsistency can weaken your brand.
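One practical way to enforce visual consistency is to define your fonts and colors once, in a shared stylesheet that every page (blog templates included) loads. This is a minimal sketch; the file name, variable names, and values are illustrative assumptions, not a prescription:

```css
/* site.css -- loaded by every page on the site, including blog templates.
   The names and values below are made-up examples. */
:root {
  /* Declare the brand's typography and palette exactly once. */
  --brand-font: Georgia, "Times New Roman", serif;
  --brand-text: #222222;
  --brand-accent: #0b5394;
}

/* Every page that loads this file inherits the same look. */
body {
  font-family: var(--brand-font);
  color: var(--brand-text);
}

a {
  color: var(--brand-accent);
}
```

Changing a variable in one place then updates the whole site, which is exactly the kind of consistency a style guide is meant to guarantee. Voice and tone, of course, still have to be enforced editorially.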
Text-based content is all well and good, but video can be a valuable asset in both attracting new visitors and making your site more engaging. Data shows that information retention is significantly higher for visual material than it is for text, meaning that video marketing is an excellent way to grab – and hold – your audience’s attention, and boost traffic to your website at the same time.

People love reading about results. That's because it's one of the best ways to learn: you can read information all day, but results show you how that information applies in practice. Create content showing real-life results. It's easy in my industry because results are all that matter, but this can work in other industries as well. Here are some non-marketing examples:

If you create memorable content, people will want to come back for more. So instead of churning out lackluster content that can be found anywhere on the web, write higher-quality, unique content that caters directly to your audience. Share your opinion on a subject instead of just objectively providing facts. Create useful, thought-provoking content. Posting three so-so blog posts a week will not be nearly as effective as posting one superb blog post per week.


Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
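To make this concrete, here is a minimal robots.txt sketch (the /private/ directory is a made-up example). Note that the file itself must be publicly readable at yoursite.com/robots.txt, so it advertises exactly the paths you are trying to hide:

```
# robots.txt -- served from the root of the domain.
# Compliant crawlers will skip /private/, but the server still
# delivers those pages to anyone who requests them directly,
# and this file itself reveals that /private/ exists.
User-agent: *
Disallow: /private/
```

If material is genuinely confidential, protect it server-side (for example, with HTTP authentication); if you simply want a page kept out of search results, a noindex directive on the page itself is the appropriate tool.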
Wow Brian, you have solved my problem. A few days back I was looking for ways to increase traffic on my tech blog, and I found this blog post while searching for possible tricks to do that. I must say that a few of the tricks mentioned above really worked for me. For example, I updated some old posts on my blog, I tried the broken link building technique, and lastly I reposted my content on Medium.
Early versions of search algorithms relied on webmaster-provided information such as the keyword meta tag or index files in engines like ALIWEB. Meta tags provide a guide to each page's content. Using metadata to index pages was found to be less than reliable, however, because the webmaster's choice of keywords in the meta tag could potentially be an inaccurate representation of the site's actual content. Inaccurate, incomplete, and inconsistent data in meta tags could and did cause pages to rank for irrelevant searches.[10] Web content providers also manipulated some attributes within the HTML source of a page in an attempt to rank well in search engines.[11] By 1997, search engine designers recognized that webmasters were making efforts to rank well in their search engine, and that some webmasters were even manipulating their rankings in search results by stuffing pages with excessive or irrelevant keywords. Early search engines, such as AltaVista and Infoseek, adjusted their algorithms to prevent webmasters from manipulating rankings.[12]
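For reference, the keyword meta tag discussed above lives in a page's HTML head. A hypothetical example (the site name and keyword list are invented for illustration):

```html
<head>
  <title>Acme Widgets</title>
  <!-- Early engines trusted this list when indexing the page, which is
       why stuffing it with excessive or irrelevant terms could sway rankings. -->
  <meta name="keywords" content="widgets, cheap widgets, widget store">
  <meta name="description" content="Hand-made widgets, shipped worldwide.">
</head>
```

Modern search engines largely ignore the keywords meta tag for ranking, precisely because of the abuse described above.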