What blog posts are generating the most views? What subjects are most popular? And how can you create more, similar content? These are some of the questions you’ll want to ask yourself as you analyze your website data. Determine which pages people most often leave your site from (exit pages) and which pages they most often enter through (entry pages). For instance, if the majority of people are leaving your site after reaching the About page, that’s a pretty clear indication that something there needs to change.
Sites like Outbrain and Taboola are great for promoting your website or blog, as long as you have a sales funnel set up and a way to track the visitors who arrive from these platforms. For a fee, these networks will promote your content across thousands of similar websites around the internet. However, be sure to do your due diligence and test things out before diving in headfirst.
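If you’re measuring with Google Analytics, the usual way to attribute those visitors is to tag every promoted URL with UTM parameters. A minimal sketch; the domain, campaign name, and parameter values here are hypothetical:

```
https://example.com/landing-page/?utm_source=outbrain&utm_medium=paid_discovery&utm_campaign=spring_promo
```

Traffic arriving through a link like that shows up under its own source/medium in your analytics reports, so you can see exactly what each platform is sending you and whether it converts.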
I’ve just taken on the SEO role at my agency full-time and, whilst it can be difficult at times, I am enjoying the challenge. I wondered if you had any suggestions for finding “opportunity keywords” — terms/subjects that don’t necessarily have massive search volumes associated with them? I use a few tools and utilise Google’s related terms already, but wondered if there were any tricks for finding new markets?
There were some great tips in this article. I notice that many people make the mistake of cramming too many distracting images into the header and the sidebar, which can quickly turn people off the content. I particularly dislike Google ads anchored in the middle of a piece of text. I understand that people want to earn revenue from ads, but there are right ways and wrong ways of going about it. The writing is the important part of the content; why bury it under a load of conflicting media in the margins?
Black hat SEO attempts to improve rankings in ways that are disapproved of by the search engines or that involve deception. One black hat technique uses hidden text, either as text colored similarly to the background, placed in an invisible div, or positioned off screen. Another method, known as cloaking, serves a different page depending on whether the request comes from a human visitor or a search engine. A third category sometimes used is grey hat SEO: an approach in between black hat and white hat, where the methods employed avoid getting the site penalized but do not produce the best content for users. Grey hat SEO is focused entirely on improving search engine rankings.
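To make the hidden-text technique concrete, here is a hypothetical HTML snippet showing the three variants described above; it’s included so you can recognize the pattern (and avoid it), not replicate it:

```html
<!-- Three classic hidden-text tricks that search engines penalize: -->
<!-- 1. Text colored to match the background -->
<p style="color:#fff; background-color:#fff;">cheap widgets best widgets buy widgets</p>
<!-- 2. An invisible div -->
<div style="display:none;">cheap widgets best widgets buy widgets</div>
<!-- 3. Text positioned far off screen -->
<div style="position:absolute; left:-9999px;">cheap widgets best widgets buy widgets</div>
```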
You may not want certain pages of your site crawled because they might not be useful to users if found in a search engine's search results. If you do want to prevent search engines from crawling your pages, Google Search Console has a friendly robots.txt generator to help you create this file. Note that if your site uses subdomains and you wish to have certain pages not crawled on a particular subdomain, you'll have to create a separate robots.txt file for that subdomain. For more information on robots.txt, we suggest the Webmaster Help Center guide on using robots.txt files.
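As a minimal sketch, a robots.txt file is just a plain-text file placed at the root of each (sub)domain; the disallowed paths below are hypothetical placeholders:

```
# robots.txt for https://example.com/ (hypothetical paths)
User-agent: *
Disallow: /admin/
Disallow: /internal-search/

Sitemap: https://example.com/sitemap.xml
```

Keep in mind that robots.txt only asks well-behaved crawlers not to fetch those URLs; it does not keep a page out of the index if other sites link to it, and it is no substitute for real access control.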
Expertise and authoritativeness of a site increase its quality. Be sure that the content on your site is created or edited by people with expertise in the topic. For example, citing expert or experienced sources helps users recognize an article’s expertise. On scientific topics, representing the well-established consensus, where such a consensus exists, is good practice.
Well, the age of print media is coming to a close. But there’s no reason why some enterprising blogger couldn’t use the same tactic to get new subscribers. Let’s say you have a lifestyle blog targeting people in San Francisco. You could promote the giveaway through local media, posters, and many other tactics (we’ll get into these methods shortly).
By 2004, search engines had incorporated a wide range of undisclosed factors in their ranking algorithms to reduce the impact of link manipulation. In June 2007, The New York Times' Saul Hansell reported that Google ranks sites using more than 200 different signals. The leading search engines, Google, Bing, and Yahoo!, do not disclose the algorithms they use to rank pages. Some SEO practitioners have studied different approaches to search engine optimization and have shared their personal opinions. Patents related to search engines can provide information for better understanding them. In 2005, Google began personalizing search results for each user: depending on their history of previous searches, Google crafted results for logged-in users.
Make data-driven decisions when optimizing your site - It is never smart to invest in optimizing your website based on hunches and guesses. Traffic insight is only a click away with powerful data sources like Google Analytics and visitor-tracking software. However, don’t treat this information like an autopsy; use it to make changes and to find the story behind your visitors’ experience on your site. Where are people dropping off the most? What do the trends say about your site’s usability? What hypotheses can you form from average visit times and heat maps? Test each hypothesis with A/B testing first, to see whether you can convert those visitors, and only then roll out the changes the testing showed work with your customers.
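When it’s time to judge an A/B test, a standard two-proportion z-test will tell you whether the difference in conversion rates is likely real or just noise. A minimal sketch in Python; the visit and conversion counts below are hypothetical:

```python
# Two-proportion z-test for an A/B test's control vs. variant.
from math import sqrt, erf

def two_proportion_z_test(conv_a, visits_a, conv_b, visits_b):
    """Return the z-score and two-tailed p-value for the difference
    between two conversion rates."""
    p_a = conv_a / visits_a
    p_b = conv_b / visits_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (visits_a + visits_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / visits_a + 1 / visits_b))
    z = (p_b - p_a) / se
    # Two-tailed p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical numbers: control converted 120/4800, variant 160/4750
z, p = two_proportion_z_test(120, 4800, 160, 4750)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 suggests the variant's lift is real
```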
Content-delivery networks (aka CDNs) are a great way of speeding up page delivery across the world. Google and other search engines care about the speed of your site and page content. Use Amazon's AWS, MaxCDN, or any number of other services to leverage a CDN, along with browser-caching tools like W3 Total Cache, WP Super Cache, and others.
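Browser caching itself comes down to HTTP response headers. As a minimal sketch of the kind of Apache configuration plugins like W3 Total Cache generate (the lifetimes here are hypothetical and should match how often your assets actually change), rules like these tell browsers to reuse static files instead of re-downloading them:

```apacheconf
# Sketch of browser-caching rules in an Apache .htaccess file.
<IfModule mod_expires.c>
  ExpiresActive On
  # Images rarely change once published
  ExpiresByType image/jpeg "access plus 1 year"
  ExpiresByType image/png  "access plus 1 year"
  # CSS and JavaScript tend to change more often
  ExpiresByType text/css   "access plus 1 month"
  ExpiresByType application/javascript "access plus 1 month"
</IfModule>
```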
In December 2009, Google announced it would be using the web search history of all its users to populate search results. On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts, and other content much sooner after publication, Caffeine was a change to the way Google updated its index, making new content show up in results more quickly. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..." Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.
The leading search engines, such as Google, Bing, and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search-engine-indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review. In addition to its URL submission console, Google offers Google Search Console, for which an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links. Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click; however, this practice was discontinued in 2009.
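For reference, an XML Sitemap is just a list of URLs in a simple schema. A minimal sketch; the URLs and date are hypothetical placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/deep-page-with-no-inbound-links/</loc>
  </url>
</urlset>
```

Save it as sitemap.xml at the site root and submit its URL in Search Console so the crawler can find pages your internal linking misses.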
YouTube is a great resource for driving free organic traffic to your website. Google owns YouTube, and YouTube is the second most popular search engine in the world, so gaining exposure there can be huge. Create useful tutorials and videos that add an immense amount of value, and be sure to link to your content in the description.