If you create content that people enjoy, it can easily become popular or even go viral. The important thing is to put your website and content in front of people who are looking for it, right? Social bookmarking is a super easy way to do just that. Social bookmarking sites let users bookmark their favorite websites so that other people can publicly view them and vote them up or down. If you bookmark useful content, other people will find it, share it, and vote it up so others can enjoy it. Oh yeah, and it only takes about 30 seconds to bookmark your site. The 3 most popular social bookmarking sites are Digg, Reddit, and Delicious. These 3 sites get over 8 MILLION unique visitors a month – funneling off a chunk of that traffic to your website is very doable. (There’s plenty to go around.) Just remember to create content that people will enjoy and/or find useful. The most popular content on social bookmarking sites is usually checklists, “Top 10” lists, tools & resources, and breaking news – so keep that in mind!
A variety of methods can increase the prominence of a webpage within the search results. Cross-linking between pages of the same website to provide more links to important pages may improve its visibility. Writing content that includes frequently searched keyword phrases, so as to be relevant to a wide variety of search queries, will tend to increase traffic. Updating content so as to keep search engines crawling back frequently can give additional weight to a site. Adding relevant keywords to a web page's metadata, including the title tag and meta description, will tend to improve the relevancy of a site's search listings, thus increasing traffic. URL canonicalization of web pages accessible via multiple URLs, using the canonical link element or 301 redirects, can help make sure links to different versions of the URL all count towards the page's link popularity score.
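As a minimal sketch of the canonical link element mentioned above (the domain and paths here are placeholders, not from any real site):

```html
<!-- Placed in the <head> of every duplicate URL variant,
     e.g. https://example.com/shoes?sort=price or
     https://example.com/shoes?ref=newsletter -->
<link rel="canonical" href="https://example.com/shoes">
```

With this in place, link popularity earned by any of the variant URLs is consolidated onto the one canonical address; a 301 redirect achieves the same consolidation when the duplicate URLs don't need to remain accessible.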
The problem that most people face isn't how they can set up a website or even start a blog; it's how they can actually drive traffic to that digital destination floating about in the bits and bytes of cyberspace. If you're not a seasoned digital sleuth yourself, you've likely struggled with getting the proverbial word out through a variety of forms of online marketing.
However, the more organized you are, and the better you've presented your offer at the outset, the more likely you'll be to succeed with any one of these traffic methods or strategies. So, how do you track all of your efforts to ensure that you're doing your best to understand where your visitors are coming from when it comes to driving traffic to your website?
To avoid undesirable content in the search indexes, webmasters can instruct spiders not to crawl certain files or directories through the standard robots.txt file in the root directory of the domain. Additionally, a page can be explicitly excluded from a search engine's database by using a meta tag specific to robots (usually <meta name="robots" content="noindex">). When a search engine visits a site, the robots.txt located in the root directory is the first file crawled. The robots.txt file is then parsed and will instruct the robot as to which pages are not to be crawled. As a search engine crawler may keep a cached copy of this file, it may on occasion crawl pages a webmaster does not wish crawled. Pages typically prevented from being crawled include login-specific pages such as shopping carts and user-specific content such as search results from internal searches. In March 2007, Google warned webmasters that they should prevent indexing of internal search results because those pages are considered search spam.
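A minimal robots.txt along these lines might look as follows (the directory paths are hypothetical examples, not a recommendation for any particular site):

```text
# https://example.com/robots.txt
# Applies to all crawlers
User-agent: *

# Keep user-specific pages out of the crawl
Disallow: /cart/
Disallow: /account/

# Internal search results, per Google's 2007 guidance
Disallow: /search/
```

Note that Disallow only discourages crawling; a page blocked this way can still appear in an index if other sites link to it, which is why the noindex robots meta tag on the page itself is the stronger exclusion.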
This way, when you do drive traffic, you know where that traffic is coming from. Otherwise, you're left in the dark. For example, if you do some content marketing on Quora.com or Medium.com, you could use the campaign source as simply Quora or Medium and the campaign medium as content_marketing and the term as the term you're working to rank for. Get the picture? Then, you'll see all the beautiful results directly in Google Analytics and you'll know specifically where your traffic came from.
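The UTM parameters described above are just query-string fields appended to the destination URL. As a sketch (the URL, campaign name, and helper function are made up for illustration):

```python
from urllib.parse import urlencode

def tag_url(base_url, source, medium, campaign, term=None):
    """Append Google Analytics UTM campaign parameters to a landing-page URL."""
    params = {
        "utm_source": source,      # e.g. "quora" or "medium"
        "utm_medium": medium,      # e.g. "content_marketing"
        "utm_campaign": campaign,  # a name you choose for this effort
    }
    if term:
        params["utm_term"] = term  # the phrase you're working to rank for
    return base_url + "?" + urlencode(params)

url = tag_url("https://example.com/post", "quora", "content_marketing",
              "spring_push", term="drive traffic")
print(url)
```

Share the tagged URL instead of the bare one, and each visit shows up in Google Analytics under that source, medium, and term.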
Structured data is code that you can add to your site's pages to describe your content to search engines, so they can better understand what's on your pages. Search engines can use this understanding to display your content in useful (and eye-catching!) ways in search results. That, in turn, can help you attract just the right kind of customers for your business.
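One common form of structured data is a JSON-LD block using the schema.org vocabulary; a minimal sketch, with the business name and details invented for illustration:

```html
<!-- Goes anywhere in the page's <head> or <body> -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example Bakery",
  "url": "https://example.com",
  "telephone": "+1-555-0100"
}
</script>
```

Markup like this is what lets search engines show enhanced listings (business details, ratings, and similar) instead of a plain text snippet.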
Promoting your websites by publishing articles to various article directories is by no means a new idea, but it is still an extremely effective way to drive traffic. If you write content and publish it to websites like Article Base and Article Dashboard, website owners will pick it up and post it. This idea is similar to guest blogging, except that you only have to write one piece of content, which can end up on hundreds or even thousands of blogs and websites. The same rule applies here: don’t be boring – be creative and interesting, and use common keywords in your article and title so website owners can find it!
The leading search engines, such as Google, Bing and Yahoo!, use crawlers to find pages for their algorithmic search results. Pages that are linked from other search engine indexed pages do not need to be submitted because they are found automatically. The Yahoo! Directory and DMOZ, two major directories which closed in 2014 and 2017 respectively, both required manual submission and human editorial review. Google offers Google Search Console, in addition to its URL submission console; an XML Sitemap feed can be created and submitted for free to ensure that all pages are found, especially pages that are not discoverable by automatically following links. Yahoo! formerly operated a paid submission service that guaranteed crawling for a cost per click; however, this practice was discontinued in 2009.
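An XML Sitemap feed of the kind submitted through Search Console follows the sitemaps.org protocol; a minimal sketch with placeholder URLs:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2020-01-15</lastmod>
  </url>
  <!-- A page with no inbound links that crawlers would otherwise miss -->
  <url>
    <loc>https://example.com/orphan-page</loc>
  </url>
</urlset>
```

The file is typically served from the site root (e.g. /sitemap.xml) and its location submitted once in Search Console; search engines then re-fetch it on their own schedule.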
Provide full functionality on all devices. Mobile users expect the same functionality - such as commenting and check-out - and content on mobile as well as on all other devices that your website supports. In addition to textual content, make sure that all important images and videos are embedded and accessible on mobile devices. For search engines, provide all structured data and other metadata - such as titles, descriptions, link-elements, and other meta-tags - on all versions of the pages.
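For the per-page metadata side of this, two widely documented patterns are the viewport tag (for responsive pages) and paired link annotations when mobile content lives at separate URLs; a sketch, with placeholder domains:

```html
<!-- On a responsive page: tell browsers the layout adapts to the device -->
<meta name="viewport" content="width=device-width, initial-scale=1">

<!-- If mobile pages live at a separate m. domain:
     on the desktop page, point to the mobile version -->
<link rel="alternate" media="only screen and (max-width: 640px)"
      href="https://m.example.com/page">
<!-- and on the mobile page, point back to the desktop version -->
<link rel="canonical" href="https://www.example.com/page">
```

The bidirectional annotations tell search engines the two URLs are the same content for different devices, so ranking signals aren't split between them.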
I recently came across a case study where a newly founded business was able to climb high in the search results without building links. Their strategy was to create a buzz around their main business term across social media platforms; they got people talking about it as well as visiting their respective pages. In 2019, do we need to look at various means through which we can improve our rankings without building links? What are your thoughts? I am sure RankBrain also listens to what people say about a brand or business rather than just identifying it through backlinks.
Great article, learned a lot from it! But I still don’t really get the share trigger and right content. For instance, the influencers now care a lot about the new Koenigsegg Agera RS >> https://koenigsegg.com/blog/ (Car). I thought about an article like “10 things you need to know about the Koenigsegg Agera RS”. The only problem is that I don’t know which keywords I should use and how I can put in share triggers.
Nearly every social media site has a feature to tell the world about your business. On Facebook you can create your page and promote it; the more people like it, the more visitors you get to your website. Nowadays, getting likes on Facebook is really easy because lots of users, even your friends, are on these social media sites all the time. That’s the best method to bring in more visitors.
Just ridiculously good as usual Brian, you continue to set the bar higher and higher each time I see a new post from you, well done. A quick point regarding point 16 about Google only counting the first anchor to a page: what is your opinion about links that go to DIFFERENT pages on the same site? I believe they pass equal weighting, but it would be good to get your opinion.
Very good tips on traffic generation. However, for those with time constraints, building backlinks is a quick method, along with another easy one: blog commenting. I would second most of the people who commented in support of guest posting. Yahoo Answers may get mixed responses depending on the topic. If you have time, you can also post on related forums and try video marketing.
Today, if you don't understand SEO, you're doing yourself a disservice. Discover the nuances about SEO so that you're engaging in the right type of traffic delivery strategies. You don't want to bend or break the rules. Plus, by really having an understanding of SEO, you could quite literally supercharge your results. Find a good course or audiobook about SEO and learn like the wind.
Search engines find and catalog web pages through spidering (also known as web crawling) software. Spidering software "crawls" through the internet and grabs information from websites, which is used to build search engine indexes. Unfortunately, not all search engine spidering software works the same way, so what gives a page a high ranking on one search engine may not necessarily give it a high ranking on another. Note that rather than waiting for a search engine to discover a newly created page, web designers can submit the page directly to search engines for cataloging.
In December 2009, Google announced it would be using the web search history of all its users in order to populate search results. On June 8, 2010, a new web indexing system called Google Caffeine was announced. Designed to allow users to find news results, forum posts, and other content much sooner after publishing than before, Google Caffeine was a change to the way Google updated its index in order to make things show up quicker on Google than before. According to Carrie Grimes, the software engineer who announced Caffeine for Google, "Caffeine provides 50 percent fresher results for web searches than our last index..." Google Instant, real-time search, was introduced in late 2010 in an attempt to make search results more timely and relevant. Historically, site administrators have spent months or even years optimizing a website to increase search rankings. With the growth in popularity of social media sites and blogs, the leading engines made changes to their algorithms to allow fresh content to rank quickly within the search results.