Yep, and sometimes it's just about being a little creative. I've started a little blog on SEO/WordPress just for fun actually… no great content on it like here, though… but because the competition is so tough in these niches, I decided to take another approach. I created a few WordPress plugins that users can download for free from wordpress.org… and of course these link to my site, so it gets me visitors every day.
Brian, I recently found your blog by following OKDork.com. Just want to say you’re really amazing with the content you put out here. It’s so helpful, especially for someone like me who is just starting out. I’m currently writing posts for a blog I plan to launch later this year. I think my niche is a little too broad and I have to figure out how to narrow it down. I essentially want to write about my current journey of overcoming my fears to start accomplishing the dreams I have for blogging, business, and travel. In doing so, I will share the best tips, tools, and tactics I can find, as well as what worked, what didn’t, and why.
This way, when you do drive traffic, you know exactly where it's coming from; otherwise, you're left in the dark. For example, if you do some content marketing on Quora.com or Medium.com, you could set the campaign source to simply Quora or Medium, the campaign medium to content_marketing, and the campaign term to the keyword you're working to rank for. Get the picture? Then all the results show up directly in Google Analytics, and you'll know specifically where your traffic came from.
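If you want to build those tagged links without a web form, here's a minimal Python sketch. The `tag_url` helper name and the example.com URL are just placeholders for illustration; the `utm_source`, `utm_medium`, and `utm_term` parameters are the standard ones Google Analytics reads for campaign source, medium, and term:

```python
from urllib.parse import urlencode

def tag_url(base_url, source, medium, term=None, campaign=None):
    """Append UTM parameters so Google Analytics can attribute the visit."""
    params = {"utm_source": source, "utm_medium": medium}
    if term:
        params["utm_term"] = term
    if campaign:
        params["utm_campaign"] = campaign
    separator = "&" if "?" in base_url else "?"
    return base_url + separator + urlencode(params)

# Example: a link you might drop into a Quora answer
print(tag_url("https://example.com/guide", "quora", "content_marketing", term="seo tips"))
# -> https://example.com/guide?utm_source=quora&utm_medium=content_marketing&utm_term=seo+tips
```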
I’ve just taken on the SEO role at my agency full time and, whilst it can be difficult at times, I am enjoying the challenge. I wonder if you have any suggestions when it comes to finding “opportunity keywords” for terms/subjects that don’t necessarily have massive search volumes associated with them? I use a few tools and utilise Google’s related terms already, but wondered if there were any tricks for finding new markets?
Users will occasionally come to a page that doesn't exist on your site, either by following a broken link or typing in the wrong URL. Having a custom 404 page that kindly guides users back to a working page on your site can greatly improve a user's experience. Your 404 page should probably have a link back to your root page and could also provide links to popular or related content on your site. You can use Google Search Console to find the sources of URLs causing "not found" errors.
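How you wire this up depends on your stack. As one hedged illustration, here's a minimal Flask sketch (Flask and the `/popular` link are assumptions for the example, not part of the guidance above) that serves a friendly page while still returning the 404 status code, so search engines don't treat the error page as real content:

```python
from flask import Flask

app = Flask(__name__)

@app.errorhandler(404)
def page_not_found(error):
    # Serve a friendly page instead of the server default, but keep the 404
    # status code; returning 200 here would create a "soft 404" that search
    # engines might index.
    html = (
        "<h1>Page not found</h1>"
        '<p>Try the <a href="/">home page</a> or browse our '
        '<a href="/popular">popular posts</a>.</p>'  # /popular is hypothetical
    )
    return html, 404

if __name__ == "__main__":
    app.run()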
I first heard you talk about your techniques on Pat Flynn’s podcast. Must admit, I’ve been lurking a little ever since… not sure if I wanted to jump into these exercises or just dance around the edges. The clever and interesting angles you describe here took me all afternoon to get through and wrap my brain around. It’s a TON of information. I can’t believe this is free for us to devour! Thank you!! Talk about positioning yourself as THE expert! Deep bow.
Webmasters and content providers began optimizing websites for search engines in the mid-1990s, as the first search engines were cataloging the early Web. Initially, all webmasters only needed to submit the address of a page, or URL, to the various engines which would send a "spider" to "crawl" that page, extract links to other pages from it, and return information found on the page to be indexed.[5] The process involves a search engine spider downloading a page and storing it on the search engine's own server. A second program, known as an indexer, extracts information about the page, such as the words it contains, where they are located, and any weight for specific words, as well as all links the page contains. All of this information is then placed into a scheduler for crawling at a later date.
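In toy form, that spider/indexer pipeline looks roughly like the Python sketch below. This is only a standard-library illustration of the two steps described above (download a page, then extract its links for later scheduling); real engines are vastly more elaborate, and the example.com URL is just a placeholder:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class LinkExtractor(HTMLParser):
    """Collects href attributes, the way an indexer extracts a page's links."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(url):
    # The "spider": download the page and store its contents.
    html = urlopen(url).read().decode("utf-8", errors="replace")
    # The "indexer": extract the links the page contains.
    parser = LinkExtractor()
    parser.feed(html)
    return parser.links  # in a real engine these would feed the crawl scheduler

print(crawl("https://example.com/"))
```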
Robots.txt is not an appropriate or effective way of blocking sensitive or confidential material. It only instructs well-behaved crawlers that the pages are not for them, but it does not prevent your server from delivering those pages to a browser that requests them. One reason is that search engines could still reference the URLs you block (showing just the URL, no title or snippet) if there happen to be links to those URLs somewhere on the Internet (like referrer logs). Also, non-compliant or rogue search engines that don't acknowledge the Robots Exclusion Standard could disobey the instructions of your robots.txt. Finally, a curious user could examine the directories or subdirectories in your robots.txt file and guess the URL of the content that you don't want seen.
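You can see how advisory robots.txt is from Python's standard library: `urllib.robotparser` tells a well-behaved crawler what it may fetch, but nothing actually blocks the HTTP request itself. A small sketch (the example.com URLs and the /private/ path are hypothetical):

```python
from urllib.robotparser import RobotFileParser

# Fetch and parse the site's robots.txt
rp = RobotFileParser("https://example.com/robots.txt")
rp.read()

url = "https://example.com/private/report.html"
if rp.can_fetch("MyCrawler", url):
    print("robots.txt allows this fetch")
else:
    # A polite crawler stops here -- but this is purely voluntary.
    # urllib.request.urlopen(url) would still succeed, because the server
    # delivers the page to anyone who asks for it.
    print("Disallowed by robots.txt, yet the page is still reachable")
```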
However, I feel that batching all the things influencers share, filtering what's relevant from what's not, and ultimately niching it down to identify which exact type of content is hot in order to build our own is a bit fuzzy. Influencers share SO MUCH content on a daily basis – how exactly do you identify the topic base you'll use to build great content that is guaranteed to be shared?
Do not be fooled by those traffic sellers promising thousands of hits an hour. What they really do is load up your URL in a program, along with a list of proxies, and then run the program for a few hours. It looks like someone is on your site because your logs show visitors from thousands of different IPs, but in reality your website is just being pinged through the proxies; no one ever actually sees your site. It is a waste of money.