CAP Euro was incredible. Do your best to get to Amsterdam. I think it's going to be absolutely huge.
I've had a few requests for this, and I have a session covering it in Amsterdam, but it's big for affiliates – especially the geotargeting covered in Tip 9 – so I'm posting it now to give you a heads-up and a bit of a head start.
If you have any questions regarding any of the content put them here. I’ll answer them all.
Geotargeting is huge, everyone – all the big players are talking to me about it as well. You have a great advantage in that you can incorporate this right now.
Hope you find something useful.
Your website is looking good and has all the things you've been told it needed: great titles, H1-H3 tags, meta tags, good unique content and easy navigation.
So why aren’t you getting the traffic you deserve?
These tips are for beginner to advanced SEOs. I have many other good tips in the 2007 and 2008 versions of Top Tips, and I struggled to come up with 12 more unique tips that are also worthy of being included in anything called "Top Tips", but I think I have found them.
I've laid these tips out differently than in the past. Rather than spit out 12 random tips, I have arranged them into a step-by-step strategy that almost anyone with a bit of knowledge can implement. The first part covers getting your website in shape, and the latter half gives you strategies to take advantage of the work you have done.
1. Make Sure Your Website Is Ready for SEO
Most website owners have succumbed to the realisation that, one way or another, they are going to have to start injecting content into their website to have any chance at a top 10 ranking. Many site owners set about creating or buying content to add to various sections of their website in an attempt to pacify the search engines. In most cases it lasted a few weeks and was abandoned, because it really takes a lot to create good content on a daily basis. Unfortunately the road to Google Hell (not ranking) is paved with good intentions. Unless you were smart and integrated WordPress, Joomla or some other content-management application early on, you may have pages all over the place, which means you may also have internal navigation issues that cause problems for the robots when they try to crawl your website. If the bots cannot find your pages, or get stuck in a loop as a result of poor navigation, then any work could potentially be a waste of your time and money.
Run Xenu (a free link-checking tool) on your website to be sure this isn't the case. It will give you a detailed overview of your website's navigation and show whether you have any problems with your internal linking structure. It will also notify you of any redirects or errors in linking, particularly any 302 redirects or 404 pages being returned anywhere on your website. You want to identify any adverse navigation problems that it finds. 302 redirects, for instance, are frowned upon by Google but are often left in place whilst working on a site. If you are getting 404s, you'll be able to identify and fix them.
Since we are getting the site healthy to maximise the effect of the other 11 steps, let's go to Webmaster Tools and check out your overview. It does much of what Xenu does, but it shows you what Google in particular is seeing. The image below is an overview from Google Webmaster Central. The various errors are problems and need to be fixed.
[Image: crawl errors overview in Google Webmaster Tools]
Not found errors are often pages that have been moved, replaced or even renamed. You may have internal, or even worse, external links pointing to these pages, and those linking pages could have a significant impact on your rankings.
URLs that timed out can have many causes, but whatever they are they need to be resolved. If the Googlebot can't get to a page, then anything on that page, and any pages beyond it, may not be indexed. In that case Google will rely on cached versions or whatever it was able to crawl the last time it reached the page. This can cause ranking problems because it's stale content, or content that cannot be verified as being relevant to the backlinks pointing at the page.
Unreachable URLs are internal and external links pointing at a page within your website that cannot be reached. Someone was nice enough to put up these links for you, but you moved the page or they misspelled the URL. You could contact the site owner and have it fixed; create a page relevant to the anchor text used; or put a permanent 301 redirect in place so that any PageRank or authority is passed on to the new destination page. I have seen websites with thousands of these errors after re-launching their website or rewriting URLs. This could be huge for some of you!
Once you have fixed these issues, create a new XML sitemap and manually submit it through Webmaster Central.
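If you'd rather script a quick check yourself, here's a rough Python sketch of the kind of crawl Xenu does: it walks your internal links and flags anything that answers with a 301, 302 or 404. It assumes the third-party requests library, and the start URL is just a placeholder, so treat it as a spot check rather than a replacement for Xenu or Webmaster Tools.

# Spot-check internal links for 301/302 redirects and 404s (Xenu-style).
# Assumes the `requests` library; START_URL is a placeholder.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse

import requests

START_URL = "http://www.yoursite.com/"  # hypothetical site


class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url, limit=200):
    domain = urlparse(start_url).netloc
    queue, seen, problems = [start_url], set(), []
    while queue and len(seen) < limit:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            resp = requests.get(url, allow_redirects=False, timeout=10)
        except requests.RequestException as exc:
            problems.append((url, str(exc)))
            continue
        if resp.status_code in (301, 302, 404):
            problems.append((url, resp.status_code))
        if resp.status_code == 200 and "text/html" in resp.headers.get("Content-Type", ""):
            parser = LinkExtractor()
            parser.feed(resp.text)
            for href in parser.links:
                absolute = urljoin(url, href).split("#")[0]
                if urlparse(absolute).netloc == domain:
                    queue.append(absolute)
    return problems


if __name__ == "__main__":
    for url, status in crawl(START_URL):
        print(status, url)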
2. Check your internal canonicalization
Websites can have more than one URL (e.g. http://www.bluewidgets.com and http://www.bluewidgets.co.uk), and if you have been around for a while and people are linking to you, they could be linking to either one. By designating a primary version you consolidate 100% of that benefit in one place. Go to Google's Webmaster Central and, in the tools section, designate one as your primary. Do a 301 redirect from the non-primary version to pass on any backlink juice, PR and authority it has to the primary. Your internal navigation needs to be checked to be sure all links go to the correct version as well. Bad navigation is common, especially with websites constantly being populated or worked on by many individuals. Pick one version and use it throughout.
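A quick way to sanity-check the redirect once you have picked your primary: request the other version and make sure it answers with a 301 pointing at the one you chose. This small Python sketch assumes the requests library; the bluewidgets URLs are just the example from above.

# Confirm the non-primary hostname 301s to the primary version.
# Assumes the `requests` library; the URLs are placeholders.
import requests

PRIMARY = "http://www.bluewidgets.com/"   # the version you designated
VARIANTS = ["http://bluewidgets.com/"]    # everything else should 301 to it

for url in VARIANTS:
    resp = requests.get(url, allow_redirects=False, timeout=10)
    location = resp.headers.get("Location", "")
    if resp.status_code == 301 and location.rstrip("/") == PRIMARY.rstrip("/"):
        print("OK:", url, "-> 301 ->", location)
    else:
        print("CHECK:", url, "returned", resp.status_code, location or "(no Location header)")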
3. Re-structure Your Site Navigation Using the Silo Method
Unless you planned your website correctly from the start, or went through the process I mentioned above, and you have been adding pages here and there, you have probably segmented your site into broad sections. The problem is that there is no real hierarchy, and you are diluting valuable link juice.
An example of this is a bingo website trying to rank for 75-ball, 80-ball and 90-ball bingo games. The goal is to be recognised as an authority on each of these games so that you rank high when they are searched for individually. Otherwise you run the risk of Google clustering the pages together and only ranking you well for one of the games.
The silo method of segmenting your website is by far the best way to fix this common problem. It will maximise the functionality and end-user experience and also go a long way towards improving your ranking in the search results. A silo is a vertical page-linking design: you have your landing page, or your main page, at the top of the silo, and underneath it you have pages which support your main landing page's theme. (If you are building a new site I recommend Drupal, as it has built-in functionality to accomplish this.)
Here is a generic example of a silo. This example shows a clear path that either the end-user or a robot can follow.
[Image: a generic example of a silo structure]
There are several ways to create silos:
1) Tagging
2) Create categories
3) Create directories
4) Install a related pages plugin
5) ONLY link to landing pages using your target keyword/phrase.
6) Create a mini-sitemap on each page (see the sketch after the expected results below).
Expected Results:
1) Higher rankings
2) Increase in overall traffic
3) More unique visitors
4) More traffic from long tail keywords.
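To give you an idea of what I mean by a mini-sitemap, here is a small Python sketch that prints the block of links to drop into every page of a silo, so each supporting page links to its siblings and back to the landing page with keyword anchor text. The bingo silo data is made up; swap in your own pages and anchors.

# Generate a "mini-sitemap" link block for each silo.
# The silo data below is hypothetical.
SILOS = {
    "75 ball bingo": {
        "landing": "/75-ball-bingo/",
        "pages": {
            "/75-ball-bingo/rules/": "75 ball bingo rules",
            "/75-ball-bingo/strategy/": "75 ball bingo strategy",
            "/75-ball-bingo/free-games/": "free 75 ball bingo games",
        },
    },
}


def mini_sitemap(silo_keyword, silo):
    """Return the HTML link block to paste into each page of the silo."""
    links = ['<a href="%s">%s</a>' % (silo["landing"], silo_keyword)]
    for url, anchor in silo["pages"].items():
        links.append('<a href="%s">%s</a>' % (url, anchor))
    return "<ul>\n" + "\n".join("  <li>%s</li>" % link for link in links) + "\n</ul>"


for keyword, silo in SILOS.items():
    print(mini_sitemap(keyword, silo))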
Now that you have finished the housecleaning on your site in the first few steps, you are ready to start building unique content and implementing SEO techniques.
4. Start Identifying and Targeting Longer Keyword Phrases with PPC
There are reports that say up to 60% of quality/converting traffic comes from niche or longtail search terms. Search queries 3-6+ words long are increasing as end-users learn more about targeted searching. Start a PPC campaign to identify longtail terms that you should be targeting. It's very cheap at .01-.05p a visitor. Do this on all three search engines, because the landscape differs from one to the other, as do the techniques used to optimise for each. You can spend as little as a hundred Euros and get a ton of invaluable information.
Here are a few tricks to maximise your return using PPC:
Get your logfiles from your host, or use an analysis tool or software like Hittail. These tools let you see which terms and which search engine directed the searcher to your website (Hittail does it live) and, more importantly, which page the search engine identified as relevant enough to show up in the results.
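If you want to dig through the raw logfiles yourself rather than rely on a tool, something like this Python sketch will pull the search phrases and landing pages out of the referrer field. It assumes the common 'combined' log format and a file called access.log; adjust both for your own setup. (At the time of writing the engines still pass the query in the referrer.)

# Mine an access log for the search phrases and landing pages that
# brought visitors in. Assumes the "combined" log format; LOGFILE is
# a placeholder path.
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

LOGFILE = "access.log"  # hypothetical path

QUOTED = re.compile(r'"([^"]*)"')  # request line, referrer, user-agent
ENGINES = ("google.", "search.yahoo.", "search.msn.", "bing.")

terms = Counter()
with open(LOGFILE, encoding="utf-8", errors="replace") as fh:
    for line in fh:
        fields = QUOTED.findall(line)
        if len(fields) < 2:
            continue
        request, referrer = fields[0], fields[1]
        host = urlparse(referrer).netloc
        if not any(engine in host for engine in ENGINES):
            continue
        query = parse_qs(urlparse(referrer).query)
        # Google and MSN use q=, Yahoo uses p=
        phrase = (query.get("q") or query.get("p") or [""])[0].strip().lower()
        if phrase:
            landing_page = request.split(" ")[1] if " " in request else "?"
            terms[(phrase, landing_page)] += 1

for (phrase, page), hits in terms.most_common(25):
    print("%4d  %-40s %s" % (hits, phrase, page))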
When you set up the PPC campaign itself, you need to do it correctly. First off, be sure that you use existing pages. Google uses a portion of the organic algorithm to rate or "grade" each keyword, so the same optimisation techniques should be used in your PPC campaign. Use existing pages that are already ranking, have a few backlinks and are aged as well. If you need to, add a visible CTA (call to action) to an existing page that is showing PR.
Secondly, when you set up the account, all the settings should be put at their maximum (CPC and daily budget). Keep the campaign turned off so it doesn't cost you anything. Usually over the next 48 hours the campaign, ad group, keywords and targeted landing page will (nine times out of ten) be evaluated automatically, and provided you do not already have a poorly constructed campaign or any history for Google to go on, you will get a higher quality score, which means your ad will show higher in the PPC SERPs at a lower cost. (*Note – I have tested Google's automated grading of new campaigns in AdWords and have found that if you revive an old account, or even add to an existing one where the grade scoring was bad previously, any new campaign will inherit the poor grades. If this is the case you need to start an entirely new account with a new credit card and start from scratch.)
Next, create ad groups by top-level keyword (casino, slots, poker, bingo, etc.). Sort all of the related niche and longtail terms that you have collected from your logfiles or Hittail into each of these, because you will be adding more and more each day/week/month depending on the amount of traffic that you are getting.
When you are selecting the landing page, follow the same rules that you do when acquiring backlinks to the website: be sure the terms are relevant to the landing page.
Setup Google Analytics. It will give you a better look into your stats and identify the converting terms.
Once a week, go into your logfiles or Hittail data and add the new terms to the relevant ad group. This technique will not just give you a lot of data; it may even prove to be a worthwhile investment in its own right if the cost per conversion comes in under your commission.
You can go after the low-hanging fruit because it’s so cheap, and you can gather valuable data that can be used for building additional content.
5. Build New Pages Using Latent Semantic Analysis
Once you have identified the converting terms in your PPC campaigns, start to build unique pages for them with at least 300-350 words. The data that you get from your PPC campaigns, logfiles and/or Hittail is great for creating new content. Hittail is especially good for content ideas because you are able to see exactly what the searcher entered. I've seen query strings 6, 7, 8 and even 9 words long. (If you target foreign languages this can be especially helpful.)
The important element here is that each page needs to have unique content. This can be difficult when building pages for niche and longtail terms that are aligned with other primary or secondary terms. Let's say you are building individual pages for these terms:
1. casino
2. online casino
3. free online casinos
4. online casino with no download
5. free online casino with no download
6. UK casino with no download
And so on. How many different pages can you create good content for whilst keeping the pages individual?
Well, I can't help much with the actual writing process, but I can tell you how to be sure the pages are truly individual and how to keep them from being clustered. (Google has admitted that when it detects duplicate content, such as variations caused by URL parameters, it groups the duplicate URLs into clusters. It selects what it believes to be "the best" URL to represent the cluster in its search results, then consolidates properties of the URLs in the cluster, such as link popularity, onto the representative URL.)
There are a few simple on-page elements that will keep the pages separate and ranking for their own individual longtail terms. Use the target phrase/keyword:
1. For the page title
2. In the first or second paragraph of the text, and in bold
3. Elsewhere in the content, underlined or italicised
4. In the meta description
5. In an internal link’s anchor text
6. In an external link from another website, press release or Social Bookmark
*Important! When you are building a page that targets a specific term, be sure the internal and external links that point back to the page use keyword-specific anchor text. If the keyword that you are targeting has multiple variations that you wish to target (bingo > online bingo > free online bingo, etc.), or if you already have targeted pages in place, it is crucial that you do not dilute the link popularity. Internal links to internal pages are just as important as external links to internal pages. Build an individual page for each converting keyword phrase.
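Here is a rough Python sketch that checks the first five items in the list above for any page you have built (the external link in item 6 obviously cannot be checked on the page itself). It assumes the requests and BeautifulSoup libraries, and the URL and phrase are placeholders.

# Check whether a page uses the target phrase in its title, early body
# text, bold/italic markup, meta description and anchor text.
# Assumes `requests` and `beautifulsoup4`; URL and PHRASE are placeholders.
import requests
from bs4 import BeautifulSoup

URL = "http://www.yoursite.com/free-online-casino-no-download/"  # hypothetical
PHRASE = "free online casino with no download"

html = requests.get(URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")
phrase = PHRASE.lower()

title = (soup.title.get_text() if soup.title else "").lower()
meta = soup.find("meta", attrs={"name": "description"})
meta_text = (meta.get("content", "") if meta else "").lower()
paragraphs = [p.get_text(" ", strip=True).lower() for p in soup.find_all("p")]
emphasis = " ".join(t.get_text(" ", strip=True).lower()
                    for t in soup.find_all(["b", "strong", "i", "em", "u"]))
anchors = [a.get_text(" ", strip=True).lower() for a in soup.find_all("a")]

checks = {
    "1. page title": phrase in title,
    "2. first or second paragraph": any(phrase in p for p in paragraphs[:2]),
    "3. bold/italic/underline elsewhere": phrase in emphasis,
    "4. meta description": phrase in meta_text,
    "5. internal anchor text": any(phrase in a for a in anchors),
}
for check, passed in checks.items():
    print("PASS" if passed else "MISS", check)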
6. Use Nofollows to Link Sculpt PR (Page Rank)
PageRank is basically the level of trust and authority that Google has assigned to a website and to each individual page within the site. PR is based on historical information, sometimes months old, so check to see if the linking page has any links showing in Google for better insight. There are many tools with this functionality, but I use the Firefox browser with the SEO Quake plugin because it also shows Yahoo and MSN backlinks, social bookmarks, cache date and many other details that are important.
If you are linking from a page (whether internally or from an external website), PR becomes very important. This PR, on a scale of 0/10 up to 10/10, is the amount of available "link juice" that you have to play with within your website, or that is being divided up amongst the links on an external website that is linking back to you.
Trust and authority are determined by the number of backlinks from other websites, the relevancy of the content on your page, who else they are linking to, who else you are linking to, and many other attributes. The little PageRank bar (see image below – if you have the Google Toolbar installed) is somewhat of an indicator of a site's level of trust and authority as determined by Google, including your own website and the pages within it. Links from internal pages with PageRank can be sculpted in a way that boosts the amount of PR, and therefore the rankings, for the term or terms that the page is targeting within Google's search results.
[Image: the Google Toolbar PageRank bar]
There are two factors Google looks at: the PR being passed from external sources, and the PR being passed internally through links.
What we are going to use it for now is sculpting your internal Link Juice to help boost your rankings for these pages.
According to Wikipedia, the nofollow was intended to reduce the effectiveness of certain types of search engine spam, thereby improving the quality of search engine results and preventing spamdexing from occurring in the first place. Matt Cutts of Google and Jason Shellen from Blogger created it around 2005.
What it does is tell the search engines that you do not endorse the page you have linked to. It looks like this:

A normal link:

<a href="http://www.example.com/page.html">Your Link</a>

Changed to a nofollowed link:

<a href="http://www.example.com/page.html" rel="nofollow">Your Link</a>
But this has created a useful tool for search engine optimisation. We can nofollow the links in our website to preserve the PR and pass it on to the pages that we want to rank well. The ‘About Us’, ‘Contact Us’ and ‘Login Here’ pages/links are obviously not pages that we care to rank for. So by adding nofollows to these internal links, we can funnel more PR to the important pages.
[Images: internal PR flow without nofollows vs. with nofollows]
Another great way to use this tip is when you are creating new pages for niche phrases that you identified through your logfiles or Hittail. Build a doorway page off of your homepage; I use this as a 'holding' area for newly identified hot-list terms that I want to rank for quickly. Put nofollows on all of the template/navigational links, add relevant content to the doorway page using your newly found keyword, and anchor the keyword or keyword phrase (turn it into a link) to the new page that you created.
This will funnel all of the available link juice to the new page, targeting the new term, and you will see your page rise through the rankings.
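If you want to see where your link juice is actually going on a page, here is a small Python sketch that lists which links are followed and which are nofollowed. Again it assumes the requests and BeautifulSoup libraries, and the URL is a placeholder.

# List followed vs. nofollowed links on a page to audit PR sculpting.
# Assumes `requests` and `beautifulsoup4`; URL is a placeholder.
import requests
from bs4 import BeautifulSoup

URL = "http://www.yoursite.com/"  # hypothetical

soup = BeautifulSoup(requests.get(URL, timeout=10).text, "html.parser")
followed, nofollowed = [], []
for a in soup.find_all("a", href=True):
    rel = [value.lower() for value in (a.get("rel") or [])]
    (nofollowed if "nofollow" in rel else followed).append(a["href"])

print("Followed links (receiving link juice):")
for href in followed:
    print("  ", href)
print("Nofollowed links (About Us, Contact Us, Login, etc.):")
for href in nofollowed:
    print("  ", href)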
7. Test your landing pages and evaluate click-through rates
Google has admitted in the last few months that they do look at click-through rates in their rating process. I already knew Yahoo did this, so it was no surprise that Google does it as well; it just makes sense. Google takes information gathered from its toolbar and Google Analytics to use in its algorithm. Not much of this can realistically be proven, but it is something you should be doing anyway to identify problems or other issues.

A typical CTR is between 25-35%. Do A/B or multivariate testing to improve the CTR on your landing page; a PPC account is a great tool for this. The end goal is to get the user to click through to at least one additional page within your website.

Affiliates should pay particular attention to this technique, because most affiliate sites that I have seen either have banners and the like with CTAs (calls to action), or an immediate redirect to their provider's website. Neither of these is beneficial to your overall trust or authority rating. Google looks for traffic to stay on your site (at least past the front page), so I optimise landing pages to provoke click-throughs. There are countless ways to implement this. You can incentivise it or use some other technique, but the goal is to get visitors through to a secondary landing page before you send them to your provider. This technique may not be for everyone, but for those who can use it, I felt it was a top tip that, even if you can't try it out now, you can file away for another day.
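When you run an A/B test, don't just eyeball the two CTRs; make sure the difference is big enough to trust before you declare a winner. Here is a small Python sketch using a standard two-proportion z-test. The click and visitor numbers are made up for illustration.

# Compare the CTR of two landing page variants with a two-proportion z-test.
# The figures are illustrative only.
import math


def ctr_test(clicks_a, visitors_a, clicks_b, visitors_b):
    p_a, p_b = clicks_a / visitors_a, clicks_b / visitors_b
    pooled = (clicks_a + clicks_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    return p_a, p_b, (p_b - p_a) / se


p_a, p_b, z = ctr_test(clicks_a=290, visitors_a=1000, clicks_b=345, visitors_b=1000)
print("Variant A CTR: %.1f%%   Variant B CTR: %.1f%%   z = %.2f" % (p_a * 100, p_b * 100, z))
print("Significant at the 95% level:", abs(z) > 1.96)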
8. Identify Supplemental/Omitted Pages & Get Them Out
You may have hundreds or thousands of pages sitting in the supplemental or omitted results. This usually happens because you have duplicate or similar pages (as is often the case with affiliate websites), pages that have not been updated recently, URLs that Google has clustered, too little content, no backlinks or poor internal navigation.
Google has said that they have eliminated Supplemental Results. I believe this is because of the integration of Universal Search (a.k.a. Blended Search), which just happened to occur shortly before the change. Since aged or orphan pages could actually be documents, news articles, videos, blogs and forum posts that hold more valuable and relevant information than a new page, this "filter" needed to be changed to include all available resources.
The best way to identify these pages is to enter site:www.yourwebsite.com into the Google search bar and take note of the number of pages Google shows as indexed:
[Image: Google results for a site: query, showing the number of pages indexed]
This search shows Google can see 3,930 pages.
“Supplemental sites are part of Google’s auxiliary index. Google is able to place fewer restraints on sites that we crawl for this supplemental index than they do on sites that are crawled for the main index. For example, the number of parameters in a URL might exclude a site from being crawled for inclusion in the main index; however, it could still be crawled and added to Google’s supplemental index.
The index in which a site is included is completely automated; there’s no way for you to select or change the index in which your site appears. Please be assured that the index in which a site is included does not affect its PageRank.”
Nonsense!
At the time of this article, Google was already removing the supplemental label from its search results. Until recently, all you had to do was go to the last few pages of your query and locate the pages that had ' – Supplemental Result' just after the page size. They aren't showing these anymore. Here's what they had to say: "Since 2006, we've completely overhauled the system that crawls and indexes supplemental results. The current system provides deeper and more continuous indexing. Additionally, we are indexing URLs with more parameters and are continuing to place fewer restrictions on the sites we crawl. As a result, Supplemental Results are fresher and more comprehensive than ever. We're also working towards showing more Supplemental Results by ensuring that every query is able to search the supplemental index, and expect to roll this out over the course of the summer.
The distinction between the main and the supplemental index is therefore continuing to narrow. Given all the progress that we’ve been able to make so far, and thinking ahead to future improvements, we’ve decided to stop labeling these URLs as “Supplemental Results.” Of course, you will continue to benefit from Google’s supplemental index being deeper and fresher.”
Google then said that the easiest way to identify these pages is like this: "First, get a list of all of your pages. Next, go to the webmaster console [Google Webmaster Central] and export a list of all of your links. Make sure that you get both external and internal links, and concatenate the files.
Now, compare your list of all your pages with your list of internal and external backlinks. If you know a page exists, but you don’t see that page in the list of site with backlinks, that deserves investigation. Pages with very few backlinks (either from other sites or internally) are also worth checking out.”
Nonsense!
Okay so now you have identified the pages that are in supplemental results and not showing up in the results anywhere.
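If you do want to run the comparison Google suggests rather than wade through it by hand, here is a quick Python sketch: it takes a list of all of your pages (one URL per line, for example from your sitemap) plus the internal and external link exports from Webmaster Central, and flags the pages that nothing links to. The file names and formats are assumptions, so adjust them to match what you export.

# Flag pages that have no internal or external links pointing at them.
# File names and formats are assumptions.
import csv


def load_urls(path):
    with open(path, encoding="utf-8") as fh:
        return {line.strip().rstrip("/") for line in fh if line.strip()}


all_pages = load_urls("all_pages.txt")  # one URL per line

linked_pages = set()
for export in ("internal_links.csv", "external_links.csv"):
    with open(export, encoding="utf-8", newline="") as fh:
        for row in csv.reader(fh):
            if row:
                linked_pages.add(row[0].strip().rstrip("/"))

orphans = sorted(all_pages - linked_pages)
print("%d pages have no links pointing at them:" % len(orphans))
for url in orphans:
    print("  ", url)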
Now we need to identify why they are there. The main reasons that a page goes to the supplemental results are:

1. Duplicate content
2. 301s (redirected pages that have a cache date prior to the 301 being put in place)
3. A 404 was returned when Google attempted to crawl it
4. It is a new page
5. Bad coding
6. The page hasn't been updated in a while
7. The page has lost its backlinks
According to Matt Cutts of Google, "PageRank is the primary factor determining whether a URL is in the main web index vs. supplemental results".
Now this isn't the end-all, but it covers about 95% of the reasons that you may be in the supplementals.
So now we know what they are, how to find them and why they are most likely in the supplemental results. Now let’s get them out of there.
Here are the different methods that I use when I find that a page has gone supplemental:
1. Add fresh content to the page
2. Add navigation to the page from the main page
3. Move the page to the first subdirectory if it is not already there
4. Get a back link to the page and/or create a link from an existing internal page with the anchor text containing the keywords for that page
5. Do some social bookmarking on the page
6. Make sure the page is included in my xml sitemap and then resubmit it to Webmaster Central.
7. Lastly, if none of the above seem to be working after 90 days, and I have another page that is relevant and does have PageRank and isn’t listed in the supplemental, I do a 301 (permanent redirect) to it from the supplemental page.
9. Use Geotargeting for Language and Regional Targeting
The ways that people search and the results the search engines are delivering are evolving rapidly. Smarter queries and more complex algorithms mean that you need to use various techniques to be sure you are showing up in the results. Local search, advanced search, regional search and language-based searches are some of the filters an end-user or a search engine can use in determining who shows up, when they show up and where they show up.
Geotargeting is one tool Google has refined and one that you can manipulate to a point in order to increase saturation in any market.
Beyond the obvious on-page considerations, different searches will (in most cases) deliver a different set of results. The results can differ greatly depending on several considerations:

1. The IP of the end-user
2. The server location of the website
3. Any geographically targeted settings in Webmaster Central
4. The relationship between the search filters and the resulting web pages (i.e. did they search for "pages from [region]" or "pages in [language]"?)
5. Whether the end-user is searching a different extension than the default engine (e.g. they manually enter Google.com looking for US or English results from a non-US region)
The other elements that will affect rankings are backlinks:

1. Are the links from a TLD that matches the destination URL (i.e. a .nl site linking to a .nl website)?
2. Is the IP of the linking website located in the same region as the linked URL?
3. PageRank, linking anchor text and additional outbound links on the page linking to you
4. On-page relevancy
5. Language-based meta tags
6. Everything in the above five items as it relates to the linking website/page
Any one of these elements can give you an edge over your competition.
Searching any of Google's (non-US) datasets will generally return a variety of websites when no language or location filter is selected. These can include internal pages of a website, subdirectories (http://www.yoursite.com/french), subdomains (http://french.yoursite.com) and various TLDs (top-level domains like .com and .nl). All 11 of the above factors feed into the automatic algorithm.
The problem is that no one really knows which approach is best, or which algorithmic attribute is the most effective, so what can we do with this?
What we want to do is look at the existing results using the available search filters, and at the existing websites that are ranking high, and determine the best strategy for your website. This takes deep page analysis of your competitors.
The important thing to note is that there is a hierarchy between one approach and another in terms of which is the best solution. Every website has its own individual solution based on its demographics, site mechanics and available resources. What you need to consider is:

1. What is your target market?
2. Do you need geographical targeting?
3. Do you need language-based subdomains or subdirectories?
4. Should you move hosting?
5. Can you afford to do it all?
How & When to Use Geographical Targeting
Here's what to do if you wish to:

Geographically target a region:

1. Create a subdomain or a subdirectory in the native language and use Webmaster Central to geographically target it
2. Host the subdomain on a server in the native region and use geographical targeting
3. Build backlinks from similar TLDs
Target a specific language:

1. Create a subdirectory in the native language (e.g. http://www.yoursite.com/nl/)
2. Build backlinks from same-language websites
3. Do not use geographical targeting
The reason that you do not want to use geographical targeting along with a language-based strategy is that if the end-user searches in the native language on Google.com, a site using content in that language will be stronger than the same site with geographical targeting in place. (This isn’t dependent on whether you use subdirectories or subdomains unless you hosted the subdomain in the target region).
The answer for me is that I want it all…and NOW!!
I've recently had subdomains, with geographical targeting turned on and content in the native language, rank in the top 10 in six weeks. I've had brand new websites with the appropriate TLDs (i.e. .nl, .de and .es) show up in eight weeks. I've even had a .com hosted in the US without geographical targeting show up in the top 10 results for "Hollywood" terms when it had never been in the results in the UK.
You can start with subdomains. Look at your logfiles to determine where the current traffic is coming from to tell you what to do first. Bounce rates can also tell you a lot.
For example, if your secondary traffic source is Germany and you have a high bounce rate, start with a language-based subdirectory, then maybe move on to creating a subdomain, hosting it in Germany and setting the geographical targeting to Germany in Webmaster Central. Then go back and start all over again with the region that has the next highest contribution.
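Here is a small Python sketch of that logfile check: it counts which Google country engines (google.de, google.nl and so on) are sending you traffic, which tells you which language or region to build out first. It assumes the common 'combined' log format and a file called access.log.

# Count referring Google country engines from an access log.
# Assumes the "combined" log format; LOGFILE is a placeholder path.
import re
from collections import Counter
from urllib.parse import urlparse

LOGFILE = "access.log"  # hypothetical path
QUOTED = re.compile(r'"([^"]*)"')

by_engine = Counter()
with open(LOGFILE, encoding="utf-8", errors="replace") as fh:
    for line in fh:
        fields = QUOTED.findall(line)
        if len(fields) < 2:
            continue
        host = urlparse(fields[1]).netloc.lower()  # the referrer field
        if host.startswith("www."):
            host = host[4:]
        if host.startswith("google."):
            by_engine[host] += 1

for engine, hits in by_engine.most_common():
    print("%6d  %s" % (hits, engine))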
Important Things to Remember!
• To target a language using only subdirectories, do not use geographic targeting
• You can target a language with both subdomains and subdirectories, but if you have a generic TLD (.com), use subdirectories rather than subdomains
• You can use Google's geographical targeting on subdomains and subdirectories
• Your title should be in the native language and/or use regional slang terms where they apply
• Use language-based meta tags whenever you are targeting language-based searches
• Host subdomains that are for geographical targeting in the target region
• When you implement the subdomain strategy, link to it from the original website
• Create new sitemaps for each subdomain
• When creating meta tags and content, be sure to use native slang (if you sold pants in the US and the UK, remember that in the UK pants are referred to as trousers and sweaters are referred to as jumpers)
• Get backlinks from the same TLDs (get a .nl link to your .nl site, in the native language)
• If you have a regional TLD (like .nl or .de), do not use geographical targeting; these domains are already associated with their designated region
So in a nutshell, I recommend that if you already have an existing website with a TLD like .com or .co.uk, and that is your target market, do not use the geographical targeting option. Start building subdirectories using the top native language, determined by looking at Google Analytics or your logfiles: identify your top referrer language. If the languages are close, as is the case between the US, UK, New Zealand and Australia, use native slang in the title, meta tags and content. Build a new XML sitemap and manually submit it through all the main search engines.
The next step is to create a subdomain and get it hosted in the region that you are targeting. Build content in the native language, create and submit a sitemap for it, and set up the geographical target in Webmaster Central.
By implementing this strategy, you will have a significant advantage over most of your competition (or a little less after this article is released). Whether the search is initiated in the region or outside the region, whether your site is located in the region or just hosted there, or even if they search in the native language or manually enter a specific Google engine like Google.com.mx or Google.es, you will have improved saturation.
10. Use social bookmarking websites for a short-term ranking boost and blogs/forums to establish long-term trust and authority
Social Bookmarking – Wikipedia defines it: In a social bookmarking system, users store lists of Internet resources that they find useful. These lists are either accessible to the public or a specific network, and other people with similar interests can view the links by category, tags, or even randomly. Most social bookmarking services allow users to search for bookmarks which are associated with given “tags”, and rank the resources by the number of users which have bookmarked them. Many social bookmarking services also have implemented algorithms to draw inferences from the tag keywords that are assigned to resources by examining the clustering of particular keywords, and the relation of keywords to one another.
GaryTheScubaGuy defines it this way:
One of the best free ways to get increased ranking, back links and traffic, for very little time commitment other than setup.
At this very moment, most search engine algorithms are placing a ton of weight on end-user 'bookmarking', 'tagging' and various other types of user-generated highlighting.
Before doing any of this, run a rank report so you can track your progress. I have tested this on terms showing on page one, on terms ranked 11th through 12th, and on others buried around pages 5-10. It works on them all in different time frames, and the effects last for different periods of time. This you will need to test yourself. Be careful, because you don't want to be identified as a spammer. Be sure to use genuine content that provides a benefit to the user.
Here is how I recommend using social bookmarking:
1. Download Roboform. (It says it will limit you, but I've had as many as 30+ passwords created and stored in the trial version.) This will allow you to quickly fill out signup forms and store passwords for the 10 bookmarking sites that I am going to be sending you to.
2. Within Roboform, go to the custom area and put in a username and password, as well as the other information that sites usually ask for when you register. This way, when you are using these different bookmarking sites it's a one-click login and becomes a relatively quick and painless procedure.
3. Establish accounts with these social bookmarking sites:
a. Digg
b. Technorati
c. Del.icio.us
d. NowPublic
e. StumbleUpon
f. BlinkList
g. Spurl
h. Furl
i. Slashdot
j. Simpy
k. Google Toolbar (w/Google Bookmarking)
4. Internet Explorer, Firefox and most other browsers have an "add a tab" option, but I use Firefox because I can bookmark the login pages in one folder, then "Open All in Tabs" in one click. From here I click on each tab and in most cases, if you set it up right, Roboform will have already logged you in. Otherwise you're on the login page and, by clicking on the Roboform button, everything is prefilled; all you need to do is click submit. (Some of the bookmarking sites will let you add their button to your browser bar, or you can get an extension from Firefox like the Digg add-on to make things quicker.)
5. Lastly, install the Google Toolbar. It has a bookmark function as well, and you can import all your bookmarks from Firefox directly into it. Google looks at many different things when assigning rank and trust. For instance, when you search for something and go into a website, Google will remember how long you stayed, how deep you went, and whether you came back out to the search results to select another site, which means you didn't find what you were looking for. This is all part of the privacy issues that have been in the news.
Here’s what Google actually says!
“The Google Toolbar automatically sends only standard, limited information to Google, which may be retained in Google’s server logs. It does not send any information about the web pages you visit (e.g., the URL), unless you use Toolbar’s advanced features.”
They practically spell it out for you. Use their bookmark feature just like you were doing the social bookmarking I outlined above. This is just one more click.
Some of the elements that Google looks at when grading a website are:
• How much time did the average visitor spend on the site?
• What is the bounce rate on the landing page?
• How many end-users bookmarked the page?
• How many users returned to the search query and then on to a different site?
Each time you publish an article, put a Google Alert on a unique phrase from it. Each time Google sends you an alert, bookmark the page on every bookmarking site. This will take some getting used to, but it will eventually become second nature. Remember what I said in the beginning: "One of the best free ways to get links and traffic, for very little time commitment other than setup".
11. Target Universal Search Results
Universal or 'Blended' Search is still fairly new, and the search engines are working hard at filtering the bad sites from the good, but they are also delivering much more than websites in the results. You may have seen this when doing a search and a video appears in the top 5 results.
Google has turned off their supplemental filter and each time a query is entered, they virtually search their entire database for relevant results.
The significant difference now is that the results will often include videos, news articles, .doc, .xls and .pdf documents, forum posts and other data from their inventory.
All of these can be optimised for search.
One example is to use Adobe Acrobat Pro to convert pages of your website into PDF format. Name the file using your keywords and optimise the document just like you would a web page, using H1-H5 header tags, linked images and keyword anchor text that links back to your website.
Any links within your PDF or Word doc will be credited as links and will deliver additional traffic streams. I used this method a year or so ago on many of my documents; all are indexed and all show in the results. (Search for "Top 12 SEO Tips for 2008".)
PDFs are showing up more and more in the top results lately, so this can be significant. Video is incredibly viral. Great examples are videos of people hitting jackpots on slots. Even just a snapshot of a winner has proven to be a huge traffic source for casino and slot websites.
Create separate XML sitemaps for each set of videos or documents and submit them manually through Webmaster Central. Be sure to list each sitemap in your robots.txt file to tell Google where they are located.
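Here is a minimal Python sketch of that last step: it builds a separate XML sitemap for a set of PDFs (using the standard sitemaps.org format) and prints the Sitemap: line to add to your robots.txt. The URLs and file names are placeholders.

# Build an XML sitemap for a set of PDF documents and point robots.txt at it.
# The URLs and file names are placeholders.
PDF_URLS = [
    "http://www.yoursite.com/docs/top-12-seo-tips-2009.pdf",
    "http://www.yoursite.com/docs/bingo-guide.pdf",
]


def build_sitemap(urls):
    lines = ['<?xml version="1.0" encoding="UTF-8"?>',
             '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">']
    for url in urls:
        lines.append("  <url><loc>%s</loc></url>" % url)
    lines.append("</urlset>")
    return "\n".join(lines)


with open("sitemap-pdfs.xml", "w", encoding="utf-8") as fh:
    fh.write(build_sitemap(PDF_URLS))

# Add this line to robots.txt so the engines can find the sitemap:
print("Sitemap: http://www.yoursite.com/sitemap-pdfs.xml")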
12. Create a Link Acquisition Campaign
If you haven’t done this yet, you are already behind. Link building is an acceptable practice if it is done the right way. Here I’ll tell you the right way.
You need to set some type of budget. Whether you’re an individual with one or two accounts, or an agency with dozens, you need to have some type of budget set aside for this. It can be money or it can be time.
Here is how I segment my campaigns:

• 15%-25% to purchase one-way backlinks. I create custom/bespoke articles that complement the owner's site and contain my keyword phrase as the anchor text, and I make sure the site is relevant to my article/anchor text.
• 25%-30% for reciprocal link exchanges. Not text links: as with purchased links, I create custom/bespoke articles that complement the owner's site, contain my keyword phrase as the anchor text and sit on a site relevant to my article/anchor text.
• 25% for blogs and forums. This is guerilla marketing. It takes a little longer because you need to establish yourself within communities and become somewhat of an authority who can post links to relevant and useful content. This will attract actual traffic (and improved rankings), and also create natural backlinks from other end-users.
• 25% for one of the many automated tools (e.g. IBP 9.0 from Axandra) to find potential link partners.
Now, whether you hire students to do these tasks or do them yourself, they need to be part of your daily routine. I have tested dozens of techniques, each having its own merits depending on the actual demographics, but every campaign has a planned strategy.
Obviously there are other considerations, such as building good content that people want to link to, creating top 10 lists, how-to guides and reviews, but not all markets have the ability to do these in a relevant way. My recommendation in this type of situation, and really in any other, is to do a whois lookup, pick up the phone and start calling. These are the best kind of backlinks.
Be very careful if you are planning to buy links. Exclusive and non-brokered links are the safest bets, but this takes time and money, consistently and residually. The payoff is worth it, though.
So can I get an UH-HUH!
THIS is SEO! (transparent – that’s just how I roll)
I was at CAP, but needed to go to a different seminar that overlapped with yours, so I only managed to catch your last 4 tips on the day.
So to see that you did this write-up is fantastic.
Cheers
Bill
Your information clearly condenses just about everything I’ve ever heard about SEO.
It certainly has a lot of new information.
It also points out the rewards of going to CAP meetings.
Your post is worth more than all those crummy 'how-to' pitches I've ever seen.
SUPER!
Thanks for that.
You can reach me just about anywhere; here, garythescubaguy.com or garythescubadiver (at) gmail
Thanks again!
Gary
How best would you like us to contact you if we have clarification questions regarding any of your 2009 tips?
Much thanks again.