Jagger

  • Author
    Posts
  • #591271
    Anonymous
    Inactive

    Great article about Jagger:

    http://www.seo-news.com/

    November 24, Issue #95

    Google’s Jagger Update Completing Cycles
    By Jim Hedger (c) 2005, StepForth News Editor,
    StepForth Placement Inc.

    Ever since Google introduced its latest algorithm update in September, a fair amount of column space has been dedicated to telling webmasters and small business owners to wait until the update is complete. Insofar as the Jagger Update can ever be said to be complete, the final cycle of the immediate update appears to be playing out.

    Jagger was a different sort of algorithm update for Google. Its infamous predecessors, Florida and Hilltop, were generally limited shifts in the values Google assigned domains based on content and links. After the immediate punch of previous updates, the search engine results pages (SERPs) would generally return to a stable and predictable state. The SERPs generated by Jagger are expected to keep updating themselves, with a greater degree of flux and change.

    So, what exactly happened during the Jagger Update and what might it mean to your website? Quite a bit as it turns out.

    The Jagger Update was introduced for three main reasons. The first was to deal with manipulative link-network schemes, sites generated from scraped content, and other forms of search engine spam. The second was to allow and account for the inclusion of a greater number of spiderable documents and file types. The third was to allow and account for new methods of site acquisition beyond the use of the Googlebot spider.

    The update made its first public appearance in late September but had its greatest impact in early October. At that time, hundreds of thousands of websites that enjoyed previously strong listings were suddenly struck and sent to the relative oblivion found beyond the second page of results.

    Most of those sites lost position due to participation in what Google evidently considers inappropriate linking schemes. This was actually one of the first conclusions we came to in late September, based on the experience of a few clients who had joined link-networks that were not recommended or vetted by our link experts. It is now backed up by discussion in various search engine forums. While most of those hurt by this part of the update are good people running honest businesses, Google has put out notice that irrelevant link-networks, no matter how simple or complex, are unhealthy additions to what might otherwise be a good website.

    The problem Google faced was that some webmasters misunderstood what links are for and how Google uses them to rank documents. For whatever reason, many webmasters or site administrators engaged in wholesale link mongering, bulking up on as many inbound links as possible without considering the factor Google weighs most heavily: the relevance of those inbound links.

    Now, Google appears to be applying filters based on historical data it has collected about all sites in its index over time. In other words, Google likely knows a lot more about the documents linking to a particular website than the person who placed or requested the link in the first place. For highly detailed information, SEOs and webmasters should brush up on the "Information retrieval based on historical data" patent application Google filed on March 31, 2005.

    Google is judging sites on who they link to as well as on who links to them. Before the update, a link from your site to an irrelevant site was more a waste of time than a waste of opportunity. Today, irrelevant links appear to be both. Google's desire to offer stable and highly relevant SERPs while preventing outright manipulation of those SERPs was the biggest cause of the shift.
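
    To make the idea in the last two paragraphs concrete, here is a purely illustrative Python sketch that rates a page by the topical overlap between its own keywords and the keywords of the pages it links to. Google has never published its actual relevance scoring; the keyword sets, the jaccard() and outbound_relevance() helpers, and the example URLs are all invented for the illustration.

    # Purely illustrative: rate a page by how topically related its
    # outbound links are. Google's real signals are not public; the
    # keyword sets and helper names here are invented for the example.

    def jaccard(a, b):
        """Overlap between two keyword sets (0.0 = unrelated, 1.0 = identical)."""
        return len(a & b) / len(a | b) if (a | b) else 0.0

    def outbound_relevance(page_keywords, linked_pages):
        """Average topical overlap between a page and the pages it links to."""
        if not linked_pages:
            return 1.0  # no outbound links, nothing to judge
        scores = [jaccard(page_keywords, kw) for kw in linked_pages.values()]
        return sum(scores) / len(scores)

    site_keywords = {"mortgage", "refinance", "home", "loan", "rates"}
    outbound_links = {
        "http://lender.example/rates": {"mortgage", "loan", "rates", "apr"},
        "http://casino.example/poker": {"poker", "casino", "bonus"},
    }
    print(f"average outbound relevance: {outbound_relevance(site_keywords, outbound_links):.2f}")

    A low average here is the kind of signal the article suggests now counts against a site, where before the update an irrelevant link was merely wasted effort.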

    The second and third reasons for updating the algorithm at this time are the allowance for indexing documents and information obtained through alternative sources such as Google Base, Froogle, blogs, and other social networking tools. Google's stated goal is to grow to include reference to all the world's information. That information is being expressed in multiple places using several unique file formats, some of which are difficult to weigh against others. By checking the file or document in question against the long-term history of the documents linking to it, Google is better able to establish its theme and intent.

    Mass adoption of blogs, while promoted by Google, gave the search engine a number of problems. Webmasters and search marketers will take almost any opportunity to promote their sites, by any means available. Blogs provided ample opportunities, and soon issues ranging from comment spam to scraped-content splogs started to gum up the SERPs. By comparing document content with the history of other related documents in its index, Google has become much better at spotting blog-enabled spam.
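
    As an illustration of how scraped content can be caught by comparing documents, here is a small sketch using w-shingling and Jaccard similarity, a standard near-duplicate detection technique. It is not Google's actual splog filter, and the 0.5 threshold is an arbitrary example.

    # Illustration only: flag near-duplicate (possibly scraped) text with
    # w-shingling plus Jaccard similarity. This is a generic technique,
    # not Google's actual filter; the 0.5 threshold is arbitrary.

    def shingles(text, w=4):
        """Set of w-word shingles drawn from the text."""
        words = text.lower().split()
        return {" ".join(words[i:i + w]) for i in range(max(len(words) - w + 1, 1))}

    def similarity(a, b):
        sa, sb = shingles(a), shingles(b)
        return len(sa & sb) / len(sa | sb)

    original = "Google's Jagger update changed how inbound links are weighed"
    suspect = "Google's Jagger update changed how inbound links are weighed by the index"

    score = similarity(original, suspect)
    print(f"similarity: {score:.2f}")
    if score > 0.5:
        print("likely scraped or near-duplicate content")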

    Google also faced problems with forms of search engine spam such as fake directories and on-page spamming techniques like hiding information in CSS files. The Jagger Update seems designed to deal with these issues by applying Google's vast knowledge about items in its index to every document or file it ranks. A site that scrapes content, for example, might be weighed against the documents on which that content was originally published and against the intent of the republisher. One that hides information in the CSS file will similarly trip Google's memory of how the same domain looked and operated before the spam content was inserted.
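
    For the hidden-text case, a scan like the following is enough to show the idea: it walks a page with Python's standard html.parser and collects any text inside elements whose inline style hides it. Real detection would also have to resolve external stylesheets and cope with unclosed tags, which this deliberately ignores; the sample HTML is made up.

    # Illustration only: collect text hidden with inline CSS
    # (display:none, visibility:hidden, zero font size). External
    # stylesheets and unclosed tags are ignored for brevity.

    from html.parser import HTMLParser

    HIDDEN_HINTS = ("display:none", "visibility:hidden", "font-size:0")

    class HiddenTextFinder(HTMLParser):
        def __init__(self):
            super().__init__()
            self.stack = []        # True for each open tag whose style hides it
            self.hidden_text = []  # text collected from hidden regions

        def handle_starttag(self, tag, attrs):
            style = (dict(attrs).get("style") or "").replace(" ", "").lower()
            self.stack.append(any(hint in style for hint in HIDDEN_HINTS))

        def handle_endtag(self, tag):
            if self.stack:
                self.stack.pop()

        def handle_data(self, data):
            if any(self.stack) and data.strip():
                self.hidden_text.append(data.strip())

    finder = HiddenTextFinder()
    finder.feed('<p>Visible copy.</p><div style="display: none">keyword stuffing here</div>')
    print(finder.hidden_text)  # ['keyword stuffing here']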

    The third reason for the algorithm update comes from the expansion of Google itself. Google is now much larger than it was when the Bourbon update was introduced in the early summer. Audio and video content is spiderable and searchable. Google's comparison shopping tool Froogle is starting to integrate with Google Local, just as Google Local and Google Maps are beginning to merge. There is some speculation in the SEO community that Google is preparing to integrate personalized data into the search results served to specific individuals. A common assumption is that Jagger is part of Google's movement towards personalization, though there is little firm evidence to support the idea.

    If your website is still suffering the lagging effects of the Jagger Update, your SEO or SEM vendor should be able to offer good advice. Chances are, the first thing he or she will do is a point-by-point inspection of the inbound and outbound links associated with your website. Next, they will likely suggest making it easier for Google to spider the various document and file types on your site by providing an XML sitemap to guide Google's spider cycle. Lastly, they will likely suggest a look at how visitors behave on your site. Site visitor behaviours will play a part in Google's view of the importance and relevance of sites in its index. The introduction of Google Analytics provides webmasters with a lot of free information regarding site visitors, along with other information on how the site fares on Google's search engine. It also provides Google with a lot of information about the sites running it. More on the effect of Google Analytics on the SERPs next week.
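
    On the sitemap suggestion, the Sitemaps protocol itself is simple XML. The short sketch below writes a minimal sitemap.xml with Python's standard library; the URLs and dates are placeholders, and only the <loc> element is required for each entry.

    # Minimal XML sitemap (Sitemaps protocol 0.9) written with the
    # standard library. URLs and dates below are placeholders.

    import xml.etree.ElementTree as ET

    SITEMAP_NS = "http://www.sitemaps.org/schemas/sitemap/0.9"

    def build_sitemap(entries, path="sitemap.xml"):
        urlset = ET.Element("urlset", xmlns=SITEMAP_NS)
        for loc, lastmod in entries:
            url = ET.SubElement(urlset, "url")
            ET.SubElement(url, "loc").text = loc
            ET.SubElement(url, "lastmod").text = lastmod
        ET.ElementTree(urlset).write(path, encoding="utf-8", xml_declaration=True)

    build_sitemap([
        ("http://www.example.com/", "2005-11-24"),
        ("http://www.example.com/products.html", "2005-11-20"),
    ])

    The resulting file can then be submitted through Google's sitemap program so the crawler learns about pages and file types it might otherwise miss.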

    About The Author
    Jim Hedger is a writer, speaker and search engine marketing expert based in Victoria BC. Jim writes and edits full-time for StepForth and is also an editor for the Internet Search Engine Database. He has worked as an SEO for over 5 years and welcomes the opportunity to share his experience through interviews, articles and speaking engagements. He can be reached at [email protected].

    This is a great site also. http://www.seo-news.com/

    #677209
    Anonymous
    Inactive

    Great article. Thanks for posting, Dom!
