This makes no sense to me.
If the visitor arrives on a very targeted page, s/he will read and leave.
When the page is only semi-targeted, s/he will drill deeper and stay.
Bounce rate seems to punish the most relevant content.
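To put numbers on it: here’s a quick Python sketch (purely illustrative – the thresholds and the dwell-time signal are my own guesses, not anything any engine has confirmed) of how a raw bounce-rate metric and a dwell-time-adjusted one score the same targeted page:

    # Illustrative only: compares a naive bounce-rate signal with a
    # dwell-time-adjusted one. All thresholds and numbers are made up.

    def naive_engagement(visits):
        """Fraction of visits that viewed more than one page (no bounce)."""
        return sum(1 for v in visits if v["pages_viewed"] > 1) / len(visits)

    def dwell_adjusted_engagement(visits, satisfied_after=30):
        """Also counts a single-page visit as satisfied if the reader
        stayed at least `satisfied_after` seconds before leaving."""
        good = sum(1 for v in visits
                   if v["pages_viewed"] > 1
                   or v["seconds_on_page"] >= satisfied_after)
        return good / len(visits)

    # A very targeted page: most readers find the answer, read, and leave.
    targeted = ([{"pages_viewed": 1, "seconds_on_page": 90}] * 8
                + [{"pages_viewed": 2, "seconds_on_page": 40}] * 2)

    print(naive_engagement(targeted))           # 0.2 -> looks terrible
    print(dwell_adjusted_engagement(targeted))  # 1.0 -> looks great

Same traffic, opposite verdicts – which is exactly why a raw bounce rate punishes the most relevant content.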
Maybe, but I don’t see how new sites would get a fair shake with this. If all of these things contribute to rankings, how does a new site with no traffic and no rankings ever get there? Maybe Google is doing something like this now, which would explain why new sites sometimes rank strongly for a few days after getting indexed, then fall off the map. Something like a test run to see how users react to the site.
Yes, I’ve spotted that. I suspect that, with their sheer volume, they can afford to play with the long tail and build up a site’s profile gradually, perhaps using one of the many data centers every now and again to boost sites briefly and judge the response. They can probably afford to have one or two sites in the Top 10 that aren’t great for short periods.
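If that test-run hunch is right, it would amount to an explore/exploit trade-off. A toy sketch of the idea (pure speculation on my part; the site names and the 5% rate are invented):

    import random

    # Speculative sketch of a "test run": with small probability, slot an
    # unproven new site into the results and watch how searchers react.
    EXPLORE_RATE = 0.05  # how often to gamble a slot on a newcomer

    def build_serp(established, newcomers, slots=10):
        serp = established[:slots]
        if newcomers and random.random() < EXPLORE_RATE:
            trial = random.choice(newcomers)
            serp[-1] = trial              # lend it the last slot briefly
            trial["on_trial"] = True      # measure clicks/bounces while here
        return serp

    established = [{"site": "old%d.com" % i} for i in range(10)]
    newcomers = [{"site": "brand-new.com"}]
    print([p["site"] for p in build_serp(established, newcomers)])

A site that holds attention during its trial keeps climbing; one that gets ignored falls off the map, which would match what we’re both seeing.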
My main point is that even when/if you narrow the audience down to the geo and the niche, you will be left with a handful of pages… those pages will be ordered by optimization standards (most links, best keyword usage, domain name, etc.)
Brings up an interesting point – how does one define “optimisation”, I guess. I agree on the domain name remaining an important element, and I agree with you and Dom about on-page “SEO” playing a role for a long time to come.
Totally agree on country hosting – not a fan of this at all… it’s Google telling us who we can and can’t target. Wrong. As regards the CNN point, and this one…
…I firmly believe this is the future. How they get there, I have no idea, but the bounce rate, click rate, page views – i.e. the general behaviour of a user toward a site or sites found through a search term – will become the main factor in SERP placement.
Hence I think content will be King one day, and SEO as we know it today (links, keywords, titles, semantics, layout, design, etc.) will not be a major ranking factor.
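If that prediction holds, ranking would reduce to something like a weighted blend of behavioural signals. A back-of-the-envelope sketch (the weights and field names are my own inventions, nothing Google has published):

    # Hypothetical behavioural score; every weight here is a guess.
    WEIGHTS = {"ctr": 0.4, "stayed": 0.3, "depth": 0.2, "read": 0.1}

    def behaviour_score(site):
        """Blend of how searchers actually treated the site: click-through
        rate, not bouncing back to the results, pages per visit, and
        completed reads."""
        return (WEIGHTS["ctr"] * site["ctr"]
                + WEIGHTS["stayed"] * (1 - site["back_to_results_rate"])
                + WEIGHTS["depth"] * min(site["pages_per_visit"] / 5, 1)
                + WEIGHTS["read"] * (1 - site["bounce_rate"]))

    sites = [
        {"name": "a.com", "ctr": 0.30, "back_to_results_rate": 0.2,
         "pages_per_visit": 3.0, "bounce_rate": 0.5},
        {"name": "b.com", "ctr": 0.12, "back_to_results_rate": 0.6,
         "pages_per_visit": 1.2, "bounce_rate": 0.8},
    ]
    for s in sorted(sites, key=behaviour_score, reverse=True):
        print(s["name"], round(behaviour_score(s), 3))

Note there isn’t a single link count or keyword in there – that’s the shift I’m predicting.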
The bottom line is, if you run a search engine it is not sensible to allow outside forces (i.e. webmasters) to manipulate your search results, because as long as they can, your users can be served crap. Simple as that. I believe shutting that down is their ultimate goal.
IMO this is true: black-hat SEO in its former and current forms will disappear.
However, on-page SEO will always be there in some form.
Also, regardless of what the search engines’ rules and methods are in the future, there will always be people looking for loopholes and trying to game the engines.
Consequently, search engine positions will always be in flux, and sites would do well to gain as much independence from them as they can.
My main point is that even when/if you narrow the audience down to the geo and the niche, you will be left with a handful of pages… those pages will be ordered by optimization standards (most links, best keyword usage, domain name, etc.). For instance: assume five sites about baseball legend “Babe Ruth” – then let’s assume the content on all the sites is ORIGINAL and consists of good articles, the sites are all hosted in the U.S., and all the searchers are from New York… who gets the top listing?
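To make the Babe Ruth example concrete: when relevance is a dead heat, something like this toy scoring (signals and weights invented by me for illustration) is all that’s left to force an order:

    # Toy tie-break for equally good "Babe Ruth" sites: content quality
    # is identical, so only the classic signals can separate them.
    def optimisation_score(site):
        return (0.5 * min(site["inbound_links"] / 1000, 1)  # link popularity
                + 0.2 * site["keyword_usage"]       # 0..1 on-page relevance
                + 0.2 * site["domain_match"]        # 1 if domain fits query
                + 0.1 * min(site["age_years"] / 10, 1))     # age/trust

    sites = [
        {"name": "baberuth.com", "inbound_links": 400,
         "keyword_usage": 0.9, "domain_match": 1, "age_years": 8},
        {"name": "baseball-legends.com", "inbound_links": 900,
         "keyword_usage": 0.7, "domain_match": 0, "age_years": 4},
    ]
    for s in sorted(sites, key=optimisation_score, reverse=True):
        print(s["name"], round(optimisation_score(s), 2))

In this toy model the exact-match domain edges out the better-linked site – which is why I say the domain name and the usual optimization standards still decide the order.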
Content – original content is sometimes licensed from something like the Associated Press… so on articles like this, what site will outrank what site? What about original articles with massive duplicates on the same subject – Reuters vs. A.P. articles? Surely, if the engines are still algorithmic, the best-optimized site wins (best being keywords, most links, domain, TrustRank, age, etc.)? How could this NOT be the case unless a human got involved? How would they do it? It would be curious, for sure, if CNN couldn’t get top rankings because all the local news stations were outranking them on national stories…
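Detecting that two wire stories are near-copies is at least tractable; it’s deciding who the canonical source is that’s hard. A minimal shingling sketch (a standard near-duplicate technique, simplified here; the sample sentences are made up):

    def shingles(text, k=5):
        """Set of overlapping k-word windows - a common near-dup fingerprint."""
        words = text.lower().split()
        return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

    def jaccard(a, b):
        """Overlap of two shingle sets: close to 1.0 means near-duplicates."""
        return len(a & b) / len(a | b) if a | b else 0.0

    ap = "Babe Ruth set the single season home run record on Tuesday in New York"
    rewrite = "Babe Ruth set the single season home run record on Tuesday in Boston"
    print(round(jaccard(shingles(ap), shingles(rewrite)), 2))  # ~0.73

So the engine can see the duplication; whether A.P., Reuters, or the local station that published first gets the listing is exactly the open question.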
Country hosting – this seems like a mistake when you consider world news sites (BBC, etc.) – for instance, will Fox News outrank the BBC because it is hosted in the USA and I am in the USA? What about global websites? For instance, a site in France reviewing, in English, a casino in Macau?
Geo-IP – doesn’t currently work very well because of dynamic IP addresses. Being a satellite user, I know that first-hand.
Reaction to a site – this is definitely already being used, and I do agree the engines are getting smarter. Hopefully that will mean fewer black-hat tricks.
Just my thoughts… fire away!