Inbound Deep Links Benefit PageRank Distribution Sitewide
You have probably come across sites, especially large ones, where the deeper you dig into the site hierarchy, the more often the PageRank toolbar is grayed out or shows a value of 0. In general, the home page is the starting point for a website, and it accrues the most PageRank.
This PageRank value reflects the entire domain’s authority and trust. The home page then distributes that PageRank (what we often refer to as link juice) down to the first-level category pages, the second-level sub-category pages, and the third-level product pages. In general, the first-level pages derive the most link juice from the home page. But on a site with an excessive number of sub-categories and product pages (the money pages), the PageRank distribution is not proportional: some pages gain link juice while the large majority gain none.
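To see why, it helps to run the numbers on a toy example. Below is a minimal PageRank sketch in Python over a hypothetical three-level site (one home page, three categories, thirty product pages per category); the structure and figures are illustrative assumptions, not measurements from any real site.

```python
# Minimal PageRank sketch over a hypothetical three-level site:
# home -> 3 categories -> 30 product pages each (illustrative numbers only).
DAMPING = 0.85

def build_site():
    links = {"home": [f"cat{i}" for i in range(3)]}
    for i in range(3):
        products = [f"cat{i}-product{j}" for j in range(30)]
        links[f"cat{i}"] = products + ["home"]
        for p in products:
            links[p] = [f"cat{i}"]  # product pages only link back up to their category
    return links

def pagerank(links, iterations=50):
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - DAMPING) / len(pages) for p in pages}
        for page, outlinks in links.items():
            share = DAMPING * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

ranks = pagerank(build_site())
for page in ["home", "cat0", "cat0-product0"]:
    print(f"{page}: {ranks[page]:.4f}")
```

Each deep product page ends up with only a small fraction of the home page’s score, which is why inbound deep links pointed directly at those money pages can make a disproportionate difference to how much of the site accrues enough PageRank to rank.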
Posted by Ravi of Netconcepts Ltd. on 11/15/2009
Filed under: Link Building, PageRank, Search Engine Optimization, SEO, Site Structure. Tags: auckland search engine marketing, categories and sub-categories, decent internal link juice flow, domain authority, domain trust, google index, great internal link juice flow, inbound deep links, internal link juice flow, internal linking architecture, marginal internal link juice flow, Netconcepts, page rank distribution, pagerank sculpting, pagerank threshold, ppc services, product pages
Amazon’s Secret to Dominating SERP Results
Many e-tailers have looked with envy at Amazon.com’s sheer omnipresence within the search results on Google. Search for any product ranging from new book titles, to new music releases, to home improvement products, to even products from their new grocery line, and you’ll find Amazon links garnering page 1 or 2 rankings on Google and other engines. Why does it seem like such an unfair advantage?
Can you keep a secret? There is an unfair advantage. Amazon is applying conditional 301 URL redirects through their massive affiliate marketing program.
Most online merchants outsource the management and administration of their affiliate program to a provider who tracks all affiliate activity, using special tracking URLs. These URLs typically break the link association between affiliate and merchant site pages. As a result, most natural search traffic comes from brand related keywords, as opposed to long tail keywords. Most merchants can only imagine the sudden natural search boost they’d get from their tens of thousands of existing affiliate sites deeply linking to their website pages with great anchor text. But not Amazon!
Amazon’s affiliate (“associate”) program is fully integrated into the website. So the URL you get by clicking from Guy Kawasaki’s blog, for example, to buy one of his favorite books from Amazon doesn’t route you through a third-party tracking URL, as would be the case with most merchant affiliate programs. Instead, you’ll find it links to an Amazon.com URL (to be precise: http://www.amazon.com/exec/obidos/ASIN/0060521996/guykawasakico-20), with the notable associate’s name at the end of the URL so Guy can earn his commission.
However, request that same page with your browser’s user agent set to Googlebot, and you’ll see what Googlebot (and other crawlers) get when they request that URL: http://www.amazon.com/Innovators-Dilemma-Revolutionary-Business-Essentials/dp/0060521996, delivered via a 301 redirect. That’s the same URL that shows up in Google when you search for this book title.
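If you would rather not install a user-agent switcher, a short script shows the same behavior. This is a rough sketch using Python’s requests library; whether Amazon still responds this way to scripted clients is an assumption, so treat the output as illustrative.

```python
import requests

AFFILIATE_URL = "http://www.amazon.com/exec/obidos/ASIN/0060521996/guykawasakico-20"

def check(label, user_agent):
    # allow_redirects=False lets us inspect the raw status code and Location header
    resp = requests.get(AFFILIATE_URL,
                        headers={"User-Agent": user_agent},
                        allow_redirects=False)
    print(label, resp.status_code, resp.headers.get("Location", "(no redirect)"))

check("browser:  ", "Mozilla/5.0 (Windows NT 10.0; Win64; x64)")
check("googlebot:", "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)")
```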
So if you are a human coming in from affiliate land, you get one URL, used to track your referrer’s commission. If you are a bot visiting that URL, you are told it now redirects permanently to the keyword URL. In this way, Amazon gets to have its cake and eat it too: it runs an owned and operated affiliate management system while harvesting the PageRank from millions of deep affiliate backlinks to maximize its ranking visibility on long tail search queries.
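For the curious, here is what such a conditional redirect looks like mechanically. This is only a minimal sketch of the general concept in Python/Flask (Amazon’s actual implementation is not public, and the crawler list and URL mapping below are made up): ordinary visitors keep the tracking URL, while known crawlers get a 301 to the canonical keyword URL.

```python
from flask import Flask, request, redirect

app = Flask(__name__)

# Hypothetical mapping from ASIN to the keyword-rich canonical URL.
CANONICAL = {
    "0060521996": "/Innovators-Dilemma-Revolutionary-Business-Essentials/dp/0060521996",
}
CRAWLER_TOKENS = ("googlebot", "msnbot", "slurp")  # assumed list of crawler user agents

@app.route("/exec/obidos/ASIN/<asin>/<affiliate_id>")
def affiliate_link(asin, affiliate_id):
    ua = request.headers.get("User-Agent", "").lower()
    if any(token in ua for token in CRAWLER_TOKENS):
        # Crawlers are told the page has permanently moved to the keyword URL,
        # so link equity from affiliate backlinks consolidates there.
        return redirect(CANONICAL[asin], code=301)
    # Ordinary visitors keep the tracking URL: credit the affiliate, serve the page.
    return f"Product page for ASIN {asin} (referred by associate {affiliate_id})"
```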
(Note I’ve abstained from hyperlinking these URLs so bots crawling this content do not further entrench Amazon’s ranking on these URLs, although they are already #4 in the query above!).
So is this strategy ethical? Conditional redirects are normally a no-no because they send mixed signals to the engine: is the URL permanently moved or not? If it is, but only for bots, then you are crossing the SEO line. But in Amazon’s case it appears that searchers, as well as general site users, also get the keyword URL, so it is merely the affiliate visitors who get an “old” URL. If that’s the case across the board, it would be difficult to argue that Amazon is abusing the concept; rather, they have cleverly engineered a solution to a visibility problem that other merchants would replicate if they could. In fact, from a searcher’s perspective, were it not for Amazon, many of the long tail product queries consumers conduct would return zero recognizable retail brands to buy from, with all due respect to PriceGrabber, DealTime, BizRate, NexTag, and eBay.
As a result of this long tail strategy, I’d speculate that Amazon’s natural search keyword traffic distribution looks more like 40/60 brand to non-brand, rather than the typical 80/20 or 90/10 distribution curve most merchants (who lack affiliate search benefits) receive.
Posted by Brian of Brian on 06/03/2008
Filed under: General, Google, PageRank, Search Engine Optimization, SEO, Site Structure, Tracking and Reporting, URLs
GravityStream Does Local SEO: Now Fixes Store Locator Pages
I’m pleased to announce that GravityStream can now optimize store locator pages for those retailer sites which provide search utilities for their local outlets.
As you may recall, I’ve written before about how dealer locators are terribly optimized and how store locator pages can be optimized. A great many store locator sections of major corporate sites do not allow search engine spiders to properly crawl through and index all of the locations where the company has brick-and-mortar outlets.
Most large companies seem fairly unaware that their store locators are effectively blocking search engine spiders, making it impossible for end users to find their locations through simple keyword searches. I’ve also listed a number of the top store locator providers that build location services like this for many Internet Retailer 500 companies.
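The underlying fix is usually straightforward: alongside the search form, expose every location on plain, linkable HTML pages that a spider can reach. As a rough illustration (the location data and file name below are made up), a script like this could generate a crawlable index from a location list:

```python
# Illustrative only: generate a plain-HTML, crawlable index of store locations
# so spiders don't have to submit a search form to discover them.
locations = [
    {"city": "Auckland", "slug": "auckland", "address": "1 Queen St"},
    {"city": "Wellington", "slug": "wellington", "address": "2 Lambton Quay"},
]

links = "\n".join(
    f'  <li><a href="/stores/{loc["slug"]}/">{loc["city"]} store, {loc["address"]}</a></li>'
    for loc in locations
)

with open("store-index.html", "w") as f:
    f.write(f"<ul>\n{links}\n</ul>\n")
```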
Read on for details on our results…
Posted by Chris of Silvery on 01/08/2008
Filed under: Content Optimization, Local Search, Local Search Optimization, Search Engine Optimization, SEO, Site Structure, Tools. Tags: Automatic-SEO, dealer-locators, Search Engine Optimization, SEO, store-locators
Advice on Subdomains vs. Subdirectories for SEO
Matt Cutts recently revealed that Google is now treating subdomains much more like subdirectories of a domain, in the sense that Google wants to limit how many results from a single site show up for a given keyword search. In the past, some search marketers attempted to use keyworded subdomains as a method for improving search referral traffic, deploying many keyword subdomains for the terms they hoped to rank well for.
Not long ago, I wrote an article on how some local directory sites were using subdomains in an attempt to achieve good ranking results in search engines. In that article, I concluded that most of these sites were ranking well for other reasons not directly related to the presence of the keyword as a subdomain — I showed some examples of sites which ranked equally well or better in many cases where the keyword was a part of the URI as opposed to the subdomain. So, in Google, subdirectories were already functioning just as well as subdomains for the purposes of keyword rank optimization. (more…)
Posted by Chris of Silvery on 12/12/2007
Filed under: Best Practices, Content Optimization, Domain Names, Dynamic Sites, Google, Search Engine Optimization, SEO, Site Structure, URLs, Worst Practices. Tags: Domain Names, Google, host crowding, language seo, Search Engine Optimization, SEO, seo subdirectories, subdomain seo, subdomains
Dealer Locator & Store Locator Services Need to Optimize
My article on local SEO for store locators was just published on Search Engine Land, and any company that has a store locator utility ought to read it. Many large companies provide a way for users to find their local stores, dealers, or authorized resellers. The problem is that these sections are usually hidden from the search engines behind search submission forms, JavaScript links, HTML frames, and Flash interfaces.
For many national or regional chain stores, providing dealer-locator services with robust maps, driving directions and proximity search capability is outside of their core competencies, and they frequently choose to outsource that development work or purchase software to enable the service easily.
I did a quick survey and found a number of companies providing dealer locator or store finder functionality: (more…)
Posted by Chris of Silvery on 09/13/2007
Filed under: Best Practices, Content Optimization, Dynamic Sites, Local Search Optimization, Maps, Search Engine Optimization, SEO, Site Structure. Tags: chain-stores, dealer-locators, Local Search Optimization, local-SEO, store-location-software, store-locators
Double Your Trouble: Google Highlights Duplication Issues
Maile Ohye posted a great piece on Google Webmaster Central about the effects of duplicate content caused by common URL parameters. There is great information in that post, not least of which is that it validates exactly what a few of us have stated for a while: duplication should be addressed because it can water down your PageRank.
Maile suggests a few ways of addressing dupe content, and she also reveals a few details of Google’s workings that are interesting, including: (more…)
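One common piece of advice in this area is to normalize parameter-laden URLs to a single canonical form before they get linked or submitted in sitemaps. Here is a rough sketch of what that normalization might look like in Python; the list of throwaway parameters is an assumption and should be tailored to your own tracking setup.

```python
from urllib.parse import urlparse, parse_qsl, urlencode, urlunparse

# Parameters assumed to create duplicate URLs without changing the content.
THROWAWAY_PARAMS = {"sessionid", "sid", "utm_source", "utm_medium", "utm_campaign"}

def canonicalize(url):
    parts = urlparse(url)
    # Drop tracking/session parameters and sort the rest so equivalent URLs
    # collapse to a single canonical form.
    kept = sorted((k, v) for k, v in parse_qsl(parts.query)
                  if k.lower() not in THROWAWAY_PARAMS)
    return urlunparse(parts._replace(query=urlencode(kept)))

print(canonicalize("http://example.com/widgets?sessionid=abc123&utm_source=feed&color=red"))
# -> http://example.com/widgets?color=red
```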
Posted by Chris of Silvery on 09/12/2007
Filed under: Best Practices, Dynamic Sites, Google, PageRank, Search Engine Optimization, SEO, Site Structure, URLs. Tags: Canonicalization, duplicate-content, duplication, Google, Search Engine Optimization, SEO
Automatic Search Engine Optimization through GravityStream
I’ve had a lot of questions about my new work since I joined Netconcepts a little over three months ago as Lead Strategist for their GravityStream product/service. My primary role is to bring SEO guidance to clients using GravityStream and to provide thought leadership for the ongoing development of the product and business.
GravityStream is a technical solution that provides outsourced search optimization to large, dynamic websites. Automatic SEO, if you will. Here’s what it does…
Posted by Chris of Silvery on 07/17/2007
Filed under: Content Optimization, Dynamic Sites, HTML Optimization, Search Engine Optimization, SEO, Site Structure, Tools. Tags: Automatic-Search-Engine-Optimization, GravityStream, Netconcepts, Outsourced-Search-Engine-Optimization, Search Engine Optimization, SEO
Subdomains for Local Directory Sites?
Earlier this week, my column on “Domaining & Subdomaining in the Local Space – Part 1” went live at Search Engine Land. In it, I examine how a number of local business directory sites are using subdomains with the apparent aim of getting extra keyword ranking value from them. Typically, they place city names in the third-level domain names (aka “subdomains”). Some sites doing this include:
- CitySearch
- Craigslist
- Local.com
In that installment, I conclude that subdomaining for the sake of keyword ranking has no real benefit.
This assertion can be extended to all other types of sites as well, since the ranking criteria the search engines use are not limited to local info sites. Keywords in subdomains really have no major benefit.
SEO firms used to suggest that people deploy their content onto “microsites” for all their keywords, with a different domain name to target each one. This just isn’t a good strategy. Focus instead on improving the quality of the content for each keyword, each on its own page, and work on your link building (quality link building, not unqualified, low-quality links). Tons of keyword domains or subdomains are no quick solution for ranking well.
Posted by Chris of Silvery on 04/26/2007
Filed under: Local Search Optimization, Search Engine Optimization, SEO, Site Structure, URLs. Tags: Domain Names, Local Search Optimization, local-search-engine-optimization, local-SEO, SEO, subdomaining, subdomains
Podcasts of Neil Patel, Eric Ward, and Vanessa Fox
I’ve been interviewing speakers from the AMA’s Hot Topic: Search Engine Marketing events taking place April 20th in San Francisco, May 25th in NYC, and June 22nd in Chicago (all three of which I will be chairing). I had fascinating and insightful conversations with link builder extraordinaire Eric Ward, Googler Vanessa Fox, and social media marketing guru Neil Patel. There’s some real gold in those interviews.
Download/Listen:
- Neil Patel interview (15 minute MP3, 3 megs) – getting to the front page of Digg and other social media sites
- Eric Ward interview (36 minute MP3, 8 megs) – tips and secrets on how to garner links
- Vanessa Fox interview (40 minute MP3, 9 megs) – Google’s webmaster tools, SEO impacts of AJAX, Flash, duplicate content, redirects, etc.
More podcasts are coming from other speakers, so be sure to subscribe to the RSS feed so you don’t miss them. Also be sure to register for the conference in one of the three cities; it’ll be great!
Posted by stephan of stephan on 03/28/2007
Filed under: Google, Link Building, SEO, Site Structure, Social Media Optimization. Tags: digg, Google, Link Building, link-baiting, podcasts, Social Media Optimization, social-media-marketing, Webmaster-Central
Dupe Content Penalty a Myth, but Negative Effects Are Not
I was interested to read a column by Jill Whalen this past week on “The Duplicate Content Penalty Myth” at Search Engine Land. While I agree with her assessment that there really isn’t a Duplicate Content Penalty per se, I think she perhaps failed to address one major issue affecting websites in relation to this.
Read on to see what I mean.
Posted by Chris of Silvery on 03/18/2007
Filed under: Best Practices, Content Optimization, Search Engine Optimization, SEO, Site Structure, Spiders, URLs. Tags: duplicate-content, Duplicate-Content-Penalization, Jill-Whalen, Search Engine Optimization, SEO, URL-Optimization