Natural Search Blog


SEO May Be Eclipsed by User-Centered Design

I’ve been seeing indications that Google has shifted the weighting of the ~200 signals in their ranking soup over the past couple of years. PageRank and the number of keyword references on a page used to be among the strongest signals determining which pages come up highest in the search results, but I’ve seen more and more cases where PageRank and keyword density appear weaker than they once were. There is good reason to believe that quality ratings now carry more weight in rankings, particularly for the more popular search keywords. Google continues to lead the pack in the search marketplace, so their evolution will likely push their competitors in similar directions, too.

So, what is my evidence that Google’s development of Quality criteria is becoming more influential in their rankings than PageRank and other classic optimization elements? Read on and I’ll explain. (more…)

Nouveau Meta Tags for SEO

Back in the earliest days of search optimization, meta tags were a great channel for placing keywords for the search engines to associate with your pages. Meta tags do just what they sound like: they are the HTML tags built to hold metadata (or, “data describing the data”) about a page. In terms of SEO, the main meta tags people refer to are the Keywords and Description meta tags. Meta tags are not visible to end users looking at the page, but their content would be collected by search engines and used to rank a page, which was really convenient if you wanted to pass synonyms, misspellings, and various term stems along with the specific keywords.

Classic meta tags - people used to pack keywords into meta tags
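For readers who never saw the classic tags in the wild, here is a hypothetical example of the kind of head section described above (the site and keywords are invented for illustration; “widgits” shows the deliberate-misspelling trick):

```html
<head>
  <title>Acme Widgets</title>
  <!-- Hypothetical example: synonyms, stems, and a misspelling packed
       into the keywords tag, as was common practice -->
  <meta name="keywords" content="widgets, widget, widgits, cheap widgets, discount widgets">
  <meta name="description" content="Acme sells widgets of every size at discount prices.">
</head>
```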

Almost immediately after people realized that meta tags could make a page appear more relevant to the major search engines, unscrupulous operators began abusing the tags by passing keywords that had little or nothing to do with the content of their sites, and the search engines began discounting that content as a keyword-association ranking factor because it couldn’t be trusted. Eventually, the search engines dropped them from rankings almost entirely, and newer search engines didn’t bother to use them at all, leading Danny Sullivan to declare the death of the meta tags in 2002.

Fast forward to 2006, and the situation has changed yet again. Your meta tag content can once again directly affect your pages’ rankings in the SERPs!

(more…)

Brave New Future of SEO & SEM? Marketing thru Second Life

I first heard about Second Life from a blog post by Greg Sterling, former editor and director of the Kelsey Group’s Interactive Local Media program. If you haven’t heard of it, read about it in this article. Second Life is basically a virtual reality (“VR”) platform (or, “world”, or “metaverse” if you will). People go in, buy VR property or other objects, and interact with thousands of other participants.

Second Life

“So what?” you might say. So, how does this differ from World of Warcraft, the Sims Online, or EverQuest?

First, there’s no goal to Second Life, per se — users just go into the thing, hang out, and interact with other users and the virtual environments. Second, users can own property in this world, and they can sell the property for real money in the real world! (Contrast with EverQuest, where they’ve actively worked to keep people from selling characters on eBay and such.) Finally, people have begun marketing through this new media — like gangbusters!

Second Life interactive scene
Copyright 2006, Linden Research, Inc. All Rights Reserved.

People like Anshe Chung are now making hundreds of thousands of dollars per year by designing and selling virtual real estate in SL. Others offer services like architecture, event planning, artwork sales, scripting, and even financial or legal advice. Some universities are now teaching within the space, too!

(more…)

Hey Digg! Fix your domain name for better SEO traffic!

Hey, Digg.com team! Are you aware that your domain names aren’t properly canonicalized? You may be losing out on good ranking value in Google and Yahoo because of this!

Even if you’re not part of the Digg technical team, this same sort of scenario could be affecting your site’s rankings. This aspect of SEO is pretty simple to address, so don’t ignore it and miss out on PageRank that should be yours. Read on for a simple explanation.
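The usual fix for this class of problem is a server-side 301 (permanent) redirect that consolidates every hostname variant onto one preferred form. As a minimal sketch, assuming an Apache server with mod_rewrite enabled and www as the preferred hostname (the domain here just mirrors the example in the headline):

```apache
# Hypothetical .htaccess sketch: permanently redirect the bare domain
# to the www hostname so search engines see one canonical URL and
# consolidate PageRank onto it.
RewriteEngine On
RewriteCond %{HTTP_HOST} ^digg\.com$ [NC]
RewriteRule ^(.*)$ http://www.digg.com/$1 [R=301,L]
```

The same pattern works in reverse if you prefer the bare domain; the point is to pick one form and 301 everything else to it.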

(more…)

Tips for Local Search Engine Optimization for Your Site

Increasingly, businesses are becoming aware of Local Search and how vital optimizing for this channel is for those with local outlets. Each of the main search engines has focused effort on its local search tools as a key strategy for continued growth in online advertising, and the subject has become important enough to merit a special Search Engine Strategies conference devoted to it tomorrow in Denver. The importance of Local Search is further underscored by stats in a press release issued today by comScore, showing that Local Search continues to gain market share.

So, how exactly could one optimize towards Local Search?

Read on and I’ll outline a few key tips.
(more…)

Using Flickr for Search Engine Optimization

I’ve previously blogged about optimization for Image Search. But images can also be used to optimize for regular web search. Where online promotion is concerned, this appears to be an area of advantage that remains largely untapped. Many pros focus most of their optimization efforts on the more popular web search results and don’t realize that optimizing for image search can translate into good overall SEO.
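The basic markup moves are the same whether the image lives on your own site or a sharing service. A hypothetical example (the filename and photo are invented for illustration) of giving an image the keyword context that engines can read:

```html
<!-- Hypothetical example: a descriptive filename plus alt and title
     text give search engines keyword context for both image search
     and regular web search -->
<img src="red-tailed-hawk-in-flight.jpg"
     alt="Red-tailed hawk in flight over Austin, Texas"
     title="Red-tailed hawk in flight">
```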

Flickr is one of the most popular image sharing sites in the world, with loads of features that also make it qualify as a social networking site. Flickr’s popularity, structure and features also make it an ideal vehicle for search engine optimization. So, how can image search optimization be done through Flickr? Read on, and I’ll outline some key steps to take. (more…)

Robots Meta Tag Not Well Documented by Search Engines

Those of us who do SEO have been increasingly pleased that the various search engines provide tools and protocols that let us direct, control, and manage how our sites are indexed. However, the search engines still need to keep much of their workings secret, for fear of exploitation by ruthless black-hats who will seek to improve page rankings for keywords regardless of appropriateness. This often leaves the rest of us with tools that work in some limited cases but come with little or no documentation on how they operate in the complex real world. The Robots META tag is a case in point.

The idea behind the protocol was simple and convenient. It’s sometimes hard to use a robots.txt file to manage all the types of pages delivered by large, dynamic sites. So, what could be better than a tag directly on a page that tells the search engine whether to spider and index the page or not? Here’s how the tag should look if you wanted a page to NOT be indexed, and links found on it to NOT be crawled:

<meta content="noindex,nofollow" name="ROBOTS">

Alternatively, here’s the tag if you wanted to expressly tell the bot to index the page and crawl the links on it:

<meta content="index,follow" name="ROBOTS">

But, what if you wanted the page to not be indexed, while you still wanted the links to be spidered? Or, what if you needed the page indexed, but the links not followed? The major search engines don’t clearly describe how they treat these combinations, and the effects may not be what you’d otherwise expect. Read on and I’ll explain how using this simple protocol with the odd combos had some undesirable effects.
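For reference, the two “odd” combinations discussed above would be written like this:

```html
<!-- Keep the page out of the index, but crawl the links on it -->
<meta content="noindex,follow" name="ROBOTS">

<!-- Index the page, but do not follow its links -->
<meta content="index,nofollow" name="ROBOTS">
```

The syntax is valid per the protocol; it’s how each engine actually honors these mixed directives that is poorly documented.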

(more…)

Sneak Peek: Chasing The Long Tail of Natural Search

Phew! After 7 long months of slogging away, we will finally release the long-awaited white paper “Chasing the Long Tail of Natural Search” next Monday (Aug 7th) at SES San Jose and the Etail Philadelphia show.

One is always a little cautious about putting grand theories out into the wide world. But after studying over 1 million unique unbranded keywords across 25 major retailer search programs, we couldn’t resist. We refer to the concept we outline as “Page Yield Theory”: the underlying notion that a site’s “long tail” of unbranded search keyword traffic is inextricably linked to its number of uniquely indexable pages. To those of us who subscribe to the “every-page-should-sing-its-own-song” philosophy, that seems like an obvious statement.

Yet the challenge behind it, and the impetus for the research, arose from the fact that many (unoptimized) well-branded multichannel retailers have tens or even hundreds of thousands of unique, indexed website pages. However, most of their natural search traffic (usually over 90%) comes from searches related to their own company name. How could such strong brands and massive websites produce so little traffic for generic terms, i.e. terms other than the company name?

(more…)

If you can’t do good design or good SEO… use witchcraft!

I just read this story on CNN today about firms that offer to optimize your website by applying principles of vaastu shastra and feng shui to increase usage.

Interesting idea: If you can’t do good engineering for usability, good graphic design, and good SEO to bring traffic to your site, use witchcraft!


The Long Tail and prioritizing your time on design and SEO

I am a big fan of the Long Tail, the term coined by Chris Anderson, Executive Editor of Wired Magazine, to refer to what happens in economics when the bottlenecks that stand between supply and demand in our culture start to disappear and everything becomes available to everyone.

In this article, I found it quite interesting that UIE applied the concept of the Long Tail to prioritizing where you spend the bulk of your time on design and usability. Sure, a few pages get a large chunk of traffic, such as the home page, but that doesn’t mean that’s where you should spend most of your design time. Instead, look at the buckets of pages that add up to a large chunk of your traffic. For example, if all of the articles on your site together account for a large share of your traffic, then you should spend a proportionate amount of your redesign time focusing on the articles template.

I think this same argument applies to search engine optimization (SEO) as well as to design. If your product pages account for 50% of your traffic, half of your SEO time should be spent on the product pages (rather than your articles, FAQs, etc.).
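As a back-of-the-envelope sketch of that proportional-allocation idea (the traffic numbers here are made up for illustration, not figures from the post):

```python
# Hypothetical traffic figures by page-template bucket.
traffic_by_bucket = {
    "home page": 12000,
    "product pages": 30000,
    "articles": 14000,
    "FAQs": 4000,
}

total = sum(traffic_by_bucket.values())

# Allocate a 40-hour design/SEO budget in proportion to each
# bucket's share of total traffic.
hours_available = 40
allocation = {
    bucket: round(hours_available * visits / total, 1)
    for bucket, visits in traffic_by_bucket.items()
}

for bucket, hours in allocation.items():
    print(f"{bucket}: {hours} hours")
```

With these numbers, product pages draw 50% of the traffic and so get 20 of the 40 hours, which is exactly the split argued for above.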

Spend your time on the tail!

