Natural Search Blog


Brave New Future of SEO & SEM? Marketing thru Second Life

I came to hear about Second Life after reading about it in a blog by Greg Sterling, former editor and director of the Kelsey Group’s Interactive Local Media program. If you haven’t heard of it, this article gives a good introduction. Second Life is basically a virtual reality (“VR”) platform (or, “world”, or “metaverse” if you will). People go in there, buy VR property or other objects, and interact with thousands of other participants.

Second Life

“So what?” you might say. So, how does this differ from World of Warcraft, the Sims Online, or EverQuest?

First, there’s no goal to Second Life, per se — users just go into the thing, hang out, and interact with other users and the virtual environments. Second, users can own property in this world, and they can sell the property for real money in the real world! (Contrast with EverQuest, where they’ve actively worked to keep people from selling characters on eBay and such.) Finally, people have begun marketing through this new medium — like gangbusters!

Second Life interactive scene
Copyright 2006, Linden Research, Inc. All Rights Reserved.

People like Anshe Chung are now making hundreds of thousands of dollars per year by designing and selling SL virtual real estate. Others offer architecture, event planning, artwork sales, scripting, and even financial or legal services. Some universities are now teaching within the space, too!

(more…)

Hey Digg! Fix your domain name for better SEO traffic!

Hey, Digg.com team! Are you aware that your domain names aren’t properly canonicalized? You may be losing out on good ranking value in Google and Yahoo because of this!

Even if you’re not part of the Digg technical team, this same sort of scenario could be affecting your site’s rankings. This aspect of SEO is pretty simple to address, so don’t ignore it and miss out on PageRank that should be yours. Read on for a simple explanation.
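To illustrate the general idea (not Digg’s actual setup), here’s a minimal sketch in Python of hostname canonicalization: requests to a non-canonical hostname get a permanent 301 redirect to the same path on the single canonical hostname, so inbound links consolidate their ranking value in one place. The hostnames and the canonical_redirect helper here are hypothetical.

CANONICAL_HOST = "www.digg.com"   # hypothetical choice of canonical hostname
ALIAS_HOSTS = {"digg.com"}        # hosts that should 301 to the canonical one

def canonical_redirect(host, path):
    """Return (status, location) if a redirect is needed, else None."""
    if host in ALIAS_HOSTS:
        return 301, "http://%s%s" % (CANONICAL_HOST, path)
    return None

print(canonical_redirect("digg.com", "/users/example"))      # (301, 'http://www.digg.com/users/example')
print(canonical_redirect("www.digg.com", "/users/example"))  # None: already canonical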

(more…)

Tips for Local Search Engine Optimization for Your Site

Increasingly, businesses are becoming aware of Local Search, and of how optimizing for this channel is vital for those with local outlets. Each of the main search engines has focused effort on its local search tools as a key strategy for continuing growth in online advertising, and the subject has become important enough to merit a special Search Engine Strategies Conference devoted to it tomorrow in Denver. The importance of Local Search is further underscored by stats issued in a press release today by comScore, showing that Local Search continues to gain market share.

So, how exactly could one optimize towards Local Search?

Read on and I’ll outline a few key tips.
(more…)

Using Flickr for Search Engine Optimization

I’ve previously blogged about optimization for Image Search. But images can also be used to optimize for regular web search. Where online promotion is concerned, this appears to be an advantage that remains largely untapped. Many pros focus most of their optimization efforts on the more popular web search results, and don’t realize that optimizing for image search can translate into good overall SEO.

Flickr is one of the most popular image sharing sites in the world, with loads of features that also make it qualify as a social networking site. Flickr’s popularity, structure and features also make it an ideal vehicle for search engine optimization. So, how can image search optimization be done through Flickr? Read on, and I’ll outline some key steps to take. (more…)

Putting Keywords in Your URLs

Recently Matt Cutts blogged that:

doing the query [site:windowslivewriter.spaces.live.com] returns some urls like windowslivewriter.spaces.live.com /Blog/cns!D85741BB5E0BE8AA!174.entry . In general, urls like that sometimes look like session IDs to search engines. Most bloggy sites tend to have words from the title of a post in the url; having keywords from the post title in the url also can help search engines judge the quality of a page.

He then clarified his statement above, in the comments of that post:

Tim, including the keyword in the url just gives another chance for that keyword to match the user’s query in some way. That’s the way I’d put it.

What does this mean? It means that from Google’s perspective, keywords in your URLs are a useful thing to have. It’s another “signal” and can provide ranking benefits.

How should you separate these keywords? Not with underscores, that’s for sure. Matt Cutts has previously gone on the record to say that Google does not treat underscores as word separators. Use hyphens instead. Or plus signs would be okay too.

Also, I’d avoid too many hyphens in the URL, as that can look spammy. Try to keep it to three or fewer. Unless your site is powered by WordPress, in which case Google probably makes an exception for that, given how popular it is and how many legitimate bloggers have loads of hyphens in their permalink URLs. By the way, you can trim those down using the Slug Trimmer plugin for WordPress.
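To illustrate that advice, here’s a rough sketch (not any particular blog platform’s code) of how a post title might be boiled down to a short, hyphen-separated URL slug. The make_slug helper, its stop-word list, and the four-word limit are all hypothetical choices.

import re

STOP_WORDS = {"a", "an", "and", "the", "of", "in", "to", "for", "on"}

def make_slug(title, max_words=4):
    """Turn a post title into a short, hyphen-separated slug of keywords."""
    words = re.findall(r"[a-z0-9]+", title.lower())
    keywords = [w for w in words if w not in STOP_WORDS]
    return "-".join(keywords[:max_words])

print(make_slug("Putting Keywords in Your URLs"))   # putting-keywords-your-urls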


Robots Meta Tag Not Well Documented by Search Engines

Those of us who do SEO have been increasingly pleased with the various search engines for providing tools and protocols that help us direct, control, and manage how our sites are indexed. However, the search engines still need to keep much of their workings secret, out of fear of being exploited by ruthless black-hats who will seek to improve page rankings for keywords regardless of appropriateness. This often leaves the rest of us with tools that can be used in some limited cases, but with little or no documentation to tell us how those tools actually behave in the complex real world. The Robots META tag is a case in point.

The idea behind the protocol was simple and convenient. It’s sometimes hard to use a robots.txt file to manage all the types of pages delivered up by large, dynamic sites. So, what could be better than using a tag directly on a page to tell the search engine whether to spider and index the page or not? Here’s how the tag should look if you wanted a page to NOT be indexed, and for links found on it to NOT be crawled:

<meta name="ROBOTS" content="noindex,nofollow">

Alternatively, here’s the tag if you wanted to expressly tell the bot to index the page and crawl the links on it:

<meta name="ROBOTS" content="index,follow">

But, what if you wanted the page to not be indexed, while you still wanted the links to be spidered? Or, what if you needed the page indexed, but the links not followed? The major search engines don’t clearly describe how they treat these combinations, and the effects may not be what you’d otherwise expect. Read on and I’ll explain how using this simple protocol with the odd combos had some undesirable effects.
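For reference, here’s a tiny sketch of how the two directives combine, including the two odd combinations discussed above. This is purely my own illustration using a hypothetical robots_meta helper, not anything the engines publish.

def robots_meta(index=True, follow=True):
    """Assemble a ROBOTS meta tag from the two independent directives."""
    content = ("index" if index else "noindex") + "," + ("follow" if follow else "nofollow")
    return '<meta name="ROBOTS" content="%s">' % content

print(robots_meta(index=False, follow=True))   # noindex,follow: don't index the page, but do crawl its links
print(robots_meta(index=True, follow=False))   # index,nofollow: index the page, but don't crawl its links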

(more…)

Yahoo update beefs up on authority sites

Aaron Wall posted a blog about how Yahoo!’s recent algorithm update has apparently increased weighting factors for links and authority sites.

Predictably, a number of folx have complained in the comments added to Yahoo’s “Weather Report” blog about the update. Jeremy Zawodny subsequently posted that their search team was paying close attention to the comments, which is always nice to hear.

Coincidentally, I’d also just recently posted about Google’s apparent use of page text to help identify a site’s overall authoritativeness for particular keywords/themes.

As they say, there’s nothing really new under the sun. I wonder if the search engines are all returning to the trend of authority/hub focus in algorithm development? It’s a strong concept and useful for ranking results, so the methodology for identifying authorities and hubs is likely here to stay.

New WordPress Plugin for SEO

I’ve just released “SEO Title Tag”, a plugin for WordPress. As the name implies, it allows you to optimize your WordPress site’s title tags in ways not supported by the default WordPress installation. For example:

Get the plugin now: SEO Title Tag WordPress Plugin

I’d love your feedback, as this is my first WordPress plugin.

Enjoy!

Google Sitemaps Reveal Some of the Black Box

I earlier mentioned the recent Sitemaps upgrades which were announced in June, and how I thought these were useful for webmasters. But, the Sitemaps tools may also be useful in other ways beyond the obvious/intended ones.

The information that Google has made available in Sitemaps is providing a cool bit of intel on yet another one of the 200+ parameters or “signals” that they’re using to rank pages for SERPs.

For reference, check out the Page Analysis Statistics that are provided in Sitemaps for my “Acme” products and services experimental site:

Google Sitemaps Page Analysis

It seems unlikely to me that these stats on “Common Words” found “In your site’s content” were generated just for the sake of providing nice tools for us in Sitemaps. No, the more likely scenario would seem to be that Google was already collating the most-common words found on your site for their own uses, and then they later chose to provide some of these stats to us in Sitemaps.

This is significant, because we already knew that Google tracks keyword content for each page in order to assess its relevancy for search queries on those terms. But why would Google be tracking your most-common keywords in a site-wide context?

One good explanation presents itself: Google might be tracking common terms used throughout a site in order to assess if that site should be considered authoritative for particular keywords or thematic categories.

Early on, algorithmic researchers such as Jon Kleinberg worked on methods by which “authoritative” sites and “hubs” could be identified. IBM and others did further research on authority/hub identification, and I heard engineers from Teoma speak on the importance of these approaches a few times at SES conferences when explaining the ExpertRank system their algorithms were based upon.

So, it’s not all that surprising that Google may be trying to use commonly-occurring text to help identify authoritative sites for various themes. This would be one good automated method for classifying sites for subject matter categories and keywords.

The take-away concept is that Google may be using words found in the visible text throughout your site to assess whether you’re authoritative for particular themes or not.
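To make the idea concrete, here’s a rough sketch of that kind of site-wide term aggregation: count word frequencies across the visible text of every page on a site and treat the most common terms as candidate themes. This is purely illustrative; the site_theme_terms helper, the stop-word list, and the sample pages are hypothetical, and it is certainly not Google’s actual method.

import re
from collections import Counter

STOP_WORDS = {"the", "and", "a", "an", "of", "to", "in", "for", "is", "on", "with"}

def site_theme_terms(pages, top_n=10):
    """Aggregate word counts across a whole site's visible text and return
    the most common terms, a stand-in for the 'Common Words' stat in Sitemaps."""
    counts = Counter()
    for text in pages:   # one string of visible text per page
        words = re.findall(r"[a-z]+", text.lower())
        counts.update(w for w in words if w not in STOP_WORDS)
    return counts.most_common(top_n)

pages = [
    "Acme widgets and widget repair services",
    "Contact Acme for custom widget design",
]
print(site_theme_terms(pages, top_n=3))   # e.g. [('acme', 2), ('widget', 2), ('widgets', 1)]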

 

Google Sitemaps upgrades help webmasters

The Google Sitemaps team just last week announced a number of changes on their blog.

I was really happy and excited that they appear to’ve done a few of the things I suggested in a post on the Google Sitemaps Group.

They did the following things I had suggested:

There were some additional things they did which are also interesting:

I’m sure other folx must’ve requested some of the same things I’d suggested, and Google’s good at providing useful features, but it’s really gratifying to see some of the changes I’d wanted showing up now!

Stay tuned for a follow-up posting from me about some of these changes. Some of these new features actually provide some great intel on parameters/methods that Google uses to rank pages.

