The blog of Affordable Web Designs Inc., offering tips and resources on web development, website design, marketing, and website promotion tactics.

Thursday, June 28, 2007

Google Spam Team Targets Real Estate Websites

By Marc Rasmussen (c) 2007

Originally published in SiteProNews, June 25, 2007

If you have any experience or background in SEO (search engine optimization), you know that Google likes websites that have links pointing to them. Part of its search algorithm gauges a site's popularity by the number of other sites linking to it. All else being equal, Google will rank a website with quality links pointing to it higher than a website with no links. If you read Google's Webmaster Guidelines (Webmaster Central), the very first bit of advice they give you is:

* have other relevant sites link to yours.

Google loves links and they admit it. Webmasters figured this out and came up with all kinds of techniques for getting links to their sites: link baiting, reciprocal links, ninja links, 3-way links, one-way links, contextual links, etc. The problem is that many of these techniques are frowned upon by Google. They want you to get links the natural way, not to cheat the system.

Their guidelines say:

* Don't participate in link schemes designed to increase your site's ranking or PageRank. In particular, avoid links to web spammers or "bad neighborhoods" on the web, as your own ranking may be affected adversely by those links.
* Avoid tricks intended to improve search engine rankings. A good rule of thumb is whether you'd feel comfortable explaining what you've done to a website that competes with you. Another useful test is to ask, "Does this help my users? Would I do this if search engines didn't exist?"

In one breath Google tells you to get links to your site, and in the next breath tells you not to participate in any link scheme designed to increase your site's ranking. How do you know what a link scheme is? What is acceptable? What is not? I guess you have to ask yourself, "Does this help my users?" It is definitely not a black-and-white situation, and it leaves plenty of room for interpretation.

Google has a web spam team headed by Matt Cutts. Its job is to enforce the Google Webmaster Guidelines and crack down on link spam. When Cutts talks, people listen; he frequently speaks at search engine conferences around the world.

I could not find a citation (I didn't look for long), but I have heard that Matt Cutts has said in the past that he does not like the reciprocal linking Realtors engage in. Supposedly he has mentioned this several times over the last few years but has not done anything about it.

Real estate agents trade links with other agents around the country and the world, arguing that "we refer business to each other, so it is a legitimate link exchange and not done for the sole purpose of pumping up our rankings." This may be true in some instances, but for the most part it is baloney. I would venture to say that 95% of the agents who trade links do so to achieve higher rankings in the results. That was the only reason I did it. What was the likelihood that a visitor to my Sarasota, Florida website was there to find an agent in Albuquerque, New Mexico? The only reason I ever traded a link with an agent outside Florida was to get to number 1 in Google.

In the beginning, when I knew absolutely nothing, I was told to trade links with anyone and everyone. I did that for a while. Then someone told me to trade only with people in the real estate industry. So I changed it up and did that. The link exchanging worked. I wasn't number 1, but I was on the first page for a couple of key phrases.

Fast forward a little bit: I changed to a more search-engine-friendly website, traded links sporadically, added plenty of original content, and kept a frequently updated blog. My overall goal was to become an authority site and build a valuable, useful website for people looking for anything real estate related in Sarasota, Florida. I simply wanted to make a website better than my competitors'. Eventually, I became number 1 for the most sought-after key phrases in my market and was on the first page for hundreds of phrases. I got there by having a better website than most. That is a great thing about the Google search engine: it generally does the best job of providing relevant results.

It dawned on me that reciprocal linking with hundreds of agents around the country was pretty useless to my visitors, so I stopped doing it about a year or so ago. I did a couple of contextual link exchanges with other agents in Florida; I figured those would be useful to someone looking for real estate in Florida who wasn't sure what part of the state to look in.

I had stopped paying attention to what was going on in the SEO world for a while, as I get paid to serve buyers and sellers of real estate, not to update my website. I continued to post in my blog, but that was the extent of it. So I was out of the loop for six months or so.

A shot across the bow:

Advanced Access is an enormous real estate website provider with around 30,000 Realtor websites. I am not up to speed on the details, but apparently many or all of the Advanced Access users were penalized by Yahoo. Many of the AA users would trade links with each other, and Yahoo possibly viewed the whole network as one big incestuous link farm.

Advanced Access hired Greg Boser, a search engine guru, to solve their problems with Yahoo. His recommendation to AA users was to delete all state pages containing excessive reciprocal links, which he viewed as useless spam. Not all of the AA users took his advice. Many customers resisted because of the hundreds of man-hours they had put into accumulating the links. Others were probably ranking well and did not want to rock the boat.

In April of 2007, many high-ranking Advanced Access websites were suddenly missing from the Google search results. Was it an algorithm change? Probably not, since the AA websites were the only ones affected. Ultimately, the consensus was that the AA websites were manually given a penalty by Google. Of course, no one was 100% sure, because you can't call Google on the phone or email them for a definitive answer. Why would Google focus on only one website provider? Why the largest provider of real estate websites? Were they penalized for excessive reciprocal links? Probably. Evidently, Matt Cutts had previously made warnings about it. Did the well-connected internet guru, Greg Boser, have anything to do with the penalty? I don't know. Some people believe that. He supposedly knows Matt Cutts.

[ed. note I: Greg does know Matt Cutts. I do too. Greg most certainly could NOT ever get Matt to interfere for commercial reasons. Matt doesn't do that sort of thing, nor does Google. - jh]

In the Advanced Access forums, one participant noticed that some AA websites were penalized while others were not. Almost all of the sites had state pages and excessive reciprocal links. Why were some chosen and others spared? How did Google determine which websites to hit, and why? The forum participant noticed that just about everyone who voiced their opinions, beliefs, and thoughts in the forums was hit with the penalties, while the site owners who were not active went penalty-free. Did Google hand-pick the loudest website owners?

In May, just over 30 days later, most of the AA sites had their penalties dropped and were found again in Google.

Real Estate Webmasters is another large and very visible real estate website provider. Many of their customers' sites rank exceptionally well in Google. They also host a very active forum for webmasters. If someone (or Google) wanted to send a message through the real estate community, this is a great place to do it. I found it interesting that Greg Boser, the well-connected internet guru, became very active in the Real Estate Webmasters forum on April 27th, just a few weeks before many of their websites were hit with a penalty.

On May 9th, a number of high-ranking Real Estate Webmasters websites were suddenly missing from the Google search results. Sites from all over the country had been hit with a Google penalty. I own one of those sites and was probably penalized for state pages and reciprocal linking. The damn thing is that I deleted all of my state pages and links on the 4th of May. Google most likely pinpointed me as a violator before the 4th, and it took until the 9th to apply the penalty.

It is interesting that Google would hit customers of another large and potentially loud real estate website provider with an active forum. Is Google trying to send a message to the real estate community about excessive reciprocal links? Maybe. Probably. Greg Boser became active in the forums a couple of weeks prior to the penalty. Did he have something to do with it? I don't know. Possibly.

[ed. note II: Again, probably not. We will try to reach Greg for comment; however, we strongly doubt he was involved in having listings removed. It is possible he was warned or received information regarding the penalty, but it is extremely unlikely he was directly involved in its planning or execution. - jh]

Here we are 44 days later, and almost all of the Real Estate Webmasters sites are still penalized. We have all deleted our state pages and reciprocal links, removed the URLs through Google Webmaster Tools, and asked forgiveness via a re-inclusion request.

Is Google waiting for us to spread the word throughout the Realtor community about the no-no's of reciprocal linking? I don't know. Is it fair that a small group of us were targeted while others continue to fill their websites with spammy reciprocal links? No, but life is not fair.

Rand Fishkin at www.seomoz.org recently interviewed Matt Cutts. You can watch the interview here: http://www.seomoz.org/blog/the-smx-diaries-iv-the-matt-cutts-interview. Watch the 2nd video and fast-forward to around the 5:30 mark. Matt acknowledges the Real Estate Webmasters thread discussing the recent penalties. He also mentions the "shot across the bow" technique of policing.

Isn't everyone employed by Google a genius? Couldn't they have come up with a better system for policing the real estate industry than hurting a few mom-and-pop Realtors?

Lessons Learned:

* Don't try to game the Google search engine. They will eventually figure you out.
* Don't trade links with hundreds of Realtors around the country. Build links only for the benefit of your users, not your search engine rankings.
* Build a site with tons of unique content, provide lots of value to visitors, and links will come with time. I have received several one-way links from the Sarasota Herald Tribune (my local newspaper) from articles they wrote about my website and blog. I imagine this is the way Google likes to see links built.
* Google isn't perfect.
* If you rely on Google search results to feed you and your family you better have some money stashed away in the event you get penalized. Fortunately, I did this.
* If you do make money off the web keep at least one eye on what is happening in the search engine world to stay competitive.
* I rank fairly well in MSN and really don't get that much traffic from it.
* Ranking very well in Google will bring tons of traffic to your website.
* Being penalized by Google sucks.

Marc Rasmussen is a Realtor in Sarasota, Florida. He publishes The Sarasota MLS website.


Increase Search Engine Traffic

Every web site owner wants to increase search engine traffic. It's free, and the visitor is targeted to your subject matter, product, or service. What more could you ask for in search engine traffic? The downside is that you need to understand search engine ranking methods, and that is quite a challenge for many.

The number of opinions and "experts" on ways to increase search engine traffic is overwhelming. And regardless of what anyone might tell you, they're all guessing. The search engines themselves don't divulge how their methods work, for one simple reason: as soon as anyone figures out the method, there's a mad rush to implement changes based on it.

In a perfect world, where there were no scoundrels, this might not be a factor. Everyone would organize their web site information so that a visitor could easily find what they are looking for, and life would be good. But we certainly don't live in a perfect world and scoundrels are everywhere.

So we are at the mercy of the search engines to help us sort through the clutter to find what we want. And that's the value the search engines provide: accurate and meaningful results related to the search terms or phrases. So it goes back to the quality of content; that's the only factor common to all three major search engines' ranking methods.

Each of the big three search engines (Google, Yahoo, and MSN) uses a slightly different method and technology to arrive at any given web site's ranking under specific search terms. As mentioned above, no one knows exactly how each method works. But you can test different strategies and methods to see how they impact your rank.

And therein lies the only true method of determining what the search engines might look for when ranking your web page on specific search terms. Most of us are aware that most processes can be expressed as mathematical equations. I'm not sure that's the best approach to search engine ranking, but it is the most popular one.

And consider the fact that when someone has determined (or thinks they have determined) one of the factors used in search engine rankings, they beat it to death. Every aspect discovered in the past few years has been exploited immediately, to the point of the search engines abandoning the tactic. As soon as the search engines see that someone can beat the system, they change it.

That's one of the big reasons you see constant change in ranking methods. Since there is a tremendous amount of revenue at stake for all concerned, i.e. sales of products and services for the web site owners, plus the advertising revenue for the search engines, any advantage is huge.

If you had tested and tracked all the changes in search engine ranking methods over the past few years, one constant factor would stand out. It is also one of the most misunderstood and often overlooked elements in ranking for a specific term.

So what's the one thing the search engines can't change about their mathematical algorithms? You got it: CONTENT! The search engines can play with the process, methods, or means used to judge web site content. But if web site owners stick to the basic philosophy of providing meaningful content in relation to the search term, the impact of changes is far smaller, if felt at all.

And that is where many web site owners go wrong in pursuing good search engine rankings. Many jump on the bandwagon with every new revelation in search engine strategy based on the latest changes. It's the old forest-and-trees scenario, Internet style. Even if you get a slight advantage from all these "new" tactics, it will be short-lived. As soon as the search engines catch on that you are working the system, they will change the system.

So the best way to increase your rankings for a particular search term is to provide meaningful information or content based on that search term. Here are a few guidelines I've found that help:

* Make sure you focus on the subject matter (don't try to satisfy too many terms with one web page).
* Get inside the searcher's head – figure out what they want and give it to them.
* Be specific and provide details – don't generalize; be descriptive.
* For a sales page, use benefits and features to fully explain the problem and the solution.
* Update your information often – set up a schedule to update and add more content.

There are many sub-factors that impact how well the search engine bots can determine the value of your content. The bots are software programs that "read" your web page and report back the information used to rank it based on what they discovered.


You'll find many "experts" who will give you a hard list of items and rules for how to present this information. And I don't disagree with suggestions to include the search term in key areas of your web page, like the title, description, and heading tags such as H1. That helps the search engine bots determine the content.

But I've seen web pages with no meta tags, title, or description that had rock-solid content and still ranked highly for a particular search term. So make it easy for the search engines, but always remember that content is the single consistent factor in search engine rankings.
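
If you want to sanity-check those key areas on your own pages, the test is easy to script. Below is a minimal Python sketch (the class and function names, and the sample page, are mine for illustration) that uses the standard library's html.parser to report whether the title, meta description, and H1 contain a target phrase:

```python
from html.parser import HTMLParser

class OnPageChecker(HTMLParser):
    """Collects the <title>, meta description, and <h1> text from a page."""

    def __init__(self):
        super().__init__()
        self._current = None  # tag whose text we are currently collecting
        self.title = ""
        self.description = ""
        self.h1 = ""

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag in ("title", "h1"):
            self._current = tag
        elif tag == "meta" and (attrs.get("name") or "").lower() == "description":
            self.description = attrs.get("content") or ""

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current == "h1":
            self.h1 += data

def check_phrase(html_text, phrase):
    """Report which key on-page elements contain the target phrase."""
    checker = OnPageChecker()
    checker.feed(html_text)
    needle = phrase.lower()
    return {
        "title": needle in checker.title.lower(),
        "meta description": needle in checker.description.lower(),
        "h1": needle in checker.h1.lower(),
    }

# Hypothetical sample page, just to show the output format.
sample = """<html><head><title>Sarasota Real Estate Guide</title>
<meta name="description" content="Listings and advice on Sarasota real estate.">
</head><body><h1>Sarasota Real Estate</h1><p>...</p></body></html>"""

print(check_phrase(sample, "sarasota real estate"))
# -> {'title': True, 'meta description': True, 'h1': True}
```

Feeding your real pages through something like this is a quick way to confirm the basics are in place before worrying about anything fancier.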

There are a lot of different aspects to consider when trying to improve your search engine ranking and increase search engine traffic: keyword and phrase research, interpreting the search term or phrase to provide the best result, reviewing other popular web sites with the same intentions, linking to web sites with similar content, and more. How you present the information and content is also an important issue.


But all those factors come after good content. So if you start by providing the best content, you can't go wrong. After all, this is both an art and a science, not to mention a moving target. If you would like to learn more, please visit our other article links below.

About the Author: Visit jd WebWorks to see some recent case studies and get better results with SEO web site design.

Checking Supplemental Index Status for URLs in Large Sites

For sites with fewer than 1000 pages, it's possible (if monotonous) to see which URLs are in Google's Supplemental Index. Simply run a site: query for your domain and scroll through the results pages until you start to see "Supplemental Result" next to some of the URLs.

But what if your site has 50,000 pages and the supplemental results don't start until the final 10,000? Even the fairly common site:domain.com *** -view query isn't totally accurate, and it's still subject to the 1000 URL display limit.

Depending on which case you find yourself in, it can be either tedious or impossible to detect whether a specific URL is Supplemental.

Using our blog site as an example, suppose I suspect -- but can't confirm -- that an old post about Yahoo Sitemaps is in the Supplemental Index. A simple info: query doesn't tell you whether the URL is supplemental or not. For example, this query returns the page's listing with no Supplemental label:

[info:http://seoblog.intrapromote.com/2006/11/an_update_on_ya.html]

Instead, a quick way to check Supplemental status is to pull a unique string from the URL in question (such as a folder or filename) and tack it onto an inurl:-filtered site: query. In other words, I built the following query by adding the filename (minus its extension) to the inurl: command:

[inurl:an_update_on_ya site:seoblog.intrapromote.com]

In the result of that query, the Supplemental Index status appears next to the listing.

The bottom line is to find an inurl: string that filters the site: query results down so that your specific URL shows up quickly.
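
If you have many URLs to check, building these queries by hand gets old fast. Here is a minimal Python sketch (the helper name is mine) that derives the inurl: string from a URL's filename, as described above; you still paste the resulting query into Google yourself:

```python
from urllib.parse import urlparse

def supplemental_check_query(url):
    """Build an inurl:-filtered site: query that narrows Google's results
    to one URL, so any "Supplemental Result" label is easy to spot."""
    parsed = urlparse(url)
    # Use the filename (minus its extension) as the unique inurl: string.
    filename = parsed.path.rstrip("/").rsplit("/", 1)[-1]
    stem = filename.rsplit(".", 1)[0] if "." in filename else filename
    return f"inurl:{stem} site:{parsed.netloc}"

# Reproduces the query used above for the Yahoo Sitemaps post.
print(supplemental_check_query(
    "http://seoblog.intrapromote.com/2006/11/an_update_on_ya.html"))
# -> inurl:an_update_on_ya site:seoblog.intrapromote.com
```
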
posted by Erik Dafforn at June 26, 2007 08:50 AM


Judge rejects Google's anti-Microsoft antitrust bid

Posted by Anne Broache

Editor's note: This story was updated at 10:38 a.m. PST

WASHINGTON--A federal judge refused on Tuesday to rule on a last-minute Google antitrust complaint about Windows Vista's desktop search, saying she trusted government attorneys who said they were already satisfied with Microsoft's planned changes.

U.S. District Judge Colleen Kollar-Kotelly said she would rely on the U.S. Department of Justice and state attorneys to alert her if any further action is needed to address antitrust allegations lodged on Monday by Google that the search function still won't allow for adequate "user choice."

"The plaintiffs, as far as I'm concerned, stand in the shoes of the consumer," Kollar-Kotelly said at a periodic status conference here. She added that Google "is not a party to the case."
Last week, Microsoft and its government antitrust overseers outlined a number of steps Redmond had agreed to take to address Google's ongoing concerns. Those moves included, among other things, adding a mechanism that would allow both computer makers and individuals to choose a default desktop search program--much as they can choose a rival browser or media player.

But Google argued in a filing on the eve of the already scheduled court hearing that it wasn't convinced those tweaks went far enough. The search giant also asked the judge to consider extending the November 12 expiration date for certain parts of the consent decree to ensure Microsoft was truly complying with an antitrust agreement dating back to 2002.

Kollar-Kotelly said she plans to rule on whether Google was allowed to file the seven-page brief it submitted to the court on Monday (Microsoft has opposed the filing as procedurally out-of-bounds). But she said she was "not going to take any position on it or comment on" the brief's content.

The judge did suggest, however, that perhaps Google lacks complete information about the proposed Vista changes. She said she expected the government attorneys and Microsoft to supply a fuller description than that revealed in their most recent joint court filing.

Google said after Tuesday's hearing that it was encouraged by what it viewed as the judge's sensitivity to its request for more information.

"As a result of our raising concerns about Vista desktop search, the Department of Justice and the states secured remedies from Microsoft that will provide consumers more choices than existed before," senior policy counsel Alan Davidson, who attended the hearing, said through a spokesman. "We are pleased that the authorities have provided important oversight here, and hope they will closely monitor the implementation to ensure that consumers' interests are served."

In their appearances before the judge, attorneys for the U.S. Department of Justice and state plaintiffs continued to emphasize they were satisfied the Google complaint had been fully resolved. They also said they expected to receive the beta version of the code in time to test it for compliance before the November expiration date. "Certainly if we don't, we will be back here" in court, said Justice Department attorney Aaron Hoag.
Microsoft attorney Charles Rule told the judge that he believed much "misinformation" has been circulating about the way Vista's desktop search works. Using a print-out of a desktop screenshot as a prop, he walked the courtroom audience through the planned changes and existing features.

For instance, contrary to what some have said, Vista's search indexer has been "designed to back off" when other applications are running, which should assuage any fears about third-party desktop search applications encountering problems, Rule said. He also said users can already set up their machines so that third-party desktop search options appear in the left-hand side of the Start menu and in menus that appear when the desktop is right-clicked--and said that wouldn't change.

Kollar-Kotelly indicated she was also pleased with the progress that Microsoft and antitrust authorities had reported in other areas, such as its communications protocol program, a required licensing regime aimed at helping third-party developers to create software that works with Windows.

"This has been a productive report and a productive hearing," she said near the event's close.
Microsoft general counsel Brad Smith said in a statement afterward that the company is "going to work hard to implement the resolution we reached with all the governments involved and presented to the court today."

The parties are currently due back in Kollar-Kotelly's courtroom on September 11, with a status report to be filed a few weeks beforehand.


Winner: Best Keyword Research Tool

Posted by Lee Odden on Jun 25th, 2007 in Online Marketing, Keyword Research


The poll is closed and the results are in, with 154 votes for the most recent TopRank OMB Reader Poll: Best Keyword Research Tools. The two long-standing tools of choice for many search marketers, Keyword Discovery and Wordtracker, battled it out and, in the end, tied.

What surprised me was how popular the old Overture tool continues to be despite not having been updated since January 2007. Perhaps that spells an opportunity for Yahoo to somehow use the URL for a new and improved keyword tool with ads for Panama on it.

Keyword Discovery (18%)
Wordtracker (18%)
WordZe (15%)
Google Keyword Tool (14%)
SEO Digger (12%)
Overture Keyword Selector (6%)
SEO Book Keyword Tool (3%)
KeywordSpy (3%)
SpyFu (3%)
Digital Point Keyword Suggestion Tool (3%)
NicheBOT (2%)
Hitwise Search Intelligence (2%)
Google Suggest Scraper (1%)
comScore qSearch (0%)
AdGooRoo SEM Insight (0%)
