Reading User Agents to Identify Bot Traffic

by demtron on Monday, February 02, 2009 08:41 AM

Sometimes, it's a real pain to tell the difference between a human visitor and a bot when reviewing a Web server traffic log.  I recently had an acquaintance ask me for information on this topic.  I found a great resource that catalogs nearly 3,000 known user agents that are associated with bots.

In many cases, there are identifiers such as GoogleBot, msnbot and Slurp that are easy to spot, as these are common bot user agent signatures.  Unfortunately, there's no identifier common to all of them.  I figure that this list could be pulled into a lookup table and used for matching against a server log.  What I wasn't able to identify is how frequently this list is updated with new signatures.
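As a rough sketch of the lookup-table idea (the signature list and user agent strings below are made-up examples for illustration, not entries from the catalog mentioned above), the matching could look something like this:

```javascript
// Hypothetical sketch: match user agent strings against a lookup table
// of known bot signatures.  Real tables would hold thousands of entries.
var botSignatures = ["googlebot", "msnbot", "slurp", "bingbot"];

function isBotUserAgent(userAgent) {
    var ua = userAgent.toLowerCase();
    for (var i = 0; i < botSignatures.length; i++) {
        if (ua.indexOf(botSignatures[i]) !== -1) {
            return true;
        }
    }
    return false;
}

// Example: classify a couple of user agent strings from a server log
console.log(isBotUserAgent("Mozilla/5.0 (compatible; Googlebot/2.1)")); // true
console.log(isBotUserAgent("Mozilla/4.0 (compatible; MSIE 7.0)"));      // false
```

A case-insensitive substring match is deliberately loose here; a production version would want to weigh false positives against how messy real user agent strings are.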

Killing Your Search Engine Ranking in 7 Easy Steps

by demtron on Wednesday, January 14, 2009 07:40 PM
Have you ever wanted to completely destroy your search engine ranking, or do it for someone else?  Maybe you never want your site to be found again?  Believe it or not, in the last six months I've helped clients with each of these problems, which were torching their search engine profiles and strangling their organic traffic.
1) Domain masking: I took over one site where the entire site was using domain masking.  In seven years, the client had absolutely no idea that the previous designer was doing this to save himself a buck on a hosting plan for the site.  Only the home page was found in search engines.  It turns out the designer did exactly the same thing with all the rest of his clients' sites that have been live for years.  They, too, only have the home page to show for it in search engine results.
2) Use only a JavaScript menu for linking pages: Sure, JavaScript menus are cool.  They can drop down, slide across, have pictures and generally spice up a site.  But they can't be crawled by search engines.  What's more, the links in them don't contribute anchor text, either.  One recent client had over a hundred pages in a JavaScript menu and practically no linking using anchor tags.  Most of the site's 200 pages had not been crawled at all.
3) Use a JavaScript page strip: ASP.Net is famous for offering page navigation strips that rely on its __doPostBack JavaScript method.  Another client I acquired had over 6,000 pages on a site, but only 32 pages actually crawled.  The remaining pages were all accessible only through paginated tables.  Another great waste of code that a search engine ignores.
4) All pictures and Flash with little text: Some designers with a flair for graphic design take sites a little overboard.  If you're a famous pop star and have zillions of fans finding your site every day, then who cares?  In the real world, most sites are not wildly popular and are only found through search engine results.  SEs love text, especially keyword-rich, backlinked text.  Pictures and Flash sure are pretty, but they basically tell a crawler nothing.
5) Renaming pages without redirects:  One site I redesigned earlier this year had tons of links from other sites pointing to a page that no longer existed.  What a complete waste of free traffic and promotion!  Neither search engines NOR human visitors could find those pages.  Oh, what a little bit of 301-redirect action did to help out that one.
6) Leaving title tags blank:  One of the aforementioned sites had about 60 of its 200 pages with blank titles.  How is anyone going to find those pages, and why would anyone click on them?  Here, let's write a book, then tear the front cover off and leave one of those "this page intentionally left blank" pages as the new front cover.  Real slick.
And last, but not least...
7) User-agent: * / Disallow: / in the ROBOTS.TXT file: This one didn't actually happen, although it was close.  The site had the disallow-all rule set for a user agent of Google.  So, they kissed 81% of their traffic away through a simple screw-up by the former designer.
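For the record, the catastrophic version of step 7 is just two lines of robots.txt (reproduced here so you can recognize it, not deploy it):

```
User-agent: *
Disallow: /
```

The near-miss version replaces the asterisk with a specific crawler name such as Googlebot; either way, Disallow: / tells every matched crawler to skip the entire site.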
And there you have it.  If you implement these seven key steps, your success with annihilating your search engine exposure and traffic is pretty much guaranteed.  Good luck and happy destroying!

Milwaukee SEO: Keywords for Real Estate Web Sites

by demtron on Wednesday, January 14, 2009 04:38 PM

Real estate agents can achieve significant exposure on the Web through their Web sites and blogs.  According to the National Association of Realtors' 2006 survey of Internet use, 24% of all buyers first learned about the home they purchased through the Internet, and a whopping 77% of all buyers used the Internet to search for homes.  An increasing number of home buyers (and sellers) are savvy with technology and conduct their initial investigation of homes on real estate Web sites.

Unfortunately, a surprising number of real estate Web sites don't rank well in search engine results, even for properties in their own communities.  The agents who own these sites likely don't know how much traffic and Web visibility they're missing.

Real Estate Location Keywords for SEO

So, how do we go about determining which keywords will work well for an agent's site?  First and foremost, it's critical to know the specific cities to target, the size and relative importance of each, and what variations exist for each name.  For example, let's choose Waukesha County, Wisconsin as a sample.  Here's the list of most populous cities in descending order according to the 2000 U.S. Census:

New Berlin

These should form the core of the keywords used for a Waukesha County real estate site, as most buyers and sellers will target specific communities when conducting searches.  Variations should be included, such as Saint and St., Mount and Mt., directionals such as North and N., and Falls and Fls.  Don't forget to include Waukesha County itself!
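Expanding those name variations is mechanical enough to sketch in code.  The abbreviation table below is just a starting point, not a complete one, and the city names are examples:

```javascript
// Hypothetical sketch: expand a city name into its common spelling
// variations using a small abbreviation table.
var abbreviations = {
    "Saint": "St.",
    "Mount": "Mt.",
    "North": "N.",
    "Falls": "Fls."
};

function expandVariations(cityName) {
    var variations = [cityName];
    for (var full in abbreviations) {
        if (cityName.indexOf(full) !== -1) {
            variations.push(cityName.replace(full, abbreviations[full]));
        }
    }
    return variations;
}

console.log(expandVariations("Menomonee Falls")); // ["Menomonee Falls", "Menomonee Fls."]
console.log(expandVariations("New Berlin"));      // ["New Berlin"]
```

Feeding each target city through a function like this gives a quick first draft of the location keyword list for the site.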

Subject Keywords for Search Engine Optimization

The next set of words and phrases to consider are subject words.  These represent the subject matter of the site (and its derivations) for which visitors would look.  Typical phrases that should be at the top of the list are:

Real Estate

Home Selling

Foreclosures (if these are important)

Again, this is a standard list that immediately comes to mind.  Consider using a keyword suggestion tool or the services of a Milwaukee Real Estate SEO Expert to help you determine which terms are most valuable or which others belong on the list.

Secondary Real Estate Keywords for Long-Tail Searches

The last set of words to consider is secondary words that will likely be part of long-tail searches.  These are searches that go beyond the typical two or three core words already chosen, and they'll generally account for the majority of search traffic coming to a real estate site.  A few examples are:

New Homes

For Sale
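Combining the three keyword sets above into candidate long-tail phrases is also easy to sketch.  The lists below are abbreviated examples drawn from this post:

```javascript
// Hypothetical sketch: build long-tail phrase candidates by combining
// location, subject, and secondary keyword lists.
var locations = ["New Berlin", "Waukesha County"];
var subjects = ["Real Estate", "Home Selling"];
var secondary = ["For Sale"];

function longTailPhrases(locs, subs, secs) {
    var phrases = [];
    for (var i = 0; i < locs.length; i++) {
        for (var j = 0; j < subs.length; j++) {
            // Core two-part phrase, e.g. "New Berlin Real Estate"
            phrases.push(locs[i] + " " + subs[j]);
            for (var k = 0; k < secs.length; k++) {
                // Long-tail phrase, e.g. "New Berlin Real Estate For Sale"
                phrases.push(locs[i] + " " + subs[j] + " " + secs[k]);
            }
        }
    }
    return phrases;
}

console.log(longTailPhrases(locations, subjects, secondary).length); // 8
```

A raw combination like this over-generates, of course; the point is to produce candidates that a keyword tool (or an SEO expert) can then prune by actual search volume.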

Selecting keywords is one of the steps required in creating a strong organic search engine profile for real estate Web sites.  For more information on optimizing real estate web sites for search engines, contact Demtron, your Milwaukee and Waukesha SEO Experts.

Centering a DIV Window with Cross-Browser JavaScript

by demtron on Wednesday, January 14, 2009 02:28 PM

Years ago, I wrote a little bit of JavaScript to position a DIV in the center of a browser window by calculating the browser area and the window size, then determining the coordinates where the DIV needed to be placed.  Since I forgot where I put it, and it probably wouldn't be cross-browser compatible anyway (originally written for IE 5/6), I searched the Web and ran across some JavaScript centering code that accomplishes exactly that.  Here are the excerpts of the code needed to do this.

window.size = function()
{
    var w = 0;
    var h = 0;
    if (!(document.documentElement.clientWidth == 0))
    {
        //strict mode
        w = document.documentElement.clientWidth;
        h = document.documentElement.clientHeight;
    }
    else if (!(document.body.clientWidth == 0))
    {
        //quirks mode
        w = document.body.clientWidth;
        h = document.body.clientHeight;
    }
    else
    {
        //fallback for other browsers
        w = window.innerWidth;
        h = window.innerHeight;
    }
    return {width:w, height:h};
} = function()
{
    var hWnd = (arguments[0] != null) ? arguments[0] : {width:0, height:0};
    var _x = 0;
    var _y = 0;
    var offsetX = 0;
    var offsetY = 0;
    if (!(document.documentElement.scrollTop == 0))
    {
        //strict mode
        offsetY = document.documentElement.scrollTop;
        offsetX = document.documentElement.scrollLeft;
    }
    else if (!(document.body.scrollTop == 0))
    {
        //quirks mode
        offsetY = document.body.scrollTop;
        offsetX = document.body.scrollLeft;
    }
    else
    {
        offsetX = window.pageXOffset;
        offsetY = window.pageYOffset;
    }
    _x = ((this.size().width - hWnd.width) / 2) + offsetX;
    _y = ((this.size().height - hWnd.height) / 2) + offsetY;
    return {x:_x, y:_y};
}

Here's some example code to show how it's supposed to be used:


<!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Strict//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-strict.dtd">
<html xmlns="http://www.w3.org/1999/xhtml">
<head>
    <title>Centering Example</title>
    <script type="text/javascript" src="Window.Size.js"></script>
    <script type="text/javascript">
    function showCenter(point)
    {
        var div = document.createElement("div");
        div.style.backgroundColor = "#dedede";
        div.style.position = "absolute";
        div.style.top = point.y + "px";
        div.style.left = point.x + "px";
        div.style.width = "100px";
        div.style.height = "100px";
        document.body.appendChild(div);
    }
    </script>
</head>
<body>
    <div style="height:1200px"></div>
    <input type="button" value="Get Center" onclick="showCenter({width:100,height:100}))"/>
</body>
</html>


Powered by BlogEngine.NET
Theme by Mads Kristensen · Adapted by Demtron
