Killing Your Search Engine Ranking in 7 Easy Steps

by demtron on Wednesday, January 14, 2009 07:40 PM
Have you ever wanted to completely destroy your search engine ranking, or do it for someone else?  Maybe you never want your site to get found again?  Believe it or not, in the last six months I've helped clients dig out of each of these problems, all of which were torching their search engine profiles and strangling their organic traffic.
 
1) Domain masking: I took over one site that was using domain masking across the board.  In seven years, the client had absolutely no idea the previous designer had set it up that way to save himself a buck on hosting.  Only the home page could be found in search engines.  It turns out the designer did exactly the same thing with the rest of his clients' sites, which have been live for years, and they, too, have only the home page to show for it in search engine results.
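
For anyone wondering what domain masking looks like under the hood, it's usually nothing more than a frameset served by the registrar's "URL forwarding" feature.  A hypothetical example (the domain and hosting URL are placeholders):

    <!-- What a masked domain serves for every single URL -->
    <html>
      <head>
        <title>Example Company</title>
      </head>
      <frameset rows="100%,*" frameborder="no" border="0">
        <!-- The real site lives on the cheap hosting account; the visitor's
             address bar keeps showing example.com -->
        <frame src="http://cheaphost.example.net/~client/" name="masked">
        <noframes><body>Your browser does not support frames.</body></noframes>
      </frameset>
    </html>

Because every URL on the masked domain returns this same wrapper page, a crawler only ever sees the one frameset, which is exactly why nothing beyond the home page gets indexed.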
 
2) Use only a JavaScript menu for linking pages: Sure, JavaScript menus are cool.  They can drop down, slide across, show pictures, and generally spice up a site.  But search engines can't crawl them, and the links inside them don't contribute any anchor text, either.  One recent client had over a hundred pages linked only from a JavaScript menu and practically no linking with plain anchor tags.  Most of the site's 200 pages had not been crawled at all.
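
To make the problem concrete, here's a rough sketch (page names invented) of what a script-only menu item gives a crawler compared to an ordinary anchor tag:

    <!-- Script-driven menu item: no href to follow, no anchor text to credit -->
    <div class="menu-item" onclick="window.location='widgets.aspx'">Blue Widgets</div>

    <!-- Plain anchor: the crawler gets a link to widgets.aspx AND the anchor text "Blue Widgets" -->
    <a href="widgets.aspx">Blue Widgets</a>

Even if you keep the fancy menu for visitors, duplicating the same links as plain anchors somewhere on the page (a footer list or a sitemap page) at least gives the crawler a way in.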
 
3) Use a JavaScript page strip: ASP.Net is famous for offering page navigation strips driven by its __doPostBack JavaScript method.  Another client I acquired had over 6,000 pages on a site, but only 32 had actually been crawled.  The remaining pages were reachable only through postback-paginated tables.  Another great waste of code that a search engine ignores.
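
Here's roughly what that looks like in the rendered HTML (the control and page names are placeholders).  The postback links carry no real URL for a crawler to follow, while querystring links do:

    <!-- ASP.Net pager rendered as postbacks: nothing for a crawler to follow -->
    <a href="javascript:__doPostBack('GridView1','Page$2')">2</a>
    <a href="javascript:__doPostBack('GridView1','Page$3')">3</a>

    <!-- Crawlable alternative: every page of results gets its own URL -->
    <a href="products.aspx?page=2">2</a>
    <a href="products.aspx?page=3">3</a>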
 
4) All pictures and Flash with little text: Some designers with a flair for graphic design take sites a little overboard.  If you're a famous pop star and zillions of fans find your site every day just by typing yourname.com, then who cares?  In the real world, most sites are not wildly popular and are only found through search engine results.  Search engines love text, especially keyword-rich, backlinked text.  Pictures and Flash sure are pretty, but they tell a crawler basically nothing.
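
At the very least, give the crawler something to read.  A small, entirely made-up example of the difference:

    <!-- The crawler sees nothing but a file name -->
    <img src="header3.jpg">

    <!-- The crawler sees keyword-rich text it can actually index -->
    <h1>Hand-Built Oak Furniture</h1>
    <img src="oak-dining-table.jpg" alt="Hand-built oak dining table">
    <p>We build custom oak dining tables, chairs, and bookcases to order...</p>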
 
5) Renaming pages without redirects:  One site I redesigned earlier this year had tons of links from other sites pointing to pages that no longer existed.  What a complete waste of free traffic and promotion!  Neither search engines nor human visitors following those links could get to the content.  Oh, what a little bit of 301-redirect action did to help that one out.
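
If the site happens to run on IIS 7, one low-effort way to get that 301 in place is a web.config entry per renamed page (old-page.aspx and new-page.aspx are placeholders); on Apache the equivalent is a Redirect 301 line in .htaccess:

    <configuration>
      <!-- Permanently redirect the old URL so existing backlinks keep paying off -->
      <location path="old-page.aspx">
        <system.webServer>
          <httpRedirect enabled="true"
                        destination="http://www.example.com/new-page.aspx"
                        httpResponseStatus="Permanent"
                        exactDestination="true" />
        </system.webServer>
      </location>
    </configuration>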
 
6) Leaving title tags blank:  One of the aforementioned sites had about 60 of its 200 pages with blank titles.  How is anyone going to find those pages, and why would anyone click on them?  Here, let's write a book, then tear the front cover off and leave one of those "this page intentionally left blank" pages as the new front cover.  Real slick.
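
The fix costs about one line per page.  Something as simple as this (wording invented for illustration) gives the search result a headline worth clicking:

    <!-- Before: nothing for the search result to show -->
    <title></title>

    <!-- After: a short, page-specific, keyword-rich title -->
    <title>Paver Patio Installation | Acme Landscaping, Milwaukee WI</title>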
 
And last, but not least...
 
7) User-agent: * / Disallow: / in the robots.txt file: This one didn't actually happen quite that way, although it was close.  The site's robots.txt had a disallow-all rule for a user agent of Google.  So they kissed 81% of their traffic goodbye through one simple screw-up by the former designer.
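
For reference, this is the two-line robots.txt that shuts out every crawler, next to the near-miss variant that blocks only Google (the exact agent name in the client's file may have differed):

    # Blocks every crawler from every page on the site
    User-agent: *
    Disallow: /

    # The near-miss version: blocks only Google's crawler
    User-agent: Googlebot
    Disallow: /

    # Harmless: an empty Disallow means "disallow nothing"
    User-agent: *
    Disallow: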
 
And there you have it.  If you implement these seven key steps, your success with annihilating your search engine exposure and traffic is pretty much guaranteed.  Good luck and happy destroying!

Custom Reporting in Google Analytics

by demtron on Thursday, November 13, 2008 09:36 AM
Google recently began offering a beta test version of their custom reporting tool for Google Analytics.  With this enhancement, Analytics just got a whole lot better.

As with any reporting tool, one common complaint about Analytics has been its limitations on reporting data with multiple variables or criteria.  The Custom Reporting feature is a way to move beyond the canned reports supplied by Google and create reports that fit specific business needs.

Custom reports are built by first picking dimensions and metrics.  A dimension is an attribute to report on, such as time of day, geographic location, or page name.  There is sometimes confusion between dimensions and metrics.  One easy way to remember the difference is that a dimension is like a column title (like "city") and metrics are the numbers that appear under it (like the number of visits generated from that city).
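
As a made-up illustration, a custom report with the dimension "City" and the metric "Visits" would come out looking something like this (the numbers are invented):

    City         Visits
    Milwaukee     1,240
    Chicago         987
    Madison         411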

Any metric can be chosen for reporting, but one important point to note is that not all dimensions can be paired with all metrics.  Google supplies a chart identifying which pairings are possible.  Some combinations, such as network location, page names, and visits, would be highly useful but aren't available.

All in all, this is a great addition to the free Google Analytics tool and a "must learn" for those who are serious about using Analytics as a reporting tool.

Duplicate Content and SEO

by demtron on Wednesday, October 01, 2008 07:34 PM
Duplicate content is bad for SEO - Demtron can help!

I was recently asked to review a site that was ranking poorly and not indexed well in search engines, especially Google.  The site looked well designed, had nice internal linking, and a fair amount of SEO performed on it.  A quick search on Google and Yahoo uncovered the problem - duplicate content!

Duplicate content plays tricks on search engines, and that's nothing but bad for your SEO efforts.  For example:

  • The popularity of any one page that's duplicated is diluted, reducing the likelihood that it comes up in search results
  • Some or all of the duplicate URLs may not look friendly to a visitor, reducing the likelihood of that visitor clicking on them

When Google finds duplicate content, it removes the copies from its results and shows only the one link that appears to be the "best".  Larger sites will often have dynamically generated pages with little change in content, URL, title, or keywords.  This is an immediate RED FLAG and will cause Google to drop the offending pages.  Any SEO done for the tossed pages has been tossed out the window right along with them.
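
The most common cause isn't plagiarism at all; it's the same page answering at several different addresses.  A few typical patterns (example.com is a placeholder):

    http://example.com/products.aspx?id=5
    http://www.example.com/products.aspx?id=5
    http://www.example.com/products.aspx?id=5&sort=price
    http://www.example.com/Products.aspx?ID=5

To a visitor these are all the same page; to a crawler they're four competing copies.  Picking one preferred URL and 301-redirecting (or simply never linking to) the others keeps the page's popularity in one place.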

If you think your site may have duplicated content, our SEO services can help you.  Contact us for more information.

Latest Google PageRank Update September 2008

by demtron on Monday, September 29, 2008 11:57 AM
I noticed over the weekend that Google PageRank was updated again.  After the last update in July, many of the sites I maintain ranked higher.  This time, it was a mixed bag.  The Demtron home page rose from 2 to 3, but another site went from 2 to 1. The rest of my sites stayed the same. I checked out a few of my competitors and every one of them stayed the same as well. From what I'm reading, it seems likely that toolbar PR updates will move to every two months now. Ah... that little green bar. What a time waster!
