
Traffic Travis ambiguous stats

paulie69
Posts: 77
Joined: 06 Jun 09


Hi all,

I thought I'd ask if anyone else has noticed what seem to be ambiguous (inaccurate?) stats in Traffic Travis's page analysis. Maybe it's obvious to others (I'm not sure), but I thought I'd let everyone know anyway - sorry for the long post.

I've been reviewing my page stats using TT and cross-checking them with a little freeware app called Content Screener, and wondered why they were so different. I was pulling my hair out over it until I contacted TT support.

Content Screener looks at the stats (keyword density, number of keywords, etc.) for a given article, while (as support told me) TT includes the WHOLE "page" (I say that loosely) - absolutely everything on it, including the navigation bar, footer, links and so on. I'm sure there's a rational reason for it, but it seems a bit illogical that TT should include those things. Shouldn't it just be analyzing the article content (i.e. the page where the current article is) instead of counting all the extraneous template text? Basically, it's actually looking OUTSIDE the article as well.
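To put some numbers on it, here's a rough sketch in Python of what I think is going on (the word lists and the keyword_density helper are completely made up by me - this is not how either tool actually works internally):

# Toy illustration only - made-up text, not TT's or Content Screener's real logic.

def keyword_density(text, keyword):
    """Percentage of the words in `text` that match the keyword."""
    words = text.lower().split()
    hits = sum(1 for word in words if word == keyword.lower())
    return 100.0 * hits / len(words) if words else 0.0

article = ("here are some dog training tips that make dog training "
           "easier for busy first time owners")
template = "dog training home dog training articles dog training gear about contact"

print(keyword_density(article, "training"))                   # article body only: 12.5%
print(keyword_density(article + " " + template, "training"))  # whole page: about 18.5%

Because the menu text in this toy example repeats the keyword, the whole-page figure comes out higher, which is what I'm seeing from TT; if the template text didn't contain the keyword at all, counting it would pull the figure down instead.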

I can't see what point there is in adding the navigation bar, copyright text and page footer text into the stats, but that's exactly what TT does. It even seems to count the keyword in the menu at the left of the page! Does anyone know whether the search engines take this into account?

The net effect in my case (and, I'm assuming, everyone else's) is that TT always reports higher stats than it should. Even given TT support's response, and looking at any of my pages, I still can't account for the discrepancies (i.e. the extra counts) TT gives me. I just don't know where TT is getting its values from to produce its stats, as I certainly don't have any keywords in my footer or copyright text etc.

But perhaps this is the way on-page SEO should be calculated, in which case I can breathe easy! Any ideas, anyone?

Thanks,
Paul.
Site Admin
markling
Posts: 2071
Joined: 13 Jun 06
Hi Paul

It's definitely good to be conscious of keyword density - keyword stuffing can get you hit with a penalty quicker than saying [word banned] 50 times! At the same time, you need to remember that density isn't a precise threshold, e.g. it's not as if 1% is OK and 1.5% will get you penalized. Some sources say anything under 5% is acceptable, and also that Yahoo & MSN are more forgiving than Google. I generally recommend aiming for 1%.
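Just to put that 1% figure in perspective, the arithmetic is simple - density is just keyword occurrences divided by total words counted (rough illustrative numbers only):

# Rough illustration - keyword density is just occurrences / total words counted.
page_words = 600          # assume the tool counts 600 words on the page
target_density = 1.0      # percent
occurrences = page_words * target_density / 100
print(occurrences)        # 6.0 -> roughly six natural mentions on a 600-word page

By the same math, a 5% ceiling on that page would be around 30 mentions, which would read as stuffing to most visitors anyway.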

Many SEOs will tell you that keywords anywhere on the page count towards the final keyword density percentage, so it's best to err on the side of caution and work with that whole-page figure rather than risk incurring a search penalty.
 
Site Admin
michellerana
Posts: 1874
Joined: 05 May 09
It's difficult to say why results vary between different tools without knowing exactly how each product works. Each tool has its own way of doing the page analysis, so the results will differ. What you could do is choose one of the two tools and modify your pages based on its figures.
Michelle
Customer Support
 
paulie69
Posts: 77
Joined: 06 Jun 09
Hello Mark and michellerana,

Ahhhh, now I understand! As usual, I just can't thank you guys enough for your help - it makes a lot of sense now that someone has explained it to me. Sorry, it was probably a silly question, but it was confusing for me. Thanks again VERY MUCH, Mark and Michellerana :)

Kind regards
Paul.

This topic was started on Oct 10, 2009 and has been closed due to inactivity. If you want to discuss this topic further, please create a new forum topic.
