CDSEO Version 1.8.3
Xcart 4.2.1
Is this normal or not? After installing CDSEO we are seeing a huge increase in errors in Google Webmaster Tools. Given all of Google's recent changes, and its penalizing of sites with errors, we are extra sensitive to these reports. So we need to know whether something is broken in CDSEO or whether this is expected behavior.
Issue #1) Google now reports over 5,000 blocked URLs from the new CDSEO robots.txt below:
User-agent: *
Disallow: /administrator/
Disallow: /cache/
Disallow: /components/
Disallow: /images/
Disallow: /includes/
Disallow: /installation/
Disallow: /language/
Disallow: /libraries/
Disallow: /media/
Disallow: /modules/
Disallow: /plugins/
Disallow: /templates/
Disallow: /tmp/
Disallow: /xmlrpc/
Disallow: /xcart/*printable=Y
Disallow: /xcart/*js=*
Disallow: /xcart/*sort=*
Disallow: /xcart/*sort_direction=*
Disallow: /xcart/product.php*
Disallow: /xcart/home.php?cat=*
Disallow: /xcart/catalog/
Disallow: /xcart/search.php
Disallow: /xcart/cart.php
Disallow: /xcart/help.php
Disallow: /xcart/giftcert.php
Disallow: /xcart/product.php
Disallow: /xcart/orders.php
Disallow: /xcart/register.php
Disallow: /xcart/icon.php
Disallow: /xcart/image.php
Disallow: /xcart/error_message.php
Disallow: /xcart/offers.php
Disallow: /xcart/product_image.php
Sitemap: http://www.tvrepairkits.com/xcart/sitemap.xml
Sitemap: http://www.tvrepairkits.com/xcart/sitemap.xml.gz
Issue #2) Category pages not indexed...
http://www.tvrepairkits.com/xcart/home.php?cat=2959 is reported as "Denied by robots.txt". We suspect this is intended to prevent duplicate content, but wanted to check.
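That "Denied by robots.txt" result is consistent with the `Disallow: /xcart/home.php?cat=*` rule in the file above. A quick way to see which rule is responsible is a minimal Googlebot-style matcher (`*` as a wildcard, `$` as an end anchor, otherwise prefix match); this is a sketch, not Google's actual code, and note the stdlib `urllib.robotparser` does not handle `*` wildcards this way:

```python
import re

def rule_blocks(disallow_path: str, url_path: str) -> bool:
    """Googlebot-style match: '*' is a wildcard, a trailing '$'
    anchors the end, and otherwise the rule matches as a prefix."""
    pattern = re.escape(disallow_path).replace(r"\*", ".*")
    if pattern.endswith(r"\$"):
        pattern = pattern[:-2] + "$"
    return re.match(pattern, url_path) is not None

# A few of the rules from the robots.txt above.
rules = [
    "/xcart/home.php?cat=*",
    "/xcart/product.php",
    "/xcart/search.php",
]

url = "/xcart/home.php?cat=2959"
blocking = [r for r in rules if rule_blocks(r, url)]
print(blocking)  # ['/xcart/home.php?cat=*']
```

So every `home.php?cat=...` category URL is matched by that one wildcard rule, which would account for a large share of the blocked-URL count in Issue #1 as well.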
Issue #3) Duplicate content - Google Webmaster Tools is still reporting over 4,000 duplicate meta descriptions and duplicate title tags.
Issue #4) The sitemap has over 7,000 entries, but Google has not indexed more than 3,900 pages in over a week...
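One thing worth ruling out here: if the sitemap lists URLs that robots.txt disallows, Google will not index them, which by itself can explain a gap between submitted and indexed counts. A sketch for checking a local copy of the sitemap against a few of the plain-prefix Disallow rules (the sample sitemap content and the chosen prefixes below are hypothetical, just to show the shape of the check):

```python
import xml.etree.ElementTree as ET
from urllib.parse import urlparse

# Sitemap XML namespace used by sitemap.xml files.
NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# A few of the plain-prefix Disallow rules from the robots.txt above.
DISALLOW_PREFIXES = ["/xcart/help.php", "/xcart/search.php", "/xcart/cart.php"]

def sitemap_urls(xml_text):
    """Return the list of <loc> URLs from a sitemap document."""
    root = ET.fromstring(xml_text)
    return [loc.text for loc in root.iter(NS + "loc")]

def blocked_urls(urls, prefixes):
    """URLs whose path starts with any of the plain Disallow prefixes."""
    return [u for u in urls
            if any(urlparse(u).path.startswith(p) for p in prefixes)]

# Hypothetical two-entry sitemap, a stand-in for the real sitemap.xml.
sample = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>http://www.tvrepairkits.com/xcart/home.php</loc></url>
  <url><loc>http://www.tvrepairkits.com/xcart/help.php?section=contact</loc></url>
</urlset>"""

urls = sitemap_urls(sample)
print(len(urls))                              # 2 entries submitted
print(blocked_urls(urls, DISALLOW_PREFIXES))  # the help.php entry is disallowed
```

If the real sitemap turns out to contain `home.php?cat=` or other disallowed URLs, the 7,000-submitted / 3,900-indexed gap would be expected rather than a fault.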