Create Embedded Page not shown under Help.
#11
Sweet, it worked. OK, so now my next question: since I have my robots.txt file set up so that *.php pages are not indexed (I have CDSEO regular, not Pro), is there a way to add a line to my robots file that allows the static pages under "pages/..." to be indexed? There's no danger of duplicate content on those pages since I'm not rewriting the URLs. I didn't know whether there was a way to set up exceptions in a robots file or not.
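For reference, the blanket rule being described presumably boils down to a single wildcard line like the one below; the actual robots.txt isn't shown in the thread, so this is an assumption:

Code:
    User-agent: *
    # assumed current setup: block every PHP script from being crawled
    Disallow: /*.php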
__________________
www.uscandleco.com - X-Cart 4.7.11 Gold Plus, PHP 7.3, mods: reCaptcha, running on UNIX
www.keystonecandle.com - X-Cart 4.7.11 Gold Plus, PHP 7.2, mods: reCaptcha, CDSEO Pro, running on UNIX
#12
Good to hear.

The files in skin1/pages/US/ are HTML, not PHP. That's why I set mine up the way I did: the link points to an HTML page, not a PHP file.

Mike
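A minimal sketch of the kind of help.tpl link Mike is describing; the page name is illustrative, borrowed from the fundraising page discussed below:

Code:
    <!-- in help.tpl: link straight to the static HTML file
         rather than to pages.php?pageid=N (file name is an example) -->
    <a href="skin1/pages/US/fundraising.html">Fundraising</a>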
#13
Hoosierglass,

I re-read your method of setting that up in help.tpl, and it looks like it would have worked too. I see that the files in the pages/US/ directory are HTML, but the actual link to them points to something like http://www.uscandleco.com/pages.php?pageid=6. If I point directly to the file, like http://www.uscandleco.com/skin1/pages/US/fundraising.html, then I lose all the extra stuff (links, header, footer, etc.) and only get the text. Obviously CDSEO Pro would take care of that, but I was wondering if there was another way. I have seen a few people having issues with CDSEO Pro and having to enter their meta info twice; I don't know if they just don't have it set up right, but I was holding off on upgrading for now.

Do you know how to block every .php file except pages.php in a robots.txt file? I don't mind having those pages indexed as PHP URLs instead of HTML, but I'm blocking them at the moment.
#14
You would have to limit each .php file by line, and then make sure that pages.php was allowed. Basically that rules out a global command to block all PHP; i.e., Disallow: /*.php would be no good, and you would have to list each .php file individually.

Mike
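As a sketch, the per-file approach Mike describes would look something like this (the script names are just examples of X-Cart scripts you might block). Note that crawlers honoring the non-standard Allow directive, Googlebot among them, can instead pair a wildcard Disallow with an Allow exception, though support varies by crawler:

Code:
    User-agent: *
    # conservative approach: one Disallow per script,
    # and simply no rule at all for pages.php
    Disallow: /cart.php
    Disallow: /search.php
    # ...one line per remaining .php file

    # alternative for crawlers that support Allow + wildcards (e.g. Googlebot):
    # Disallow: /*.php
    # Allow: /pages.php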
#15
Oh, that would be a good bit of work, plus I'd have to update the robots.txt file each time I added a new page. Not a lot of fun. OK, well, I'll have to think about what I'm going to do.
X-Cart forums © 2001-2020