• Resolved ptpusa

    (@ptpusa)


    I have trouble with Google's smartphone crawler, as some of my style elements are not being loaded. The reason is that they are apparently loaded from the SFTP URL https://22j.927.myftpupload.com/, and the robots.txt file there blocks the crawler from loading them. On my “real URL”, https://www.ptpusa.org/robots.txt, the crawler has access.

    I have spoken to GoDaddy at length and they tell me I cannot change the robots.txt on the SFTP side. I can only change the robots.txt on my main URL.

    How can I prevent Google from trying to access items on the https://22j.927.myftpupload.com site?


  • How can I prevent Google from trying to access items on the https://22j.927.myftpupload.com site?

    Simple: don’t load resources from there.

    Think of Google's crawler in this sense as a normal user sitting behind a computer or phone and opening your site in a browser: styles, images, JavaScript, and other resources can and will only be loaded from the locations referenced in your website. The computer cannot (yet) read your intentions and desires.

    And if your site is set up to load all resources from an external URL, as in the example below, that's exactly what Chrome, Firefox, Safari, Opera and Google's crawlers will do: they'll load the resources from those URLs.
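
    For example (the file path here is made up for illustration; only the host names matter), a page whose rendered HTML contains

    <link rel="stylesheet" href="https://22j.927.myftpupload.com/wp-content/themes/some-theme/style.css">

    will have its stylesheet fetched from 22j.927.myftpupload.com, subject to that host's robots.txt, rather than from ptpusa.org.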

    I have spoken to GoDaddy at length and they tell me I cannot change the robots.txt on the SFTP side. I can only change the robots.txt on my main URL.

    That may be true, as you may not have direct access to that FTP site to upload a custom robots.txt file (for the record: Google accepts and follows robots.txt files for FTP sites).

    But it's also true that you can set robots.txt rules ONLY for resources on the host where that robots.txt file lives.

    You cannot, for instance, set up rules for an external domain as you seem to want to. Just imagine what could possibly go wrong if I could set up rules in my website example.com to prevent Google from accessing your website example.org!
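
    As an illustration (a generic example, not your actual file), a robots.txt served at https://www.ptpusa.org/robots.txt such as

    User-agent: *
    Disallow: /private/

    only controls what crawlers may fetch from www.ptpusa.org. It cannot grant or deny access to anything on 22j.927.myftpupload.com; only a robots.txt served by that host can do that.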

    In any case, I believe you're only trying to treat the symptoms and not the real issue that needs addressing. The real issue, in my opinion, is the fact that you are loading ALL your site's resources: theme, plugins, styles, scripts, even media files in posts… from this external domain instead of loading them from your own domain.

    Do you mind sharing why you’re choosing to do this, against conventional practice?

    Thread Starter ptpusa

    (@ptpusa)

    Hi George, thank you for the reply.

    This setup was not my choice; the domain & WordPress installation are brand new, so I believe these were default settings of the Managed WordPress product from GoDaddy. GoDaddy called the external domain the ‘preferred FTP for wordpress installation’, but I see your point that if the site refers to that domain, the crawlers just follow.

    I will research how to change the load location of the style sheets to my ptpusa.org domain.

    Thank you for putting me on the right track.
    Marcel

    Thread Starter ptpusa

    (@ptpusa)

    So, I checked my wp-config.php, and the site URL constants there are set to the right domain:

    define( 'WP_HOME', 'https://ptpusa.org' );
    define( 'WP_SITEURL', 'https://ptpusa.org' );
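
    For reference, a minimal way to double-check what URL WordPress itself generates for the theme stylesheet (a quick test snippet, assuming it is run inside WordPress, e.g. from a small mu-plugin or a theme template):

    <?php
    // Prints the URL WordPress generates for the active theme's style.css.
    // If this prints a ptpusa.org URL while the rendered pages still reference
    // 22j.927.myftpupload.com, then something else (for example a CDN feature)
    // is rewriting the asset URLs on output.
    echo get_stylesheet_uri();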

    Spoke to GoDaddy again and they stated that “everything is fine” on their end and did not want to explain the gd-config.php file that shows the following:

    <?php
    define( 'GD_ACCOUNT_UID', 'ef231f49-e8a3-11ea-81eb-3417ebe60eb6' );
    define( 'GD_ASAP_KEY', 'edd84c62c84a84551dd207a6c1f0fbfd' );
    define( 'GD_CDN_ENABLED', TRUE );
    define( 'GD_GF_LICENSE_KEY', 'icr1ucnetuFifLeALcYQWrGUJGbQD5f1' );
    define( 'GD_HMT_SERVICE_KEY', 'fada44a0-1218-464a-a748-456d1d38f1a7' );
    define( 'GD_PLAN_NAME', 'Deluxe Managed WordPress' );
    define( 'GD_RESELLER', 1 );
    define( 'GD_RUM_ENABLED', TRUE );
    define( 'GD_SITE_CREATED', 1598511600 );
    define( 'GD_SITE_TOKEN', '3aee6e9d-5292-432a-b448-e4e2f22badcf' );
    define( 'GD_TEMP_DOMAIN', '22j.927.myftpupload.com' );
    define( 'GD_VIP', '72.167.241.46' );

    Not sure what my next step is; perhaps a forwarding from the myftpupload.com domain to the main domain?
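
    For context on how a setting like GD_CDN_ENABLED can lead to assets being served from GD_TEMP_DOMAIN, here is a simplified sketch of the general mechanism (this is not GoDaddy's actual code; the function name and the exact rewriting are hypothetical, the point is only that a CDN flag can swap the host of enqueued asset URLs):

    <?php
    // If the CDN feature is enabled, rewrite enqueued style/script URLs so that
    // browsers (and Googlebot) fetch them from another host instead of the
    // main domain.
    if ( defined( 'GD_CDN_ENABLED' ) && GD_CDN_ENABLED ) {
        function example_rewrite_asset_host( $src ) {
            return str_replace( 'https://ptpusa.org', 'https://' . GD_TEMP_DOMAIN, $src );
        }
        add_filter( 'style_loader_src', 'example_rewrite_asset_host' );
        add_filter( 'script_loader_src', 'example_rewrite_asset_host' );
    }

    If a rewrite like this is active, turning the CDN option off would make the asset URLs point back at the main domain.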

    Thread Starter ptpusa

    (@ptpusa)

    Alright, I figured it out. It's the CDN setting that makes the resources load from the myftpupload.com site. I deactivated the CDN and it is now better.
    Marcel

  • The topic ‘Style Sheets in old URL’ is closed to new replies.