Viewing 10 replies - 16 through 25 (of 25 total)
  • Hi there,

    Great news about the pro version.

    One problem I’ve found is that it doesn’t work with the new Image Editor in WordPress. After cropping an image and saving it, a new file isn’t uploaded to S3, but the path is still replaced. Will the pro version allow resizing and cropping?

    Thanks

    Hi again,

    My main purpose in using this plugin (Amazon S3) is to store the media library on Amazon S3, and that works fine with the default WordPress Media Library.

    But plugins like NextGEN Gallery use their own gallery directory. If I try to change the settings in NextGEN or AWS to match the default WordPress library, it simply doesn’t work for me.

    Is there a way around this, or is it just not possible?

    Plugin Contributor Brad Touesnard

    (@bradt)

    Just posted an update on the pro version:
    https://deliciousbrains.com/sneak-peek-amazon-s3-cloudfront-pro/

    Can we still use a cache plugin like W3 Total Cache or Hyper Cache? Will that still manage the caching? Thanks for building such a simple, direct AWS plugin. Any update on the release of the Pro version?

    Or rather, will these cache plugins still be required with your pro plugin, or not?

    We are also waiting for the Pro version, but in the meantime the best solution we have found is the AWS command line tool (aws-cli). Here is the install info:
    https://docs.aws.amazon.com/cli/latest/userguide/installing.html

    From there you will need to configure a profile (an IAM role with S3 access):

    aws configure --profile YOUR_PROFILE_NAME_HERE

    Now sync all the files to the S3 bucket:

    aws s3 --profile YOUR_PROFILE_NAME_HERE sync /path/to/your/local/content s3://YOUR_BUCKET_NAME_HERE --cache-control "max-age=315576000"

    This does mean you will need access to the files locally. If you don’t want to download all of it with ftp then you can run this from the server with ssh.

    This is extremely fast and handles large loads well, and the --cache-control "max-age=315576000" option adds cache headers to all uploaded content.

    There is much more you can do with the command line.
    The above command only adds and modifies files; it does not remove anything.
    If you want your bucket to mirror your local directory exactly, add the --delete option, which removes files from your S3 bucket that are not present locally.

    More details here: https://docs.aws.amazon.com/cli/latest/userguide/using-s3-commands.html
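
    Putting the steps above together, a mirror sync (preview first, then the real run) might look like the sketch below. The profile, local path, and bucket name are the same placeholders used above, to be replaced with your own values:

    ```shell
    # Preview what would change, without touching the bucket
    aws s3 sync /path/to/your/local/content s3://YOUR_BUCKET_NAME_HERE \
        --profile YOUR_PROFILE_NAME_HERE \
        --cache-control "max-age=315576000" \
        --delete \
        --dryrun

    # When the preview looks right, run the same command without --dryrun
    ```

    The --dryrun flag makes --delete much safer: you can see exactly which files would be removed from the bucket before committing.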

    Well, we tried this on our WordPress site. We uploaded all of our existing content to S3 without any plugin. You can do it with some AWS CLI commands and a few changes on the EC2 instance. If you need the solution, contact me at [email protected]

    @daniel27lt – just had the occasion to do this again and the solution I mentioned above worked fine.

    For images stuck on several different servers, there was one trick I didn’t mention above: after clicking the “Regenerate Thumbnails” button and completing the regeneration, there is a “To try regenerating the failed images again, click here.” link below the “Regenerate Thumbnails” header. This will retry the images still stuck on a different server with their non-S3 paths.

    @brad Touesnard – thanks for a great plugin!

    Hi,

    Just wanted to add: regenerating thumbnails did move the existing files and rewrite the URLs, but the problem (for me anyway) was that it didn’t watermark the images.

    Thanks

    Hi,

    I didn’t find an edit button for my previous message.
    I found out that since my source/full-size image wasn’t watermarked in the first place, regenerating thumbnails uses that original image, so the regenerated thumbnails won’t be watermarked either.

    So I tried copying the existing watermarked thumbnails to S3 and changing the media URLs. That didn’t work (I tried replacing them in the wp_posts table manually with a SQL query, and also used a script called “Search Replace DB”).
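
    (An aside for anyone attempting the URL-rewrite step: WP-CLI’s search-replace command is usually safer than a raw SQL UPDATE on wp_posts, because it also rewrites URLs inside PHP-serialized values, which a plain UPDATE would corrupt. The two URLs below are placeholders, not real addresses:

    ```shell
    # Preview the replacements without writing to the database
    wp search-replace 'https://example.com/wp-content/uploads' \
        'https://YOUR_BUCKET_NAME_HERE.s3.amazonaws.com/wp-content/uploads' \
        --dry-run

    # Drop --dry-run to apply the changes for real
    ```

    Run it from the WordPress root, and take a database backup first.)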

    As nothing I tried worked, I decided to do it the hard way:
    download the whole uploads folder, filter out the original files, batch-watermark them with a tool called XnView, then re-upload the files to the server and regenerate the thumbnails.

    I know it’s a lot of work, but that’s the only way I know to copy existing files WITH watermarks to S3. (If your images weren’t watermarked, or the watermark isn’t important, just regenerating thumbnails will do it.)

    Also, if you don’t have a high-speed connection for the download/upload work, I found a service offered here:
    https://www.wjunction.com/94-other

    They look legit. They’re called RDPs, and some sellers offer them for as low as $5. Once you pay, you get access to a machine similar to your home PC but with a high-speed connection.

    Thanks

  • The topic ‘Copy Existing Files To S3?’ is closed to new replies.