Error hit while uploading to S3: AWS S3 operation failed

User 984 | 7/17/2015, 6:21:27 PM

I'd like to deploy a job on AWS but am running into difficulties when calling graphlab.deploy.job.create. It appears that GraphLab is unable to gain write access to my bucket for logging, even though I believe I've enabled logging in the AWS console. I'm certain I have read access because I can create SFrames from .csv files in my bucket. Maybe my S3 directory structure is not set up correctly? My environment is:

    ec2 = gl.deploy.environment.EC2('ec2', s3_folder_path=s3_path, region='us-west-2', instance_type='m3.xlarge')

and the create command is:

    ob_ec2 = gl.deploy.job.create(train_network, environment=ec2)
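For what it's worth, here is the kind of read that succeeds for me (the bucket and file names are placeholders):

    import graphlab as gl

    # Reading a CSV straight from the bucket works, so read access is fine.
    sf = gl.SFrame.read_csv('s3://my-bucket/data.csv')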

The full stack trace from the crash is included below, but I'd be happy to sync up privately to provide more details about my deployment.

    [INFO] Error hit while uploading to S3: AWS S3 operation failed
    [INFO] Retrying 1 out of 5
    [INFO] Error hit while uploading to S3: AWS S3 operation failed
    [INFO] Retrying 2 out of 5
    [INFO] Error hit while uploading to S3: AWS S3 operation failed
    [INFO] Retrying 3 out of 5
    [INFO] Error hit while uploading to S3: AWS S3 operation failed
    [INFO] Retrying 4 out of 5
    [INFO] Error hit while uploading to S3: AWS S3 operation failed
    [INFO] Retrying 5 out of 5
    Traceback (most recent call last):
      File "carwash.py", line 61, in <module>
        ob_ec2 = gl.deploy.job.create(train_network, environment = ec2)
      File "/Library/Python/2.7/site-packages/graphlab/deploy/job.py", line 260, in create
        job = exec_env.run_job(job)
      File "/Library/Python/2.7/site-packages/graphlab/deploy/_executionenvironment.py", line 327, in run_job
        aws_credentials = self.environment.get_credentials(), silent = True)
      File "/Library/Python/2.7/site-packages/graphlab_util/file_util.py", line 164, in upload_to_s3
        raise e
    RuntimeError: AWS S3 operation failed
    [INFO] Stopping the server connection.

Comments

User 18 | 7/18/2015, 12:46:33 AM

Hi Bill,

Can you double-check these things:

  1. The bucket exists. (Sounds like this is true.)
  2. The name of the bucket and region are correct.
  3. You have full read/write access to the bucket.

Let us know if fixing any of those solves your problem.
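A quick way to check 2 and 3 from Python, as a sketch (this assumes you have boto installed and your AWS credentials configured; 'my-bucket' is a placeholder):

    import boto

    conn = boto.connect_s3()
    bucket = conn.get_bucket('my-bucket')  # raises S3ResponseError if the name is wrong or inaccessible
    print bucket.get_location()            # region constraint, e.g. 'us-west-2' ('' means US Standard)

    # Round-trip a small object to confirm write access.
    key = bucket.new_key('graphlab_write_test.txt')
    key.set_contents_from_string('write test')
    key.delete()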

Alice


User 984 | 7/21/2015, 7:21:08 PM

Hi Alice,

It turns out that the issue was with my default region. I was passing 'us-west-2a', which is an availability zone rather than a region, but the bucket is in the 'us-west-2' region.
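For anyone who hits the same error: 'us-west-2a' names an availability zone, while S3 buckets and the region parameter use the region name. Passing the region explicitly, as in my original snippet, avoids relying on the default:

    # 'us-west-2a' is an availability zone; the region is 'us-west-2'.
    ec2 = gl.deploy.environment.EC2('ec2',
                                    s3_folder_path=s3_path,
                                    region='us-west-2',
                                    instance_type='m3.xlarge')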

Thanks, Bill