Predictive Services - EC2 Deployment Error

User 2146 | 8/7/2015, 5:27:05 PM

Hello there,

I'm having trouble launching a job on an EC2 instance. What I did is very similar to the article here: https://dato.com/learn/gallery/notebooks/deploy-scikit-learn-in-ps.html, using my own EC2 instance, but an exception is thrown persistently: "Unable to successfully retrieve correct EC2 image to launch for this version. This could be a temporary problem. Please try again in a few minutes. If the problem persists please contact support@dato.com". Most of the time I am behind a firewall, but the issue is still there after I turn off the firewall.

The complete log is as below:

Comments

User 1174 | 8/7/2015, 6:01:44 PM

Hi Joyce,

Thank you for trying out Predictive Service. Can you please post your EC2 configuration here so we can help debug further? (Please do not paste your AWS credentials.) My guess is that the instance type for your EC2 instance might not be supported.

Thanks,

Yifei


User 1174 | 8/7/2015, 6:06:01 PM

The API doc here specifies which instance types are supported: https://dato.com/products/create/docs/generated/graphlab.deploy.Ec2Config.html#graphlab.deploy.Ec2Config
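For reference, a minimal configuration along the lines of that API doc looks roughly like the sketch below. The region, instance type, and credential placeholders here are illustrative assumptions, not anyone's actual settings:

```python
import graphlab

# Illustrative Ec2Config sketch; m3.xlarge is one of the supported
# instance types, and the region/credentials are placeholders.
ec2_config = graphlab.deploy.Ec2Config(
    region='us-west-2',
    instance_type='m3.xlarge',
    aws_access_key_id='<YOUR_ACCESS_KEY>',
    aws_secret_access_key='<YOUR_SECRET_KEY>')
```

Passing credentials explicitly like this avoids depending on environment variables being visible to the Python session.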


User 2146 | 8/7/2015, 8:59:18 PM

Hi Yifei,

Thank you for replying. Here is the EC2 configuration

At first I didn't have any instances, since I thought Predictive Service could create one for me. Then I created an m3.xlarge instance, and the problem is still there.
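For context, my deployment call follows the pattern from the linked notebook. A sketch of it, with the bucket path and service name replaced by placeholders rather than my real values:

```python
import graphlab

# Sketch of the launch call from the notebook; the state path must be an
# S3 bucket my AWS credentials can write to. All names are placeholders.
ec2_config_one = graphlab.deploy.Ec2Config(instance_type='m3.xlarge')
ps_state_path = 's3://my-bucket/predictive_service/ps-one'
deployment = graphlab.deploy.predictive_service.create(
    name='ps-one',
    ec2_config=ec2_config_one,
    state_path=ps_state_path,
    num_hosts=1)
```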


User 1174 | 8/7/2015, 10:03:55 PM

Hi Joyce,

It seems that our server did not receive any launch request from your GraphLab Create installation. This can be a result of your firewall configuration blocking outgoing requests to our server. Can you try turning off the firewall again?

To make sure you can make outgoing requests from your Python session, you can try using urllib2 to query google.com:

    import urllib2
    res = urllib2.urlopen('https://www.google.com')
    res.read()


User 2146 | 8/10/2015, 7:46:29 PM

Thank you for helping out, Yifei. I turned off my firewall and then tested with the urllib2 query, and everything works fine. But this time another error is thrown, as below.

So I am wondering if there are some configuration issues with my AWS instance. My instance has instance type m3.xlarge, with the security group assigned to 'GraphLab'. I also enabled Static Website Hosting on my S3 bucket... Is there anything else I'm missing?

Thank you very much!

---------------------------------------------------------------------------
error                                     Traceback (most recent call last)
<ipython-input-3-6718bd1d6afa> in <module>()
      3                                          ec2_config = ec2_config_one,
      4                                          state_path = ps_state_path,
----> 5                                          num_hosts = 1)

C:\Users\AppData\Local\Continuum\Anaconda\envs\dato-env\lib\site-packages\graphlab\deploy\predictive_service.pyc in create(name, ec2_config, state_path, num_hosts, description, api_key, admin_key, ssl_credentials, cors_origin, port)
    201 
    202     try:
--> 203         region = _file_util.get_s3_bucket_region(s3_bucket_name, aws_credentials)
    204     except _S3ResponseError as e:
    205         _logger.error("Unable to connect to state_path's bucket; check your AWS credentials")

C:\Users\AppData\Local\Continuum\Anaconda\envs\dato-env\lib\site-packages\graphlab_util\file_util.pyc in get_s3_bucket_region(s3_bucket_name, aws_credentials)
    263 def get_s3_bucket_region(s3_bucket_name, aws_credentials={}):
    264     conn = boto.connect_s3(**aws_credentials)
--> 265     bucket = conn.get_bucket(s3_bucket_name)
    266     return bucket.get_location() or "us-east-1" # default=us-standard
    267 

C:\Users\AppData\Local\Continuum\Anaconda\envs\dato-env\lib\site-packages\boto\s3\connection.pyc in get_bucket(self, bucket_name, validate, headers)
    500         """
    501         if validate:
--> 502             return self.head_bucket(bucket_name, headers=headers)
    503         else:
    504             return self.bucket_class(self, bucket_name)

C:\Users\AppData\Local\Continuum\Anaconda\envs\dato-env\lib\site-packages\boto\s3\connection.pyc in head_bucket(self, bucket_name, headers)
    519         :returns: A <Bucket> object
    520         """
--> 521         response = self.make_request('HEAD', bucket_name, headers=headers)
    522         body = response.read()
    523         if response.status == 200:

C:\Users\AppData\Local\Continuum\Anaconda\envs\dato-env\lib\site-packages\boto\s3\connection.pyc in make_request(self, method, bucket, key, headers, data, query_args, sender, override_num_retries, retry_handler)
    662             data, host, auth_path, sender,
    663             override_num_retries=override_num_retries,
--> 664             retry_handler=retry_handler
    665         )

C:\Users\AppData\Local\Continuum\Anaconda\envs\dato-env\lib\site-packages\boto\connection.pyc in make_request(self, method, path, headers, data, host, auth_path, sender, override_num_retries, params, retry_handler)
   1066                                                     params, headers, data, host)
   1067         return self._mexe(http_request, sender, override_num_retries,
-> 1068                           retry_handler=retry_handler)
   1069 
   1070     def close(self):

C:\Users\AppData\Local\Continuum\Anaconda\envs\dato-env\lib\site-packages\boto\connection.pyc in _mexe(self, request, sender, override_num_retries, retry_handler)
    911         i = 0
    912         connection = self.get_http_connection(request.host, request.port,
--> 913                                               self.is_secure)
    914 
    915         # Convert body to bytes if needed



User 1174 | 8/10/2015, 10:14:56 PM

Hi Joyce,

There is no issue with your instance type. When launching a Predictive Service, it writes a metadata file to your S3 bucket.

The error above comes from accessing the AWS S3 bucket. "[Errno 10061] No connection could be made because the target machine actively refused it" means either that no service on the server side (AWS S3) is listening for your request, or that a firewall is blocking the request. Since AWS S3 is very reliable, I believe it is more likely that your firewall is actively blocking direct connections to AWS S3.
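The refused-connection behavior is easy to reproduce locally: connecting to a port where nothing is listening raises the same class of error (errno 10061 on Windows, ECONNREFUSED elsewhere). A minimal sketch of a connectivity probe, using only the standard library:

```python
import errno
import socket

def check_tcp(host, port, timeout=3.0):
    """Return True if a TCP connection succeeds, False if it is refused.

    A refused connection (ECONNREFUSED, or WSAECONNREFUSED / 10061 on
    Windows) means nothing is listening at host:port, or a firewall
    actively rejected the attempt.
    """
    try:
        sock = socket.create_connection((host, port), timeout=timeout)
    except socket.error as e:
        if e.errno in (errno.ECONNREFUSED, 10061):
            return False
        raise  # timeouts, DNS failures, etc. are different problems
    else:
        sock.close()
        return True

# Port 1 on localhost almost never has a listener, so the connection
# attempt is actively refused rather than timing out.
print(check_tcp('127.0.0.1', 1))
```

A timeout (the connection attempt hanging, rather than being refused) would instead suggest a firewall silently dropping packets, which is a different failure mode from the error above.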


User 2146 | 8/11/2015, 3:56:45 AM

Thanks, Yifei. You are right. I tried the deployment code snippet over my home internet connection and the problem is solved.

I appreciate your help!