DEPLOYING AN ML MODEL IN A DOCKER CONTAINER AND STORING IT IN AN AWS SERVICE
Purpose of the article: Deploy a machine learning model in a Docker container and store and run the container image on AWS.
Intended Audience: People working with AWS services
Tools and Technology: AWS, Python, Docker Hub
Keywords: Docker container, Docker image, AWS, Flask.
AWS Deep Learning Containers (AWS DL Containers) are Docker images pre-installed with deep learning frameworks that make it easy to deploy custom machine learning (ML) environments quickly. They let you skip the complicated process of building and optimizing your environment from scratch, and they offer high performance and scalability right away.
Why Docker?
- Docker is a containerization platform that allows websites, APIs, databases, and, in our case, data science models to be deployed anywhere.
- Docker is lightweight: containers use less memory than full virtual machines and start up faster.
- Docker containers are a popular way to package custom ML environments so that they run consistently across different environments.
Step 1:
- Write the Flask code in VS Code, then run it and check it on the local server.
- Verify the output in the browser (a minimal sketch of such an app is shown below).
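The post does not include the application code itself, so here is a minimal sketch of the kind of Flask app being deployed; the file name app.py, the route, and the port are assumptions, not taken from the original.

```python
# app.py - minimal illustrative Flask app (assumed file name and route)
from flask import Flask

app = Flask(__name__)

@app.route("/")
def home():
    # A real ML deployment would load the model here and return predictions.
    return "Hello from the Flask app!"

if __name__ == "__main__":
    # Bind to 0.0.0.0 so the app is reachable from outside the container later.
    app.run(host="0.0.0.0", port=5000)
```

Running `python app.py` and opening http://localhost:5000 in a browser is enough to confirm the app works locally.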
Step 2:
- Download and install Docker Desktop on the laptop.
- Click to log in with your Docker Hub credentials (a command-line equivalent is sketched after this step).
- Then go to VS Code and open the Docker extension.
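If you prefer the terminal to the Docker Desktop window, the sign-in can also be done from the command line; the username is a placeholder.

```bash
# Log in to Docker Hub; you will be prompted for your password
docker login -u <your-dockerhub-username>
```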
Step 3:
- In the Docker extension, right-click the Dockerfile and select Build Image; this builds the image for the container, as shown in the sketch below.
- Next, we need to check whether the image was built and loaded.
- Open Docker Desktop, go to Images, and check that the image is listed.
- Finally, the image is pushed to Docker Hub.
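The post shows these steps as screenshots; below is a sketch of a minimal Dockerfile for the Flask app above and the command-line equivalents of the build, check, and push steps. The image name flask-ml-app and the Docker Hub username are placeholders, not from the original.

```dockerfile
# Dockerfile - minimal sketch for the Flask app above
FROM python:3.9-slim
WORKDIR /app
# requirements.txt is assumed to list flask
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
EXPOSE 5000
CMD ["python", "app.py"]
```

```bash
# Build the image, confirm it exists locally, then push it to Docker Hub
docker build -t <your-dockerhub-username>/flask-ml-app:latest .
docker images                     # the new image should appear in this list
docker push <your-dockerhub-username>/flask-ml-app:latest
```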
Step 4:
I. Create an AWS account and log in
- Our goal is to deploy the Dockerized Flask application to the AWS cloud.
- For this, we need to have an AWS account.
- After creating the account, go to the AWS Management Console and sign in with the account you have just created.
II. Create an EC2 instance
- In the AWS Management Console, search for EC2 and click on the running instances.
- From the EC2 dashboard, click Launch Instance to create the new EC2 instance.
III. Choose an Amazon Machine Image (AMI)
- We need to choose an AMI, which means choosing an OS for the server.
- In this case, we can choose Ubuntu 20.04, which is free-tier eligible, and proceed to the next step.
- Now we need to choose an instance type for our server.
- In this case, we can go with a free-tier instance.
- Once you have followed the above steps, you will be able to click the Launch button on “Review Instance Launch”. You will get a pop-up for choosing a key pair.
- Click on “Create a new key pair” and choose a key pair name.
- Now click on the Download Key Pair button to download it.
- Done with creating and launching the instance (an equivalent AWS CLI sketch is shown below).
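The console steps above can also be done from the AWS CLI; here is a sketch, assuming the CLI is installed and configured. The key pair name is an example and the AMI ID is a placeholder you must replace with the Ubuntu 20.04 AMI for your region.

```bash
# Create a key pair and save the private key locally
aws ec2 create-key-pair --key-name my-keypair \
    --query "KeyMaterial" --output text > my-keypair.pem

# Launch a free-tier instance from the chosen AMI
aws ec2 run-instances \
    --image-id ami-xxxxxxxxxxxxxxxxx \
    --instance-type t2.micro \
    --key-name my-keypair \
    --count 1
```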
Step 5:
- Download PuTTY and PuTTYgen:
- https://www.putty.org/
- Convert the private key you downloaded earlier from PEM format to PPK format using PuTTYgen.
- To do this, open PuTTYgen, load the .pem file, and save the private key; it is converted to a .ppk file automatically (a command-line alternative is sketched below).
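On Windows this conversion is done in the PuTTYgen window. If you are on Linux or WSL with the putty-tools package installed, the same conversion can be sketched from the command line; the file names are placeholders.

```bash
# Convert the downloaded AWS .pem key to PuTTY's .ppk format
puttygen my-keypair.pem -o my-keypair.ppk
```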
Step 6:
- Open the AWS EC2 instance details and copy the Public IPv4 DNS.
- Open PuTTY and paste the Public IPv4 DNS into the Host Name field.
- In the category tree, expand SSH and click on Authentication.
- Browse for the .ppk file created earlier and select it, so that the private key in PPK format is used for authentication.
- Save the session and log in; PuTTY will connect to the server (an ssh alternative is sketched below).
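PuTTY is only needed on Windows; on macOS, Linux, or WSL you can connect directly with the downloaded .pem key. A sketch, assuming the default ubuntu user of the Ubuntu AMI and using the placeholder key name from above:

```bash
# Restrict the key's permissions, then SSH into the instance
chmod 400 my-keypair.pem
ssh -i my-keypair.pem ubuntu@<public-ipv4-dns>
```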
Some commands are used on the Ubuntu server to set it up and run the image:
- First, check on Docker Hub whether the image was uploaded successfully.
- Using a few commands on the server, we can pull and run the pushed image, as sketched below.
- In this way we can store the Docker image and its data on AWS services.
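The post shows these Ubuntu commands as a screenshot; below is a sketch of the typical sequence, assuming the image was pushed to Docker Hub earlier and using the same placeholder names as above.

```bash
# On the EC2 Ubuntu server: install Docker, pull the image, and run the container
sudo apt-get update
sudo apt-get install -y docker.io
sudo docker login -u <your-dockerhub-username>
sudo docker pull <your-dockerhub-username>/flask-ml-app:latest
sudo docker run -d -p 5000:5000 <your-dockerhub-username>/flask-ml-app:latest
sudo docker ps    # confirm the container is running
```

The Flask app is then reachable at http://<public-ipv4-dns>:5000, provided the instance's security group allows inbound traffic on port 5000.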