Tuesday, January 25, 2022
Machine Learning is Fun!
Introduction to machine learning:
- Machine learning is the idea that there are generic algorithms that can tell you something interesting about a set of data without you having to write any custom code specific to the problem.
- Instead of writing code, you feed data to the generic algorithm and it builds its own logic based on the data.
Two kinds of Machine Learning Algorithms:
- Supervised Learning
- Unsupervised Learning
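As a minimal illustration of the two kinds, here is a sketch using scikit-learn on tiny made-up data (the arrays, models, and numbers are purely illustrative): the supervised model learns from labeled examples, while the unsupervised model finds structure in unlabeled data.

    from sklearn.linear_model import LinearRegression
    from sklearn.cluster import KMeans

    # Supervised learning: we feed inputs AND answers (labels), and the
    # generic algorithm builds the mapping itself.
    X = [[1], [2], [3], [4]]      # e.g. house size
    y = [100, 200, 300, 400]      # e.g. house price
    model = LinearRegression().fit(X, y)
    print(model.predict([[5]]))   # ~[500.], with no problem-specific code

    # Unsupervised learning: we feed only inputs, and the algorithm
    # discovers structure (here, two clusters) on its own.
    points = [[1, 1], [1, 2], [8, 8], [9, 8]]
    print(KMeans(n_clusters=2, n_init=10).fit_predict(points))  # e.g. [1 1 0 0]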
Configuring a Snowflake storage integration to access Amazon S3
Purpose of the article: Configure a Snowflake storage integration with an IAM role to store data in Amazon S3 buckets.
Tools and Technology: AWS, Snowflake.
Keywords: Connect Snowflake and store data in Amazon S3.
Why is storage integration important?
- A storage integration is the more secure way Snowflake provides to integrate with an external agent.
- It establishes trust between Snowflake and the external agent with respect to the storage.
Step-by-step process to connect Snowflake and store the data in Amazon S3:
Step 1:
- First, log in to your AWS account and search for the S3 service; we need to create an S3 bucket.
- After clicking Create bucket, give the bucket a name.
- Under the Block Public Access settings for this bucket, uncheck the blocking option and select the acknowledgement checkbox.
- Done: the S3 bucket is created in the AWS console (a boto3 sketch of this step follows).
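For anyone who prefers to script this step, here is a minimal sketch using boto3; the bucket name and region are placeholders, and AWS credentials are assumed to be configured already:

    import boto3

    BUCKET_NAME = "my-snowflake-demo-bucket"   # hypothetical name
    REGION = "us-east-1"

    s3 = boto3.client("s3", region_name=REGION)

    # us-east-1 is the default region and must not pass a LocationConstraint.
    if REGION == "us-east-1":
        s3.create_bucket(Bucket=BUCKET_NAME)
    else:
        s3.create_bucket(
            Bucket=BUCKET_NAME,
            CreateBucketConfiguration={"LocationConstraint": REGION},
        )
    print("Created bucket:", BUCKET_NAME)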
- An IAM role is similar to an IAM user, in that it is an AWS identity with permission policies that determine what the identity can and cannot do in AWS. However, instead of being uniquely associated with one person, a role is intended to be assumable by anyone who needs it.
- Search for roles in the dashboard and select the IAM (Identity and Access Management) service.
- While creating the role, search for S3, as shown in the figure.
- Select S3, allowing S3 to call AWS services on your behalf.
- Click Next to move on to permissions (a boto3 sketch of this role creation is below).
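The role can also be created programmatically. A minimal boto3 sketch, where the role name is a placeholder and the trust policy is only temporary (it gets replaced with Snowflake's values in a later step):

    import json

    import boto3

    iam = boto3.client("iam")

    # Temporary trust policy matching the "S3 calls AWS services on your
    # behalf" use case; it is tightened with Snowflake's values later.
    temporary_trust = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"Service": "s3.amazonaws.com"},
            "Action": "sts:AssumeRole",
        }],
    }

    iam.create_role(
        RoleName="snowflake-s3-access-role",   # hypothetical name
        AssumeRolePolicyDocument=json.dumps(temporary_trust),
    )

    # Grant S3 permissions (scope this down to one bucket in real use).
    iam.attach_role_policy(
        RoleName="snowflake-s3-access-role",
        PolicyArn="arn:aws:iam::aws:policy/AmazonS3FullAccess",
    )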
- Next, we have some queries to execute in Snowflake: create the storage integration, then describe it to fetch its properties.
- Copy the values of the STORAGE_AWS_IAM_USER_ARN and STORAGE_AWS_EXTERNAL_ID properties.
- These two values are what connect AWS and Snowflake (the queries are sketched below).
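Here is a minimal sketch of those queries using the Snowflake Python connector; the account, credentials, role ARN, and bucket location are placeholders, and the Snowflake user needs a role allowed to create integrations (e.g. ACCOUNTADMIN):

    import snowflake.connector

    conn = snowflake.connector.connect(
        account="my_account",      # hypothetical account identifier
        user="my_user",
        password="my_password",
    )
    cur = conn.cursor()

    # Create the storage integration pointing at the IAM role and bucket.
    cur.execute("""
        CREATE STORAGE INTEGRATION s3_int
          TYPE = EXTERNAL_STAGE
          STORAGE_PROVIDER = 'S3'
          ENABLED = TRUE
          STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/snowflake-s3-access-role'
          STORAGE_ALLOWED_LOCATIONS = ('s3://my-snowflake-demo-bucket/')
    """)

    # Describe it and pick out the two property values to copy.
    cur.execute("DESC INTEGRATION s3_int")
    props = {row[0]: row[2] for row in cur.fetchall()}
    print(props["STORAGE_AWS_IAM_USER_ARN"])
    print(props["STORAGE_AWS_EXTERNAL_ID"])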
- A policy is an object in AWS that, when associated with an identity or resource, defines their permissions. AWS evaluates these policies when an IAM principal (user or role) makes a request. Permissions in the policies determine whether the request is allowed or denied. Most policies are stored in AWS as JSON documents.
- In this figure we notice the Trust relationships tab; click Edit trust relationship.
- In the trust policy, set the principal and the sts:ExternalId condition key to the two values copied above (edit trust relationship -> condition -> keys -> value).
- Using a query against the integration, we can then verify the connection between Snowflake and AWS (a boto3 sketch of the trust-policy edit is below).
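A minimal boto3 sketch of that trust-policy edit, following the trust-policy format from Snowflake's S3 configuration docs; the ARN and external ID below are placeholders for the values copied from DESC INTEGRATION:

    import json

    import boto3

    # Values copied from DESC INTEGRATION (placeholders here).
    SNOWFLAKE_IAM_USER_ARN = "arn:aws:iam::999999999999:user/example"
    SNOWFLAKE_EXTERNAL_ID = "MYACCOUNT_SFCRole=2_EXAMPLE="

    trust_policy = {
        "Version": "2012-10-17",
        "Statement": [{
            "Effect": "Allow",
            "Principal": {"AWS": SNOWFLAKE_IAM_USER_ARN},
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {"sts:ExternalId": SNOWFLAKE_EXTERNAL_ID},
            },
        }],
    }

    iam = boto3.client("iam")
    iam.update_assume_role_policy(
        RoleName="snowflake-s3-access-role",   # same role as above
        PolicyDocument=json.dumps(trust_policy),
    )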
Monday, January 24, 2022
Why Python and not some other language?
There are plenty of programming languages to learn, but Python is one of the best, designed with simplicity in mind.
How long will that trend continue, and why does Python rock the world compared to other languages? A few reasons:
- Python is a fast-growing platform nowadays.
- It is simple and easy to learn.
- Python is used in many domains, like web applications, gaming applications, data science, machine learning, and artificial intelligence model design.
- Other languages like Java, C, and C++ are not necessarily difficult, but their execution and the steps to understand them are different.
- These days, Java, C, and C++ are not the booming technologies.
- Surveys say Python is the fastest-booming technology as of 2022; we can see the list in this figure.
- It brings more job opportunities in the future.
Hope you have a clear idea about Python now. So why are you still waiting? Python has plenty of platforms for beginners to learn from, so give Python a go!
Friday, January 21, 2022
ML MODEL DEPLOYED IN A DOCKER CONTAINER AND STORED ON AN AWS SERVICE
Purpose of the article: Deploy a machine learning model in a Docker container and store it on an AWS service.
Intended Audience: Useful for people working with AWS services.
Tools and Technology: AWS, Python, Docker hub
Keywords: Docker container, pulling and pushing the image, AWS.
AWS Deep Learning Containers (AWS DL Containers) are Docker images pre-installed with deep learning frameworks that make it easy to deploy custom machine learning (ML) environments quickly, skipping the complicated process of building and optimizing your environments from scratch while giving you high performance and scalability right away.
Why Docker?
- Docker is a containerization service that allows websites, APIs, databases, and, in our case, data science models to be deployed anywhere.
- Docker is lightweight, doesn't take up as much memory as other methods, and has a faster startup time.
- Docker containers are a popular way to deploy custom ML environments that run consistently across multiple environments.
Step 1:
- Write the Flask code in VS Code, then run it and check it on the local server.
- Verify the output (a minimal stand-in app is sketched below).
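The article's Flask source is not shown, so here is a minimal stand-in app (the route and messages are illustrative) that you can run and check on the local server:

    from flask import Flask, jsonify

    app = Flask(__name__)

    @app.route("/")
    def home():
        # Stand-in for the real model-serving endpoint.
        return jsonify({"status": "ok", "message": "ML model service is running"})

    if __name__ == "__main__":
        # Bind to 0.0.0.0 so the app is reachable once it runs in a container.
        app.run(host="0.0.0.0", port=5000)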
- Open (and, if needed, install) Docker on your laptop.
- Click Login and sign in with your Docker Hub credentials.
- Then go to VS Code and click the Docker extension.
- Right-click the Dockerfile and select Build Image; it will build the container image.
- As shown below.
- We need to check whether the image was loaded into the local image list or not.
- First open Docker, then go to Images and check.
- Finally, the image is pushed to the registry (a Docker SDK sketch of the build and push follows).
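The same build-and-push flow can be scripted with the Docker SDK for Python; the image and repository names are placeholders, and a Dockerfile is assumed to sit next to the Flask app:

    import docker

    client = docker.from_env()

    # Build the image from the Dockerfile in the current directory.
    image, build_logs = client.images.build(path=".", tag="myuser/flask-ml-demo:latest")

    # Confirm the image is in the local image list.
    print([img.tags for img in client.images.list()])

    # Push it to Docker Hub (assumes you are already logged in).
    for line in client.images.push("myuser/flask-ml-demo", tag="latest",
                                   stream=True, decode=True):
        print(line)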
Step 4:
I. Create an AWS account and log in
- Our goal is to deploy the Flask application's Docker image to the AWS cloud.
- For this, we need to have an AWS account.
- After creating the account, go to the AWS Management Console and sign in with the account you have just created.
II. Create an EC2 instance
- In the AWS Management Console, search for EC2 and click on Running instances.
- From the EC2 dashboard, click Launch Instance to create the new EC2 instance.
III. Choose an Amazon Machine Image (AMI)
- We need to choose an AMI, which means choosing an OS for the server.
- In this case, we can choose Ubuntu 20.04, which is free-tier eligible, and proceed to the next step.
- Now we need to choose an instance type for our server; in this case, we can go with a free-tier instance.
- Once you have followed the above steps, you will be able to click the Launch button in “Review Instance Launch”. You will get a pop-up for choosing a key pair.
- Click Create a new key pair and choose a key pair name.
- Now click the Download Key Pair button to download it.
- Done with the instance launch (a boto3 sketch of the same launch follows).
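If you prefer to script the launch, here is a minimal boto3 sketch of the same steps; the AMI ID is a placeholder, since the Ubuntu 20.04 image ID differs per region:

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Create a key pair and save the private key, like the console download.
    key = ec2.create_key_pair(KeyName="flask-demo-key")
    with open("flask-demo-key.pem", "w") as f:
        f.write(key["KeyMaterial"])

    # Launch one free-tier instance from an Ubuntu 20.04 AMI (placeholder ID).
    resp = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",   # look up the real ID for your region
        InstanceType="t2.micro",
        KeyName="flask-demo-key",
        MinCount=1,
        MaxCount=1,
    )
    print("Launched instance:", resp["Instances"][0]["InstanceId"])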
Step 5:
- Download PuTTY and PuTTYgen from https://www.putty.org/.
- Convert the private key from the PEM format you downloaded previously to PPK format using PuTTYgen: open PuTTYgen, load the .pem file, and it will be converted so you can save it as a .ppk file.
Step 6:
- Open the AWS EC2 instance page and copy the public IPv4 DNS link.
- Open PuTTY and paste the public IPv4 DNS into the Host Name field.
- In the Category panel, expand Connection, select SSH, then click Auth.
- Browse for the .ppk file and insert the created private key in PPK format.
- Save the session and click Open; finally, log in and it will connect to the server (a paramiko sketch of the same connection is below).
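As a scripted alternative to PuTTY, here is a minimal sketch with the paramiko library, which can use the original .pem key directly (the host name and key path are placeholders):

    import paramiko

    HOST = "ec2-12-34-56-78.compute-1.amazonaws.com"   # your public IPv4 DNS
    KEY_PATH = "flask-demo-key.pem"

    client = paramiko.SSHClient()
    client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
    client.connect(HOST, username="ubuntu", key_filename=KEY_PATH)

    # Run a command on the server to confirm the connection works.
    stdin, stdout, stderr = client.exec_command("uname -a")
    print(stdout.read().decode())
    client.close()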
Some commands used on the Ubuntu server to run the deployment:
- Open Docker Hub and check whether the image loads or not.
- Using some commands we can pull the image, as shown in the image.
- In this way we can store the image and data from Docker on AWS services (the pull-and-run step is sketched below).
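To sketch that last step, here is the pull-and-run flow with the Docker SDK for Python as it would run on the EC2 server (the repository name and port mapping reuse the earlier placeholders; Docker must already be installed on the server):

    import docker

    client = docker.from_env()

    # Pull the image that was pushed to Docker Hub earlier.
    client.images.pull("myuser/flask-ml-demo", tag="latest")

    # Run it, mapping Flask's port 5000 to port 80 on the server.
    container = client.containers.run(
        "myuser/flask-ml-demo:latest",
        ports={"5000/tcp": 80},
        detach=True,
    )
    print("Container running:", container.short_id)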