Tuesday, January 25, 2022

Building an Object Detection Model from Scratch in Python

Machine learning is fun!

Introduction to machine learning:

  • Machine learning is the idea that there are generic algorithms that can tell you something interesting about a set of data without you having to write any custom code specific to the problem.
  • Instead of writing code, you feed data to the generic algorithm, and it builds its own logic based on the data.

Two kinds of Machine Learning Algorithms:

  • Supervised Learning
  • Unsupervised Learning
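The supervised case can be made concrete with a tiny sketch: a generic algorithm (ordinary least-squares line fitting) learns from labeled examples without any problem-specific code. All names and numbers below are made up for illustration.

```python
# A generic supervised-learning algorithm: ordinary least-squares line fitting.
# Nothing here is specific to the problem; only the (x, y) training data is.

def fit_line(xs, ys):
    """Fit y = a*x + b to labeled examples by ordinary least squares."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    a = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / sum(
        (x - mean_x) ** 2 for x in xs
    )
    b = mean_y - a * mean_x
    return a, b

# Labeled training data: inputs paired with known answers (roughly y = 2x).
xs = [1, 2, 3, 4, 5]
ys = [2.1, 3.9, 6.2, 8.0, 9.9]

a, b = fit_line(xs, ys)

def predict(x):
    return a * x + b

print(predict(6))  # close to 12, learned from the data alone
```

Unsupervised learning, by contrast, would receive only the inputs, with no known answers attached.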

Configuring a Snowflake storage integration to access Amazon S3

Purpose of the article: Configure a Snowflake storage integration with an IAM role to access data stored in Amazon S3 buckets.

Tools and Technology: AWS, Snowflake.

Keywords: Snowflake storage integration, IAM role, Amazon S3.

Why is storage integration important?

It is a more secure way of integrating with an external agent.
  • Snowflake provides an object called a storage integration.
  • It establishes trust between Snowflake and the external agent with respect to the storage, so credentials don't have to be passed around.

Step-by-step process to integrate Snowflake with data stored in the Amazon console:

Step 1:

  • First, log in to your AWS account and search for the S3 service; we need to create an S3 bucket.

  • Click Create bucket and give the bucket a name.

  • Under Block Public Access settings for this bucket, uncheck the block option and select the acknowledgement checkbox.

  • Done: the S3 bucket is created in the AWS console.
Step 2:

What is IAM Role?

  • An IAM role is similar to an IAM user, in that it is an AWS identity with permission policies that determine what the identity can and cannot do in AWS. However, instead of being uniquely associated with one person, a role is intended to be assumable by anyone who needs it.
  • In the dashboard, search for IAM (Identity and Access Management) and open the Roles section.
  • Here we need to search for S3, as shown in the figure.
  • Click S3 to allow S3 to call AWS services on your behalf.
  • Click Next: Permissions.
  • Here we need to find and attach AmazonS3FullAccess.
  • Click Next: Tags and then Review; the role is created.


Step 3:
  • In this step we have some queries to execute in Snowflake to fetch the integration's properties.
  • Copy the STORAGE_AWS_IAM_USER_ARN and STORAGE_AWS_EXTERNAL_ID property values.
  • These values are used to connect AWS and Snowflake.
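The queries mentioned above are not shown in the post. A minimal sketch of what they usually look like in Snowflake SQL, where my_s3_int, my-bucket, and the role ARN are placeholder names, not values from the post:

```sql
-- Placeholder names: my_s3_int, my-bucket, and the role ARN are illustrative.
CREATE STORAGE INTEGRATION my_s3_int
  TYPE = EXTERNAL_STAGE
  STORAGE_PROVIDER = 'S3'
  ENABLED = TRUE
  STORAGE_AWS_ROLE_ARN = 'arn:aws:iam::123456789012:role/my_snowflake_role'
  STORAGE_ALLOWED_LOCATIONS = ('s3://my-bucket/');

-- Fetch the properties needed on the AWS side.
DESC INTEGRATION my_s3_int;
-- Copy STORAGE_AWS_IAM_USER_ARN and STORAGE_AWS_EXTERNAL_ID from this output.
```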
     


Step 4:

Why an AWS policy?
  • A policy is an object in AWS that, when associated with an identity or resource, defines their permissions. AWS evaluates these policies when an IAM principal (user or role) makes a request. Permissions in the policies determine whether the request is allowed or denied. Most policies are stored in AWS as JSON documents.
  • In this figure, notice Trust relationships -----> Edit trust relationship.
  • There we check Edit trust relationship ---> Condition ---> keys ---> values.
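The edited trust relationship typically ends up looking like the JSON below, with the two values copied from the Snowflake integration's properties pasted in place of the angle-bracket placeholders (the placeholders are mine, not from the post):

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": { "AWS": "<STORAGE_AWS_IAM_USER_ARN>" },
      "Action": "sts:AssumeRole",
      "Condition": {
        "StringEquals": { "sts:ExternalId": "<STORAGE_AWS_EXTERNAL_ID>" }
      }
    }
  ]
}
```

This is what actually establishes the trust: only Snowflake's IAM user, presenting the matching external ID, may assume the role.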


Step 5:
  • Using a query, we can verify the connection between Snowflake and AWS.
  • Then upload the CSV file into the S3 bucket, step by step.
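The verification query is not shown in the post; a sketch using a stage built on the integration (names are placeholders carried over from the earlier steps):

```sql
-- Placeholder names carried over from Step 3.
CREATE STAGE my_s3_stage
  STORAGE_INTEGRATION = my_s3_int
  URL = 's3://my-bucket/'
  FILE_FORMAT = (TYPE = 'CSV' SKIP_HEADER = 1);

-- Listing the bucket contents through the stage confirms the connection works.
LIST @my_s3_stage;
```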

Step 6:
  • Connect AWS to Snowflake by running the load query.
  • Done: the file is loaded into Snowflake.
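The load query is likewise not shown; a hedged sketch, assuming a target table my_table already exists and the stage from the previous step is in place:

```sql
-- my_table and my_s3_stage are placeholders; the table must already exist.
COPY INTO my_table
  FROM @my_s3_stage
  PATTERN = '.*[.]csv';

-- Check that the rows arrived.
SELECT COUNT(*) FROM my_table;
```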



Conclusion:
         In this way, you can use a Snowflake storage integration to access and fetch data from Amazon S3.

Monday, January 24, 2022

Why Python and not some other language?

Introduction:

  "Are you a beginner who is interested in learning code? Then, Python will be a great place for you to get started with.

There are plenty of programming languages to learn. But, Python is one of the best programming languages designed with simplicity in mind. "

But how long will that trend continue? Will Python keep rocking the world compared to other languages?

Why Python?
  • Python is one of the fastest-growing platforms nowadays.
  • It is simple and easy to learn.
  • Python is used in many domains, such as web applications, gaming applications, data science, machine learning, and artificial-intelligence model design.
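As a small taste of that simplicity, here is a complete Python program that finds the most common words in a sentence using only the standard library (the sample text is just an illustration):

```python
# Count word frequencies with the standard library: no setup, no boilerplate.
from collections import Counter

def top_words(text, n=3):
    """Return the n most common words in text as (word, count) pairs."""
    words = text.lower().split()
    return Counter(words).most_common(n)

print(top_words("the quick brown fox jumps over the lazy dog the fox"))
```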


    why not some other languages?????
  • some more languages like java c  c++ . I am not saying like its difficult but execute and understand step  are different.
  • present days this languages are  not booming technology's java c  c++.

  • As survey say that python fast booming technology  as per 2022.we can list in this fig.
  • More job opportunity in future.

Conclusion:

Hope you now have a clear idea about Python. So why are you still waiting? Python has plenty of platforms for beginners to learn from. So give Python a go!


Friday, January 21, 2022

ML MODEL DEPLOYED IN DOCKER CONTAINER AND STORE IN AWS SERVICE


Purpose of the article: Deploy a machine learning model in a Docker container and store the image in an AWS service.

Intended Audience: Useful for people working with AWS services.

Tools and Technology: AWS, Python, Docker hub

Keywords: Docker container, image push and pull, AWS EC2.

AWS Deep Learning Containers (AWS DL Containers) are Docker images pre-installed with deep learning frameworks that make it easy to deploy custom machine learning (ML) environments quickly. They skip the complicated process of building and optimizing your environment from scratch and give you high performance and scalability right away.

Why Docker?

  • Docker is a containerization service that allows websites, APIs, databases, and, in our case, data science models to be deployed anywhere.
  • Docker is lightweight, doesn't take up as much memory as other methods, and has a faster startup time.
  • Docker containers are a popular way to deploy custom ML environments that run consistently across multiple environments.

Step 1:

  • Write the Flask code in VS Code, then run it and check it on the local server.
  • Check the output.
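The post's Flask code itself is not shown; the sketch below is a hypothetical stand-in (route, message, and port are illustrative), just enough to have something to run locally and containerize later.

```python
# Hypothetical minimal Flask app; the original post's code is not shown.
from flask import Flask

app = Flask(__name__)

@app.route("/")
def home():
    # In a real deployment this would call the ML model.
    return "Model is up and running!"

# To serve it locally (blocks until stopped):
#   app.run(host="0.0.0.0", port=5000)
# Binding to 0.0.0.0 makes the app reachable from outside a container later.
```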

Step 2:
  • After executing the Flask code,
  • we need to go to the Docker configuration and then build it.

Step 3:
  • Download and install Docker on your laptop.
  • Click to log in with your Docker Hub credentials.


  • Copy that URL, paste it into the terminal, and execute it.



  • Then go to VS Code and open the Docker extension.
  • Right-click and select Build Image; it will build the image for the container.
  • As shown below.



  • The image is loaded into the container.
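The post builds the image through the VS Code Docker extension but never shows the Dockerfile. A minimal sketch, assuming the Flask app lives in app.py with its dependencies listed in requirements.txt (both hypothetical names):

```dockerfile
# Assumes app.py and requirements.txt sit next to this Dockerfile.
FROM python:3.9-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt
COPY . .
EXPOSE 5000
CMD ["python", "app.py"]
```

With a file like this in place, docker build -t yourname/flask-model . builds the image and docker push yourname/flask-model sends it to Docker Hub (the tag is a placeholder).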

Step 3 (continued):
  • We need to check whether the image is loaded in the container.
  • First open Docker, then go to Images and check.
  • The final image is pushed into the container.



Step 4:

I. Create an AWS account and log in

  • Our goal is to deploy the Flask application in Docker to the AWS Cloud.
  • For this, we need to have an AWS account.
  • After creating the account, go to the AWS Management Console and sign in with the account you have just created.

II. Create EC2 instance

  • In the AWS Management Console, search for EC2 and click on Running Instances.
  • Click Launch Instance to create the new EC2 instance.

  • Go to the EC2 dashboard, click Launch Instance, and then create the instance.


III. Choose an Amazon Machine Image (AMI)

  • We need to choose an AMI, which means choosing an OS for the server.
  • In this case, we can choose Ubuntu 20.04, which is free tier eligible, and proceed to the next step.


IV. Choose Instance Type
  • Now we need to choose an instance type for our server.
  • In this case, we can go with a free-tier instance.

V. Creating a Private Key Pair
  • Once you have followed the above steps, you will be able to click the Launch button in “Review Instance Launch”. You will get a pop-up for choosing a private key pair.
  • Let’s click Create a new key pair and choose a key pair name.
  • Now we need to click the Download Key Pair button to download it.

  • Done: the instance is launched.

    We have now created an EC2 instance. Go to View Instances, where you will find the instance you have created.



Step 5:
  • Download PuTTY and PuTTYgen
  • https://www.putty.org/
PuTTY: It is a free and open-source terminal emulator, serial console, and network file transfer application. It supports several network protocols, including SCP, SSH, Telnet, rlogin, and raw socket connection. It can also connect to a serial port.

  • Convert the private key from PEM format, which you downloaded previously, to PPK format using PuTTYgen.
  • First open PuTTYgen, load the downloaded .pem file, and save it as a .ppk file.





Step 6:

  • Open the AWS EC2 instance page and copy the Public IPv4 DNS link.




  • Open PuTTY and paste the Public IPv4 DNS into the Host Name field.


  • In the Category panel, expand SSH and click Auth.
  • Browse for the .ppk file and select it.
  • Insert the created private key in PPK format under Authentication.
  • Save the session and click Open; it will connect to the server.

  • Some commands used in Ubuntu to set up the AWS server:



sudo apt update

sudo apt install docker.io


  • We need to give our Docker login credentials (run docker login).

Step 7:
  • Open Docker Hub to check whether the image has loaded or not.



  • Using some commands, we can push the image, as shown in the figure.




We need to check whether the container ID was created or not. If the container ID was created, the image was pushed into the container.

Conclusion:
  • In this way, we can store the image and data from Docker in AWS services.