
The Growth of Machine Learning till TensorFlow

The Internet and the vast amounts of data it generates inspired the CEOs of big corporations to adopt Machine Learning, with the goal of providing a better experience to users.

How TensorFlow Starts

Let us take Amazon, an online retailer that uses Machine Learning. Its algorithms aim to generate revenue: based on user search data, the ML application provides information or insights.

Another example is online advertising, where Google is the leader. Its platform shows ads based on users' movements while surfing the web. These are just a few examples, but in reality there are many more.

TensorFlow is a new generation framework for Machine Learning developers. Here is the flow of how it started.
[Image: Evolution of Machine Learning to TensorFlow]

Top ML Frameworks

Torch

  • Torch is one of the earliest frameworks, developed in 2002 by Ronan Collobert. Initially, IBM and Facebook showed much interest in it.
  • Its interface language is Lua.
  • Its primary focus is matrix calculations, which makes it suitable for developing neural networks.
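To see why matrix calculations matter for neural networks, here is a minimal sketch in plain Python (not Torch's Lua API): a single neural-network layer is essentially one matrix calculation, y = Wx + b. The names and values below are illustrative.

```python
# Concept sketch: a dense neural-network layer as a matrix calculation.
def dense_layer(W, b, x):
    """Multiply weight matrix W by input vector x, then add bias b."""
    return [sum(w_ij * x_j for w_ij, x_j in zip(row, x)) + b_i
            for row, b_i in zip(W, b)]

W = [[0.5, -0.2],
     [0.1,  0.4]]   # 2x2 weight matrix
b = [0.0, 1.0]      # bias vector
x = [1.0, 2.0]      # input vector

print(dense_layer(W, b, x))  # approximately [0.1, 1.9]
```

Frameworks like Torch make this efficient by implementing such matrix operations in optimized native code.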

Theano

  • It was developed in 2010 at the University of Montreal and can run its computations on GPUs.
  • Theano stores operations in a data structure called a graph, which it compiles into high-performance code. It is accessed through Python routines.
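The graph idea above can be sketched in plain Python (this is a concept illustration, not Theano's actual API): operations are recorded as nodes first, and the whole graph is evaluated later, the way a compiled Theano function would be.

```python
# Concept sketch: build a symbolic expression graph, evaluate it later.
class Var:
    def __init__(self, name): self.name = name
    def __add__(self, other): return Op('add', self, other)
    def __mul__(self, other): return Op('mul', self, other)
    def eval(self, env): return env[self.name]

class Op(Var):
    def __init__(self, kind, left, right):
        self.kind, self.left, self.right = kind, left, right
    def eval(self, env):
        a, b = self.left.eval(env), self.right.eval(env)
        return a + b if self.kind == 'add' else a * b

# Build the graph y = (x + two) * x without computing anything yet...
x = Var('x')
two = Var('two')
y = (x + two) * x

# ...then evaluate it with concrete values bound to the variables.
print(y.eval({'x': 3, 'two': 2}))  # prints 15
```

Deferring evaluation like this is what lets a framework optimize and compile the whole graph before any numbers flow through it.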

Caffe

  • This framework is especially popular for image recognition.
  • Caffe is written in C++.
  • It is widely used in Machine Learning and neural networks.

Keras

  • It is well known for developing neural networks.
  • Its real advantages are simplicity and ease of development.
  • François Chollet created Keras as an interface to other machine learning frameworks. Many developers access Theano through Keras, combining Keras's simplicity with Theano's performance.
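The simplicity Keras is known for comes largely from its layer-stacking style of model building. The sketch below mimics that spirit in plain Python; the class and layer names are illustrative, not the real Keras API.

```python
# Concept sketch: a Sequential-style model is just an ordered stack of
# layers, each transforming the output of the previous one.
class Sequential:
    def __init__(self, layers): self.layers = layers
    def predict(self, x):
        for layer in self.layers:   # pass the input through each layer in order
            x = layer(x)
        return x

def scale(factor):   # a "layer" here is simply a function of its input
    return lambda x: [v * factor for v in x]

def relu():          # rectified linear unit: clamp negatives to zero
    return lambda x: [max(0.0, v) for v in x]

model = Sequential([scale(2.0), relu()])
print(model.predict([-1.0, 3.0]))  # prints [0.0, 6.0]
```

In real Keras, the same stacking idea applies, but each layer carries trainable weights and the stack can be compiled against a backend such as Theano or TensorFlow.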

TensorFlow

TensorFlow was developed by Google in 2015. You can use it on Google Cloud. It supports Python heavily, while the framework's core functions are developed in C++.

Takeaways

  1. The story of Machine Learning started in the 18th century.
  2. Python is the top interface language in the major ML frameworks.
  3. Python is the prime language you need for 21st-century Data Science projects.
