
What is "Learning Algorithm" in Machine Learning

#What is a machine learning algorithm
Just the basics of Machine Learning

Alice has just begun taking a course on machine learning. She knows that at the end of the course, she will be expected to have “learned” all about this topic. A common way of gauging whether or not she has learned is for her teacher, Bob, to give her an exam. She has done well at learning if she does well on the exam.

But what makes a reasonable exam? If Bob spends the entire semester talking about machine learning, and then gives Alice an exam on History of Pottery, then Alice’s performance on this exam will not be representative of her learning. On the other hand, if the exam only asks questions that Bob has answered exactly during lectures, then this is also a bad test of Alice’s learning, especially if it’s an “open notes” exam. What is desired is that Alice observes specific examples from the course, and then has to answer new, but related, questions on the exam. This tests whether Alice has the ability to generalize. Generalization is perhaps the most central concept in machine learning. A hands-on machine learning with data science course is really useful for building this intuition.

(Figure: the general supervised approach to machine learning. A learning algorithm reads in training data and computes a learned function f. This function can then automatically label future text examples.)
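To make the figure’s pipeline concrete, here is a minimal sketch, assuming scikit-learn is available; the tiny review sentences, their labels, and the choice of a bag-of-words naive Bayes model are all invented for illustration, not taken from any particular system. The learning algorithm reads the training data, computes a learned function f, and f is then asked to label sentences it has never seen.

```python
# Minimal supervised-learning sketch (illustrative; assumes scikit-learn is installed).
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# "Course material": a handful of labelled example sentences, invented for this sketch.
train_texts = [
    "awesome course, learned a lot",
    "terrible lectures, very boring",
    "great projects and clear explanations",
    "boring and badly organised",
]
train_labels = ["positive", "negative", "positive", "negative"]

# The learning algorithm reads the training data and computes a learned function f.
f = make_pipeline(CountVectorizer(), MultinomialNB())
f.fit(train_texts, train_labels)

# The "exam": new, related sentences that never appeared in the training data.
print(f.predict(["clear and awesome explanations", "badly organised and boring"]))
```

The point of the exam lines is that they are new: scoring f only on the four training sentences would be the “open notes” exam again, and would say very little about generalization.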

Examples for Machine Learning

As a running concrete example, we will use that of a course recommendation system for undergraduate computer science students. We have a collection of students and a collection of courses. Each student has taken, and evaluated, a subset of the courses. The evaluation is simply a score from −2 (terrible) to +2 (awesome).

The job of the recommender system is to predict how much a particular student (say, Alice) will like a particular course (say, Algorithms).
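As a sketch of how this data and task might look in code, the toy rating table and the averaging rule below are hypothetical, not part of any real recommender; the simplest possible predictor just averages how other students rated the course in question.

```python
# Hypothetical course-recommendation sketch: ratings run from -2 (terrible) to +2 (awesome).
from statistics import mean

# Historical evaluations: (student, course) -> rating. Toy data invented for illustration.
ratings = {
    ("Alice", "Artificial Intelligence"): +2,
    ("Alice", "Operating Systems"): -1,
    ("Bob", "Algorithms"): +1,
    ("Bob", "Artificial Intelligence"): +2,
    ("Carol", "Algorithms"): +2,
    ("Carol", "History of Pottery"): -2,
}

def predict_rating(student: str, course: str) -> float:
    """Naive baseline: average the ratings other students gave this course.

    Falls back to the student's own average, then to 0 (neutral), when there is
    no relevant history. A real recommender would do far more than this.
    """
    course_scores = [r for (s, c), r in ratings.items() if c == course and s != student]
    if course_scores:
        return mean(course_scores)
    own_scores = [r for (s, c), r in ratings.items() if s == student]
    return mean(own_scores) if own_scores else 0.0

print(predict_rating("Alice", "Algorithms"))  # averages Bob's +1 and Carol's +2 -> 1.5
```

This course-average baseline is only a starting point; the interesting question is how well any predictor does on student/course pairs it has never observed.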

Given historical data from course ratings (i.e., the past), we are trying to predict unseen ratings (i.e., the future). Now, we could be unfair to this system as well. We could ask it whether Alice is likely to enjoy the History of Pottery course. This is unfair because the system has no idea what History of Pottery even is, and has no prior experience with this course. On the other hand, we could ask it how much Alice will like Artificial Intelligence, which she took last year and rated as +2 (awesome).

We would expect the system to predict that she would really like it, but this isn’t demonstrating that the system has learned: it’s simply recalling its past experience. In the former case, we’re expecting the system to generalize beyond its experience, which is unfair. In the latter case, we’re not expecting it to generalize at all.

This general setup of predicting the future based on the past is at the core of most machine learning. The objects that our algorithm will make predictions about are examples. In the recommender system setting, an example would be some particular Student/Course pair (such as Alice/Algorithms). The desired prediction would be the rating that Alice would give to Algorithms.
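One way to measure this, sketched below with invented numbers and a course-average baseline (both hypothetical), is to hide some of the historical ratings and score the system only on the hidden part: the visible half plays the role of the past, the hidden half the role of the future.

```python
# Hypothetical evaluation sketch: "past" ratings are visible, "future" ratings are hidden.
from statistics import mean

past = {  # training examples: (student, course) -> rating
    ("Bob", "Algorithms"): +1,
    ("Carol", "Algorithms"): +2,
    ("Alice", "Artificial Intelligence"): +2,
    ("Bob", "Artificial Intelligence"): +2,
}
future = {  # held-out examples the system must predict without peeking
    ("Alice", "Algorithms"): +2,
    ("Carol", "Artificial Intelligence"): +1,
}

def predict(student: str, course: str) -> float:
    """Baseline that ignores the student: average past rating of the course, else neutral 0."""
    scores = [r for (s, c), r in past.items() if c == course]
    return mean(scores) if scores else 0.0

errors = [abs(predict(s, c) - true_rating) for (s, c), true_rating in future.items()]
print("mean absolute error on unseen pairs:", mean(errors))
```

Asking this predictor about History of Pottery, a course with no past ratings at all, forces the neutral fallback of 0, which is the code-level version of the unfair question above: the system simply has nothing to generalize from.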
