Case Studies: Analyzing Sentiment & Loan Default Prediction

In our case study on analyzing sentiment, you will create models that predict a class (positive/negative sentiment) from input features (text of the reviews, user profile information, …). In our second case study for this course, loan default prediction, you will tackle financial data and predict when a loan is likely to be risky or safe for the bank. These tasks are examples of classification, one of the most widely used areas of machine learning, with a broad array of applications including ad targeting, spam detection, medical diagnosis and image classification.

In this course, you will create classifiers that provide state-of-the-art performance on a variety of tasks. You will become familiar with the most successful techniques, which are most widely used in practice, including logistic regression, decision trees and boosting. In addition, you will be able to design and implement the underlying algorithms that can learn these models at scale, using stochastic gradient ascent. You will implement these techniques on real-world, large-scale machine learning tasks. You will also address significant issues you will face in real-world applications of ML, including handling missing data and measuring precision and recall to evaluate a classifier. This course …
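To make the pieces named above concrete (a logistic regression classifier, stochastic gradient ascent, and precision/recall), here is a minimal sketch in plain Python/NumPy. It is illustrative only: the tiny reviews and labels are made up, and it does not reflect the course's own notebooks or its GraphLab Create library.

# Minimal sketch (illustrative, not course material): a bag-of-words logistic
# regression sentiment classifier trained with stochastic gradient ascent,
# then evaluated with precision and recall. The reviews/labels are made up.
import numpy as np

reviews = ["great product loved it", "awful waste of money",
           "really great value", "terrible awful quality",
           "loved the great design", "waste of time terrible"]
labels = np.array([1, 0, 1, 0, 1, 0])   # 1 = positive, 0 = negative sentiment

# Bag-of-words word counts, plus an intercept column.
vocab = sorted({w for r in reviews for w in r.split()})
X = np.array([[r.split().count(w) for w in vocab] for r in reviews], dtype=float)
X = np.hstack([np.ones((len(reviews), 1)), X])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Stochastic gradient ascent on the log likelihood: for each example, step in
# the direction (y_i - P(y_i = 1 | x_i)) * x_i.
rng = np.random.default_rng(0)
w = np.zeros(X.shape[1])
step_size = 0.5
for epoch in range(200):
    for i in rng.permutation(len(labels)):
        error = labels[i] - sigmoid(X[i] @ w)
        w += step_size * error * X[i]

# Precision and recall (computed on the training data, for illustration only).
pred = (sigmoid(X @ w) >= 0.5).astype(int)
tp = np.sum((pred == 1) & (labels == 1))
fp = np.sum((pred == 1) & (labels == 0))
fn = np.sum((pred == 0) & (labels == 1))
precision = tp / (tp + fp) if (tp + fp) else 0.0
recall = tp / (tp + fn) if (tp + fn) else 0.0
print(f"precision={precision:.2f}  recall={recall:.2f}")

On this toy data the classifier separates the two classes easily; the point is only to show how the gradient ascent update and the precision/recall bookkeeping fit together.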
Instructor Details
Courses: 2
Specification: Machine Learning: Classification
52 reviews for Machine Learning: Classification
Akash G –
good
Reinhold L –
Very good course for classification in machine learning – top presentation documents – very well structured and practical
Shazia B –
One of the best experiences I have gained from a course. I learned a lot about machine learning classification, and further about machine learning regression. Thanks a lot, Coursera 🙂
Ashish C –
More topics like deep learning and neural networks need to be introduced.
Shashidhar Y –
Nice!!
Neelkanth S M –
The content is good, but completing the assignments is a real pain because they chose to deploy an unstable proprietary Python library, which gives a hard time installing and running (as of Q1 2019). The entire learning experience is marred by this GraphLab Python library.
Martin B –
As with all the courses in this specialization: great production values, excellent tuition. Useful assignments, even though the reliance on GraphLab Create is a bit of a drag. I also would have liked to see some discussion of Support Vector Machines.
YASHKUMAR R T –
This course will provide you clear and detailed explanation of all the topics of Classification.
MAO M –
lots of work. very good for beginners
akashkr1498 –
Good course, but make the quizzes and assignment quizzes more understandable.
Miguel A B P –
Excellent course!
Vibhutesh K S –
It was a very detailed course. I wish I had done it much earlier in my research career. Great insights and exercises.
Gaurav C –
Would have loved it even more had Carlos explained gradient boosting to his students as well. I liked the way he taught in the lectures.
Karthik M –
Excellent course and the instructors cover all the important topics
Dohyoung C –
Great … I learned quite a lot about classification
Md s –
Awesome course, have learned a lot of stuff.
sudheer n –
The way Carlos Guestrin explains things is exquisite. If the basics are what matter most to you, and you can learn code implementation and libraries from other sources, this is the go-to course.
lokeshkunuku –
It's been 3 weeks since I started this course, and it has been so nice and awesome. The lectures, the explanations and the slides were all well crafted and easy to pick up and understand.
Lewis C L –
First, Coursera is a ghost town. There is no activity on the forum; real responses stopped a year ago, and most of the activity is from 3 years ago. This course is dead.

Second, this course seems to approach the topic by teaching inadequate ways to perform various tasks in order to show the inadequacies. You can learn from that; we will make mistakes or use approaches that are less than ideal. But that should be a quick “don’t do this” while moving on to better approaches.

Third, the professors seem to dismiss batch learning as a “dodgy” technique. If Hinton, Bengio, and other intellectual leaders of the field recommend it as the preferred technique, then it probably is.

Fourth, the professors emphasize log likelihood. Mathematically, minus the log likelihood is the same as the cross-entropy cost. The latter is more robust and applicable to nearly every classification problem (except decision trees), and so is a more versatile formulation. As neither actually plays any role in the training algorithm except as guidance for the gradient and epsilon formulas and as a diagnostic, the more versatile and robust approach should be preferred.

The professors seem very focused on decision trees. Despite the “apparent” intuitive appeal and computational tractability, the technique seems to have been eclipsed by other methods. Worth teaching and occasionally using, to be sure, but not for 3/4 of the course.

There are many mechanical problems that remain in the material. At least 6 errors in formulas or instructions remain. Most can be searched for on the forum to find some resolution, through a lot of noise. Since the last corrections were made 3 years ago, UW’s or Coursera’s lack of interest shows.

It was a bit unnecessary to use a huge dataset that resulted in a training matrix of over 10 billion cells. Sure, if you wanted to focus on methods for scaling (very valuable indeed), go for it. But this led to unnecessarily long training times and data issues that were, at best, orthogonal to the overall purpose of highlighting classification techniques and encouraging good insights about how they work.

The best thing about the course was the willingness to allow various technologies to be used. The developers went to some lengths to make this possible. It was far more work to stray outside the velvet ropes of the Jupyter notebooks, but it was very rewarding.

Finally, the quizzes were dependent on numerical point answers that could often be matched only by using the exact same technology and somewhat sloppy approaches (no lowercasing for word sentiment analysis, etc.). It does take some cleverness to think of questions that lead to the right answer only if the concepts are implemented properly; it doesn’t count when the answers rely precisely on anomalies.

I learned a lot, but only because I wrote my own code and was able to think more clearly about it, and that was somewhat of a side effect. All in all, a disappointing, somewhat out-of-date class.
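For reference on the reviewer's point about log likelihood (a standard identity, not something stated on this page): for binary labels $y_i \in \{0, 1\}$ and predicted probabilities $\hat{p}_i = P(y_i = 1 \mid x_i, w)$, the negative log likelihood of logistic regression is

$$-\ell(w) = -\sum_{i=1}^{N} \left[ y_i \log \hat{p}_i + (1 - y_i) \log (1 - \hat{p}_i) \right],$$

which is, term for term, the binary cross-entropy cost; maximizing the likelihood and minimizing the cross-entropy are the same optimization.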
Yufeng X –
The lectures are super. The exams could be more challenging :)
Aakash S –
Amazing Explanation of every thing related to Classification. Thanks a lot for the course.
Thuc D X –
Sometimes the assignment descriptions were hard to follow. Overall, the course equipped me with a good understanding and the practical skills to tackle classification tasks.
Gaurav B –
The explanation is not good; I had to take help from other courses.
Jafed E –
I enjoy the lectures. The professor has a good speaking and teaching style which keeps me interested. Lots of concrete math examples which make it easier to understand. Very good slides which are well formulated and easy to understand
Naman M –
You can't find a better course on machine learning than this one. Simply the best course on Coursera.
Shaik R –
Best machine learning classification course by far. Each aspect is explained in detail, but forum responses can be improved. Great course for machine learning beginners. Loved it.
Yacine M T –
Very helpful. Thank you
Fan J –
Good content, helped me a lot!
Kevin –
Great course for beginner to intermediate data science enthusiasts! This course teaches you how to implement logistic regression, decision trees, the AdaBoost algorithm, and a stochastic approach from scratch! There are also some assignments to learn how to implement those algorithms in our preferred library. Would be great if Carlos & Emily could bring out another advanced machine learning course!
Naveendhar –
The last portion was a little difficult to relate to why we made this move toward large datasets in the first place; I had to keep coming back to the fact that I am going to be handling large datasets. Liked the use cases: simple and effective. The quizzes were simple, and the graph questions were really helpful in gauging my understanding of the math behind these models.
Hanna L –
Great class!
RISHI P M –
Good
Muhammad W K –
A great course. Starting from very simple and easy-to-understand concepts of classification, it takes us through very important grass-roots concepts and algorithms necessary not only for classification but for a better general understanding of machine learning too, like precision and recall, boosting, scalability and online machine learning.
VIGNESHKUMAR R –
good
Gareth J –
A good course to teach the key points.
Muhammad Z H –
I have learned a lot
RAJKUMAR R V –
It will definitely help you in understanding most of the algorithms, from the basics to the depths. Even if you are already aware of most of the things covered elsewhere related to classification, this course will add a considerable amount of extra input which will help you understand and explore more things in machine learning.
Parab N S –
Excellent course on Classification by University of Washington
AJAY K –
Excellent tutorials
Shrikrishna S P –
The course is very well structured. It starts from the basic classifiers, further moving on to more complex ones. The instructors teach how to implement each mentioned algorithm from scratch, this really makes the course above par. I loved the course and it helped me to become a good machine learning practitioner. Thanks Emily and Carlos.
Neemesh J –
Awesome learning experience.
Nitin K M –
The course is perfect for people who want to gain in-depth knowledge of classification algorithms, but the exercise descriptions are vague. I had trouble understanding the flow of the assignments. Also, bagging and gradient boosting techniques were not covered under ensembles. Overall, the course is awesome.
Francesco –
The material is good, but the choice of using GraphLab Create is a poor one. It's not used in the industry and it's poorly supported. I had issues installing it both via the command line and via the installer, so I ended up using the AWS machine. But that has its own drawbacks, such as the slowness and the setup time.
AMARTHALURU N K –
good
Michael O T –
A great professor and a lot of knowledge about machine learning classification
Germanno T –
Excellent Course!
Nick S –
The course itself is well structured and introduces the complexity gradually. Unfortunately, the exercises require the use of a specific library instead of scikit-learn and numpy. Furthermore, they also require Python 2, while Python 3 is now widely used.
Jane z –
The hands-on approach is excellent. Not only did I learn ML / classification, I was able to practice Python skills and statistical skills as well. THANK YOU!
Emil K –
Such a great course. Brings the math behind machine learning to users without a math background. Thank you.
Aparna g –
Very good concepts
Mr. J –
spectacular
Cosmos D I –
This course is very informational!