
[ML] {ud120} Lesson 4: Decision Trees

2019-06-07



Linearly Separable Data

Some datasets can be separated by a single straight line; many cannot. Decision trees let you classify non-linearly-separable data by asking a series of simple, linear questions.

Multiple Linear Questions

Each node of the tree asks one linear question about a single feature. Chaining these questions carves the feature space into rectangular regions, which together can describe a complex, non-linear decision boundary.

Constructing a Decision Tree: First Split

To build the tree by hand, pick the first split at the feature threshold that best separates the classes, then repeat the process inside each resulting region until the leaves are pure.

Coding a Decision Tree

The sklearn workflow is the same as in the earlier Naive Bayes and SVM lessons: import the classifier, fit it on training features and labels, then call predict on new points.

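A minimal sketch of that pattern, using placeholder toy data rather than the course's dataset:

    from sklearn.tree import DecisionTreeClassifier

    # Placeholder toy data: the class depends only on the first feature.
    features_train = [[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]]
    labels_train = [0, 0, 1, 1]

    clf = DecisionTreeClassifier()
    clf.fit(features_train, labels_train)  # learn the splits from the labeled examples
    print(clf.predict([[0.9, 0.2]]))       # -> [1]: points with x0 > 0.5 land in class 1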
Decision Tree Parameters

min_samples_split controls whether a node can keep splitting: a node with fewer samples than this value becomes a leaf. The sklearn default of 2 lets the tree grow very deep and tends to overfit; larger values produce a simpler, smoother boundary.

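A sketch of that comparison; 2 and 50 are the values I recall the lesson's quiz using, and the accuracy difference depends on the dataset, which is not shown here:

    from sklearn.tree import DecisionTreeClassifier

    clf_2 = DecisionTreeClassifier(min_samples_split=2)    # default: splits down to pairs of samples
    clf_50 = DecisionTreeClassifier(min_samples_split=50)  # nodes with fewer than 50 samples become leaves

    # With real training data, fit both and compare test accuracy;
    # the deeper clf_2 typically overfits more than clf_50:
    # clf_2.fit(features_train, labels_train)
    # clf_50.fit(features_train, labels_train)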
Data Impurity and Entropy

Entropy measures how impure a set of examples is: it is 0 when all examples in a node share one class, and maximal (1.0 for two equally likely classes) for a 50/50 mixture. A decision tree picks splits that make the resulting subsets as pure as possible.

Formula of Entropy

There is an error in the entropy formula written on this slide: there should be a negative (-) sign preceding the sum. The corrected formula is:

Entropy = -\sum_i p_i \log_2(p_i)

where p_i is the fraction of examples in class i.

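A small helper implementing that formula, written from scratch for illustration; the slow/fast labels echo the lesson's speed example:

    import math

    def entropy(labels):
        # Entropy of a list of class labels: -sum(p_i * log2(p_i)).
        n = len(labels)
        ent = 0.0
        for cls in set(labels):
            p = labels.count(cls) / n
            ent -= p * math.log2(p)
        return ent

    print(entropy(["slow", "slow", "fast", "fast"]))  # 1.0: a 50/50 node is maximally impure
    print(entropy(["slow", "slow", "slow", "slow"]))  # 0.0: a pure node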
The result on this slide is IG = 1. Information gain is the entropy of the parent node minus the weighted average entropy of the children after a split:

IG = Entropy(parent) - [weighted average] Entropy(children)

A gain of 1 is the best possible outcome for a two-class node: the parent is a 50/50 mixture (entropy 1) and both children come out completely pure (entropy 0). The tree-building algorithm chooses, at each node, the split that maximizes information gain.

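The same computation in code, reusing the entropy() helper above; the example mirrors the lesson's slow/fast split:

    def information_gain(parent_labels, children):
        # IG = entropy(parent) - weighted average entropy of the child nodes.
        n = len(parent_labels)
        weighted = sum(len(child) / n * entropy(child) for child in children)
        return entropy(parent_labels) - weighted

    parent = ["slow", "slow", "fast", "fast"]
    # A perfect split of a 50/50 parent yields the maximum gain:
    print(information_gain(parent, [["slow", "slow"], ["fast", "fast"]]))  # 1.0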
Tuning the Criterion Parameter

Gini impurity is another measure of node purity. scikit-learn's DecisionTreeClassifier accepts criterion="gini" (the default) or criterion="entropy"; the two usually produce similar trees, so it is worth trying both.

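Switching the criterion is a one-argument change; both values are real options of DecisionTreeClassifier:

    from sklearn.tree import DecisionTreeClassifier

    clf_gini = DecisionTreeClassifier(criterion="gini")        # the default
    clf_entropy = DecisionTreeClassifier(criterion="entropy")  # uses the entropy formula above
    # Fit and evaluate either one the usual way, e.g.:
    # clf_entropy.fit(features_train, labels_train)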
Decision Tree Mini-Project

 

In this project, we will again try to identify the authors in a body of emails, this time using a decision tree. The starter code is in decision_tree/dt_author_id.py.

Get the data for this mini-project from here.

Once again, you'll do the mini-project on your own computer and enter your answers in the web browser. You can find the instructions for the decision tree mini-project here.
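For reference, a hedged sketch of what the finished dt_author_id.py tends to look like; email_preprocess.preprocess() ships with the ud120 starter code, and min_samples_split=40 is the value I recall the mini-project asking for (check the starter code and instructions for the exact details):

    from time import time
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import accuracy_score

    # preprocess() returns TF-IDF features and author labels (Sara = 0, Chris = 1).
    from email_preprocess import preprocess
    features_train, features_test, labels_train, labels_test = preprocess()

    clf = DecisionTreeClassifier(min_samples_split=40)
    t0 = time()
    clf.fit(features_train, labels_train)
    print("training time:", round(time() - t0, 3), "s")

    pred = clf.predict(features_test)
    print("accuracy:", accuracy_score(labels_test, pred))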

 

Source: https://www.cnblogs.com/ecoflex/p/10987754.html
