| No. | Title | Author | Date | Views |
| --- | --- | --- | --- | --- |
| 30 | 027. Logistic Regression: Calculating a Probability | 사상최강 | 2021-01-10 | 49 |
| 24 | 021. Representation: Cleaning Data | 사상최강 | 2020-11-08 | 54 |
| 21 | 018. Validation Set | 사상최강 | 2020-10-19 | 65 |
| 28 | 025. Regularization for Simplicity: L2 Regularization | 사상최강 | 2020-12-13 | 68 |
| 27 | 024. Feature Crosses | 사상최강 | 2020-12-07 | 75 |
| 20 | 017. Training Set and Test Set | 사상최강 | 2020-10-11 | 76 |
| 29 | 026. Regularization for Simplicity: Lambda (λ) | 사상최강 | 2020-12-20 | 80 |
| 26 | 023. Representation: Cleaning Data: Handling Missing Values | 사상최강 | 2020-11-23 | 86 |
| 23 | 020. Representation - Requirements for Creating Good Features | 사상최강 | 2020-11-01 | 92 |
| 19 | 016. Generalization | 사상최강 | 2020-10-04 | 92 |
| 22 | 019. Representation - Feature Engineering | 사상최강 | 2020-10-25 | 95 |
| 18 | 015. First Steps with TensorFlow 2.0 – Synthetic Features and Outliers | 사상최강 | 2020-09-20 | 107 |
| 11 | 008. Reducing Loss - Iterative Approach | 사상최강 | 2020-07-05 | 108 |
| 13 | 010. Reducing Loss - Learning Rate | 사상최강 | 2020-07-26 | 114 |
| 10 | 007. Loss Function (Linear Regression) | 사상최강 | 2020-07-05 | 134 |
| 31 | 028. Logistic Regression: Loss and Regularization | 사상최강 | 2021-01-17 | 137 |
| 16 | 013. Introduction to TensorFlow | 사상최강 | 2020-08-30 | 138 |
| 12 | 009. Reducing Loss - Gradient Descent | 사상최강 | 2020-07-12 | 141 |
| 25 | 022. Representation: Cleaning Data: Scaling (Supplement) | 사상최강 | 2020-11-15 | 144 |
| 7 | 005. Linear Regression | 사상최강 | 2020-06-21 | 148 |