Week 2 (Getting my feet wet)
Back to work on Tuesday. My conference went well, and I made some great research contacts.
Day 1:
- My desktop finally arrived, so I had to set it up. Big task: I had grown so used to the UNIX ecosystem that I had actually forgotten how Windows feels.
- Nevertheless, I spent the day setting up the environment and finishing all the training tutorials.
Day 2:
- Started studying how the choice of loss function affects the computed gradient.
- Came across a loss function that is robust to outliers: the Huber loss. Would recommend it to everyone. (A small sketch comparing loss gradients follows this list.)
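
To make the comparison concrete, here is a minimal Python sketch (the project itself targets ECL, so this is purely illustrative) of the gradients of the squared, absolute, and Huber losses. The delta value and the toy data are my own choices, not from the project:

import numpy as np

def squared_error_gradient(y, pred):
    # d/dpred of 0.5 * (y - pred)^2  ->  -(y - pred)
    return -(y - pred)

def absolute_error_gradient(y, pred):
    # d/dpred of |y - pred|  ->  -sign(y - pred)
    return -np.sign(y - pred)

def huber_gradient(y, pred, delta=1.0):
    # Quadratic near zero residual, linear beyond delta, so an
    # outlier contributes a gradient of magnitude at most delta.
    residual = y - pred
    return np.where(np.abs(residual) <= delta,
                    -residual,
                    -delta * np.sign(residual))

y = np.array([1.0, 2.0, 100.0])    # last point is an outlier
pred = np.array([1.5, 2.5, 3.0])
print(squared_error_gradient(y, pred))   # [  0.5   0.5 -97. ]
print(huber_gradient(y, pred))           # [  0.5   0.5  -1. ]

With squared error the outlier's gradient of -97 dominates the update; Huber caps it at -1, which is exactly the robustness to outliers mentioned above.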
Day 3:
- Realized that regression is required for both the classification and regression variants of gradient boosting. For classification, the gradient is computed on the predicted probability of a class, which is a continuous quantity, so it must be fit with a regression method. Thus, regression methods power classification methods (see the sketch after this list).
- Regression trees are not implemented in ECL. I might have to implement one myself, and I am excited about the prospect.
- Thought of building a proof of concept using a linear-regression-based classifier.
- Set up stub methods for the Gradient Boosting Classifier.
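
Here is a rough Python sketch of that idea, assuming binary classification with log loss (the real work will be in ECL, and I use scikit-learn's regression tree purely as a stand-in for the one I may have to write). Each boosting round fits a regression tree to the pseudo-residuals, i.e. the difference between the labels and the current predicted probabilities:

import numpy as np
from sklearn.tree import DecisionTreeRegressor

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_gb_classifier(X, y, n_rounds=50, lr=0.1):
    # Start the raw score at the log-odds of the base rate.
    f = np.full(len(y), np.log(y.mean() / (1.0 - y.mean())))
    trees = []
    for _ in range(n_rounds):
        # Pseudo-residuals: the negative gradient of the log loss,
        # i.e. labels minus predicted probabilities.
        residuals = y - sigmoid(f)
        tree = DecisionTreeRegressor(max_depth=2).fit(X, residuals)
        f += lr * tree.predict(X)
        trees.append(tree)
    return f, trees

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(float)
f, trees = fit_gb_classifier(X, y)
print("training accuracy:", ((sigmoid(f) > 0.5) == y).mean())

Even though the final output is a class label, every learner in the loop is a regression tree, which is why a regression tree implementation is a prerequisite for the classifier.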
Day 4:
- Completed the stub methods as planned (a rough shape of the interface is sketched after this list).
- Committed the source code to GitHub.
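
For the record, this is roughly the shape of the stubs, translated into a Python skeleton; the method names and parameters here are my own illustration, not the committed ECL code:

class GradientBoostingClassifier:
    # Hypothetical Python analogue of the committed ECL stubs.

    def __init__(self, n_estimators=100, learning_rate=0.1):
        self.n_estimators = n_estimators
        self.learning_rate = learning_rate

    def fit(self, X, y):
        # TODO: iteratively fit regression learners to pseudo-residuals.
        raise NotImplementedError

    def predict_proba(self, X):
        # TODO: sum the learners' outputs and map through a sigmoid.
        raise NotImplementedError

    def predict(self, X):
        # TODO: threshold (or argmax over) predict_proba.
        raise NotImplementedError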