From the course: Machine Learning and AI Foundations: Value Estimations


Feature selection


- [Instructor] Let's open up feature_selection.py. In our house price model, if we include the 18 original features plus the new features that were created by one-hot encoding, we have a total of 63 features. Some of the features, like the size of the house in square feet, are probably really important to determining the value of the house. Other features, like whether the house has a fireplace, probably matter less when calculating the final price, but how much less? Maybe there are features that don't matter at all, and we can just remove them from our model. With a tree-based machine learning algorithm like gradient boosting, we can actually look at the trained model and have it tell us how often each feature is used in determining the final price. First, let's load up the model using joblib.load. If you don't have a trained house classifier .pkl file, just open up train_model.py and run it first to create one. Now we can get the importance of each feature from our trained…
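The workflow described above (load a saved gradient boosting model with joblib, then inspect how much each feature contributed) can be sketched as follows. This is a minimal, self-contained illustration: the synthetic data, model file name, and feature names here are placeholders, not the course's actual house-price files.

```python
# Sketch: reading feature importances from a trained gradient boosting model.
# All names (example_model.pkl, feature_0...) are illustrative assumptions.
import joblib
from sklearn.datasets import make_regression
from sklearn.ensemble import GradientBoostingRegressor

# Train a small model on synthetic data so the example runs on its own
# (the course instead loads a previously saved house-price model).
X, y = make_regression(n_samples=200, n_features=5, random_state=0)
model = GradientBoostingRegressor(random_state=0)
model.fit(X, y)

# Save and reload the model, mirroring the joblib.load step in the lesson.
joblib.dump(model, "example_model.pkl")
model = joblib.load("example_model.pkl")

# feature_importances_ gives each feature's relative contribution; the
# values sum to 1.0, so they can be read as percentages of importance.
feature_names = [f"feature_{i}" for i in range(X.shape[1])]
ranked = sorted(zip(feature_names, model.feature_importances_),
                key=lambda pair: pair[1], reverse=True)
for name, importance in ranked:
    print(f"{name}: {importance:.4f}")
```

Sorting by importance makes it easy to spot candidate features to drop: anything near the bottom of the list contributes almost nothing to the model's predictions.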
