From the course: Cloud Hadoop: Scaling Apache Spark


Spark ML: Evaluating the model


- [Instructor] So continuing on, we got a result of only a little over 29% accuracy, and that's really not good enough for us to use this as a model. So do we have some capability we can use to evaluate other types of solutions to this problem? And we do. We're going to use a method called RegressionMetrics to get more insight into our model performance. Just of note, RegressionMetrics requires input formatted as tuples of doubles, where the first item is the prediction and the second item is the observation, in this case the count, how many markets. Once you have mapped these values from holdout, you can pass them directly to the RegressionMetrics constructor. The idea is that we're going to have this holding area so that we can see what it is we want to get out of this model, and then we're going to apply some new methods against it. So the key line of this is mapped equals…
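To make the evaluation step concrete: Spark's `RegressionMetrics` (in `pyspark.mllib.evaluation`) is constructed from an RDD of `(prediction, observation)` pairs of doubles and exposes summary statistics such as `rootMeanSquaredError`, `meanAbsoluteError`, and `r2`. As a sketch of what those properties compute, here is a plain-Python version over the same tuple layout (this helper and its data are illustrative, not part of the course code):

```python
import math

def regression_metrics(pairs):
    """Compute metrics analogous to Spark's RegressionMetrics
    over a list of (prediction, observation) tuples of doubles."""
    n = len(pairs)
    residuals = [pred - obs for pred, obs in pairs]
    mse = sum(r * r for r in residuals) / n          # mean squared error
    rmse = math.sqrt(mse)                            # root mean squared error
    mae = sum(abs(r) for r in residuals) / n         # mean absolute error
    mean_obs = sum(obs for _, obs in pairs) / n
    ss_tot = sum((obs - mean_obs) ** 2 for _, obs in pairs)
    ss_res = sum(r * r for r in residuals)
    r2 = 1.0 - ss_res / ss_tot                       # coefficient of determination
    return {"mse": mse, "rmse": rmse, "mae": mae, "r2": r2}

# Small illustrative sample: (prediction, observed count) pairs.
sample = [(3.0, 2.5), (0.5, 0.0), (2.0, 2.0), (8.0, 7.0)]
print(regression_metrics(sample))
```

In the Spark pipeline itself, the mapping step the instructor is about to show would look something like `mapped = holdout.map(lambda row: (float(row[0]), float(row[1])))` followed by `metrics = RegressionMetrics(mapped)` (the exact row field access depends on the holdout schema in the course notebook).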
