Changing the metric used in the progress table when creating a model

User 2568 | 1/11/2016, 12:24:44 AM

When using `boosted_trees_classifier.create` (or any model creation), is it possible to use an evaluation metric other than accuracy in the progress table?

The documentation for `boosted_trees_classifier.create` states, in the section on `validation_set`, that "For each row of the progress table, the chosen metrics are computed for both the provided training dataset and the validation_set," which seems to imply I can choose the metric.

Comments

User 91 | 1/13/2016, 6:25:17 PM

Thanks for your comments! Unfortunately, this isn't possible right now, but this is a very useful feature and is something we definitely want to do in the near future.


User 2568 | 1/14/2016, 2:36:05 AM

Thanks. Please let the developers know this is a hot issue for me. Since I have to rerun the model once for each iteration count I want to evaluate, producing the plot takes at least 10x longer.

Here is an example


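As an aside, gradient-boosting libraries that expose per-stage predictions let you compute any metric across all iterations from a single fit, which avoids exactly this 10x retraining cost. A minimal sketch using scikit-learn's `staged_predict_proba` (a different library from GraphLab Create, used here only to illustrate the idea):

```python
# Compute a custom metric (log loss) at every boosting iteration from a
# single training run, using scikit-learn's staged predictions.
# This sketches the general technique, not GraphLab Create's API.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import log_loss
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

model = GradientBoostingClassifier(n_estimators=50, random_state=0)
model.fit(X_train, y_train)

# One pass over the staged predictions yields the metric for every
# iteration -- no need to refit once per iteration count.
val_loss = [log_loss(y_val, proba)
            for proba in model.staged_predict_proba(X_val)]

# Pick the iteration count that minimizes the validation metric.
best_iter = min(range(len(val_loss)), key=val_loss.__getitem__) + 1
```

The same single-fit trick works for any metric you can compute from predicted probabilities, so swapping accuracy for log loss or AUC is a one-line change in the list comprehension.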
User 3147 | 1/29/2016, 4:55:06 PM

Hi all! @srikris: This would be a really welcomed feature! Are you going to make it available through the "graphlab RandomForestClassifier" too or do you intend to appropriately generalize the "graphlab.evaluate" module? I also think that you should improve the corresponding documentation, not only as far as the provided functionality concerns but most probably with references and research papers to help us better understand the underlying algorithms which are used by the functions.