
Scope of eval_metric in binary classification with sparkxgb #25

Open
mzorko opened this issue Nov 13, 2019 · 0 comments

mzorko commented Nov 13, 2019

Hello, the documentation states:

# eval_metric	
# Evaluation metrics for validation data, a default metric will be assigned according to objective 
# (rmse for regression, and error for classification, mean average precision for ranking). 
# options: rmse, mae, logloss, error, merror, mlogloss, auc, aucpr, ndcg, map, gamma-deviance

# maximize_evaluation_metrics	
# Whether to maximize evaluation metrics. Defaults to FALSE (for minimization).

When I try eval_metric = "auc", eval_metric = "aucpr", or eval_metric = "error", the code below never completes. Do I need to set another parameter, or is this option simply not supported at the moment? (eval_metric = "merror" and eval_metric = "mlogloss" both work.)

library(sparkxgb)
library(sparklyr)

sc <- spark_connect(master = "local")

my_data <- data.frame(x = 1:10, y = 11:20, target = sample(c("a", "b"), 10, TRUE)) %>% 
  sdf_copy_to(sc, .)

xgb_model <- xgboost_classifier(
  my_data, 
  target ~ .,
  num_class = 2,
  num_round = 5, 
  max_depth = 4,
  eval_metric = "auc",            # never completes; "merror" and "mlogloss" work
  maximize_evaluation_metrics = TRUE
)

It looks like the problem is specific to binary classification.
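
For reference, this is the variant I would expect to work for the binary metrics, assuming xgboost_classifier() passes an objective parameter through to XGBoost4J-Spark (I have not confirmed this): auc, aucpr, and error are binary metrics, so they presumably need a binary objective, while num_class = 2 implies the multiclass setup where only merror and mlogloss apply. This is an untested guess, not a confirmed fix.

library(sparkxgb)
library(sparklyr)

sc <- spark_connect(master = "local")

my_data <- data.frame(x = 1:10, y = 11:20, target = sample(c("a", "b"), 10, TRUE)) %>% 
  sdf_copy_to(sc, .)

# Untested guess: use a binary objective and drop num_class so the binary
# metrics (auc, aucpr, error) match the objective.
xgb_model <- xgboost_classifier(
  my_data, 
  target ~ .,
  objective = "binary:logistic",  # assumption: this parameter is exposed by sparkxgb
  num_round = 5, 
  max_depth = 4,
  eval_metric = "auc",
  maximize_evaluation_metrics = TRUE
)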

Where can I find out more about the scope of all the parameters supported by sparkxgb?
What is the best place to start if I want to help with issues in sparklyr extensions?
