Hello, so the documentation states:

```r
# eval_metric
#   Evaluation metrics for validation data. A default metric will be assigned
#   according to the objective (rmse for regression, error for classification,
#   mean average precision for ranking).
#   Options: rmse, mae, logloss, error, merror, mlogloss, auc, aucpr, ndcg,
#   map, gamma-deviance
# maximize_evaluation_metrics
#   Whether to maximize evaluation metrics. Defaults to FALSE (for minimization).
```

When I try `eval_metric = "auc"`, `eval_metric = "aucpr"`, or `eval_metric = "error"`, the code below never completes. Do I have to set another parameter, or is this option not supported at the moment? (`eval_metric = "merror"` and `eval_metric = "mlogloss"` work.)

It looks like this is a problem with binary classification metrics.

Where can I find out more about the scope of all of the parameters for sparkxgb?

What is the best place to start if I want to help with problems regarding sparklyr extensions?
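For reference, a minimal sketch of the kind of call I mean, assuming the `sparkxgb::xgboost_classifier()` interface and a working Spark connection (the dataset and connection details here are illustrative, not from the report):

```r
library(sparklyr)
library(sparkxgb)

# Illustrative local connection; any working Spark connection would do.
sc <- spark_connect(master = "local")

iris_tbl <- copy_to(sc, iris, overwrite = TRUE)

# Multiclass metrics such as "merror" and "mlogloss" complete fine, but
# swapping in "auc", "aucpr", or "error" makes the fit hang indefinitely.
model <- xgboost_classifier(
  iris_tbl,
  Species ~ .,
  eval_metric = "mlogloss"  # "auc" / "aucpr" / "error" never complete
)
```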