
[BUG] KeyError: 'params/eta' #1244

Open

clearhanhui opened this issue Oct 13, 2023 · 4 comments

clearhanhui commented Oct 13, 2023

Bug details

I am running HPO for XGBoost with Ray Tune and BlendSearch.
At flaml/tune/searcher/search_thread.py#L66, the config in my case is

{
'num_boost_round': 10, 
'params': {'max_depth': 12, 'eta': 0.020168455186106736, 'min_child_weight': 1.4504723523894132, 'scale_pos_weight': 3.794258636185337, 'gamma': 0.4985070123025904}
}

and the self._const is

{
'params': {'verbosity': 3, 'booster': 'gbtree', 'eval_metric': 'auc', 'tree_method': 'hist', 'objective': 'binary:logistic'}
}

After the update step, I get

{
'num_boost_round': 10, 
'params': {'verbosity': 3, 'booster': 'gbtree', 'eval_metric': 'auc', 'tree_method': 'hist', 'objective': 'binary:logistic'}
}

All of the values in config['params'] that were sampled from the search space are dropped.
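For illustration, here is a standalone sketch (independent of FLAML, with trimmed values) of why this happens: dict.update() replaces the nested 'params' dict wholesale instead of merging into it.

# Minimal standalone reproduction of the merge behaviour (illustrative values).
config = {
    "num_boost_round": 10,
    "params": {"max_depth": 12, "eta": 0.02},  # sampled hyperparameters
}
const = {
    "params": {"booster": "gbtree", "eval_metric": "auc"},  # constants
}

config.update(const)
print(config["params"])
# {'booster': 'gbtree', 'eval_metric': 'auc'} -> 'eta' and 'max_depth' are gone,
# which is why the flattened key 'params/eta' later raises a KeyError.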

How to solve

I solved it by recursively updating config. Here is an example:

def recursive_update(d: dict, u: dict):
    """Recursively merge `u` into `d` in place.

    Args:
        d (dict): The target dictionary to be updated.
        u (dict): A dictionary containing values to be merged into `d`.
    """
    for k, v in u.items():
        if isinstance(v, dict) and k in d and isinstance(d[k], dict):
            recursive_update(d[k], v)
        else:
            d[k] = v

Replacing config.update(self._const) with recursive_update(config, self._const), I then get:

{
'num_boost_round': 10, 
'params': {'max_depth': 12, 'eta': 0.020168455186106736, 'min_child_weight': 1.4504723523894132, 'scale_pos_weight': 3.794258636185337, 'gamma': 0.4985070123025904, 'verbosity': 3, 'booster': 'gbtree', 'eval_metric': 'auc', 'tree_method': 'hist', 'objective': 'binary:logistic'}
}
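For completeness, a small usage sketch of the helper with trimmed versions of the dictionaries above:

config = {
    "num_boost_round": 10,
    "params": {"max_depth": 12, "eta": 0.020168455186106736},
}
const = {
    "params": {"booster": "gbtree", "eval_metric": "auc"},
}

recursive_update(config, const)  # merges in place instead of replacing 'params'
assert config["params"]["eta"] == 0.020168455186106736  # sampled value survives
assert config["params"]["booster"] == "gbtree"          # constant merged in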

My Traceback

Traceback (most recent call last):
  File "xgb_main.py", line 73, in <module>
    val_set=val_set, val_set_params=val_set_params)
  File "/home/ray/code-repo/dml-mljobs/xgb/utils.py", line 27, in wrapper
    v = func(*args, **kwargs)
  File "/home/ray/code-repo/dml-mljobs/xgb/tune.py", line 131, in fit
    result_grid = tuner.fit()
  File "/usr/local/python3/lib/python3.7/site-packages/ray/tune/tuner.py", line 292, in fit
    return self._local_tuner.fit()
  File "/usr/local/python3/lib/python3.7/site-packages/ray/tune/impl/tuner_internal.py", line 455, in fit
    analysis = self._fit_internal(trainable, param_space)
  File "/usr/local/python3/lib/python3.7/site-packages/ray/tune/impl/tuner_internal.py", line 573, in _fit_internal
    **args,
  File "/usr/local/python3/lib/python3.7/site-packages/ray/tune/tune.py", line 756, in run
    runner.step()
  File "/usr/local/python3/lib/python3.7/site-packages/ray/tune/execution/trial_runner.py", line 953, in step
    next_trial = self._update_trial_queue_and_get_next_trial()
  File "/usr/local/python3/lib/python3.7/site-packages/ray/tune/execution/trial_runner.py", line 889, in _update_trial_queue_and_get_next_trial
    if not self._update_trial_queue(blocking=wait_for_trial):
  File "/usr/local/python3/lib/python3.7/site-packages/ray/tune/execution/trial_runner.py", line 1475, in _update_trial_queue
    trial = self._search_alg.next_trial()
  File "/usr/local/python3/lib/python3.7/site-packages/ray/tune/search/search_generator.py", line 101, in next_trial
    self._experiment.spec, self._experiment.dir_name
  File "/usr/local/python3/lib/python3.7/site-packages/ray/tune/search/search_generator.py", line 110, in create_trial_if_possible
    suggested_config = self.searcher.suggest(trial_id)
  File "/usr/local/python3/lib/python3.7/site-packages/ray/tune/search/concurrency_limiter.py", line 108, in suggest
    suggestion = self.searcher.suggest(trial_id)
  File "/usr/local/python3/lib/python3.7/site-packages/flaml/tune/searcher/blendsearch.py", line 691, in suggest
    skip = self._should_skip(choice, trial_id, config, space)
  File "/usr/local/python3/lib/python3.7/site-packages/flaml/tune/searcher/blendsearch.py", line 823, in _should_skip
    config_signature = self._ls.config_signature(config, space)
  File "/usr/local/python3/lib/python3.7/site-packages/flaml/tune/searcher/flow2.py", line 635, in config_signature
    value = config[key]
KeyError: 'params/eta'
sonichi (Collaborator) commented Oct 14, 2023

Thanks. Would you like to create a PR?

clearhanhui (Author) replied

> Thanks. Would you like to create a PR?

Yes. #1246


yxtay commented Mar 15, 2024

One way I worked around this bug is to ensure that space/param_space contains only hyperparameters defined with the tune search space API, and to remove any constants. I am using the Ray Train TorchTrainer, so I moved the constants there instead.

If you are using the function trainable API, consider splitting the constants out of config into separate arguments and using tune.with_parameters() (see the sketch below).

I believe this bug happens when trying to merge the constants with the sampled hyperparameters in config.
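For example, here is a rough sketch of that workaround with a function trainable. The names train_xgb, const_params, dtrain and dval are placeholders (dtrain/dval are assumed to be pre-built xgb.DMatrix objects), and the BlendSearch/TuneConfig setup is omitted for brevity; the point is only that param_space holds the tuned hyperparameters while the constants travel through tune.with_parameters().

from ray import tune
import xgboost as xgb

def train_xgb(config, const_params=None, dtrain=None, dval=None):
    # `config` holds only the sampled hyperparameters; the constants arrive
    # through tune.with_parameters(), so the searcher never has to merge them.
    params = {**const_params, **config}
    evals_result = {}
    xgb.train(params, dtrain, num_boost_round=10,
              evals=[(dval, "val")], evals_result=evals_result)
    tune.report(auc=evals_result["val"]["auc"][-1])  # reporting API differs across Ray versions

const_params = {"objective": "binary:logistic", "eval_metric": "auc",
                "booster": "gbtree", "tree_method": "hist"}

tuner = tune.Tuner(
    tune.with_parameters(train_xgb, const_params=const_params,
                         dtrain=dtrain, dval=dval),
    param_space={  # only tuned hyperparameters, no constants
        "eta": tune.loguniform(1e-3, 1e-1),
        "max_depth": tune.randint(3, 12),
    },
)
results = tuner.fit()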

thinkall (Collaborator) commented Mar 15, 2024

> One way I worked around this bug is to ensure that space/param_space only contains hyperparameters defined with tune search space and remove any constants. […]

Thank you @yxtay.
