Calculate hypervolume in HSSP using sum of contributions #6130


Open: wants to merge 2 commits into master

Conversation

@not522 (Member) commented Jun 5, 2025

Motivation

The hypervolume determined by the selected points can be calculated as the sum of their contributions. This allows skipping the initial hypervolume calculation in each iteration of HSSP.
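As a toy illustration of the idea (this is not Optuna's implementation; the `hv2d` helper and the point set are hypothetical), a 2D minimization sketch: in each greedy HSSP step, the marginal hypervolume gain of the chosen point is its contribution, and these gains telescope to the hypervolume of the selected subset, so no separate hypervolume calculation is needed.

```python
# Toy 2D sketch (minimization): greedy HSSP marginal gains sum to the
# hypervolume of the selected subset. hv2d and the data are hypothetical.

def hv2d(points, ref):
    """Hypervolume (dominated area) of 2D minimization points w.r.t. ref."""
    hv, prev_y = 0.0, ref[1]
    for x, y in sorted(points):  # sweep in ascending first objective
        if y < prev_y:  # skip dominated points
            hv += (ref[0] - x) * (prev_y - y)
            prev_y = y
    return hv


points = [(1.0, 4.0), (2.0, 2.0), (4.0, 1.0), (3.0, 3.0)]
ref = (5.0, 5.0)

selected, gains = [], []
remaining = list(points)
for _ in range(3):  # greedily pick a 3-point subset
    best = max(remaining, key=lambda p: hv2d(selected + [p], ref))
    gains.append(hv2d(selected + [best], ref) - hv2d(selected, ref))
    selected.append(best)
    remaining.remove(best)

# The selected subset's hypervolume equals the sum of the recorded gains,
# so the per-iteration hypervolume recomputation can be skipped.
assert abs(hv2d(selected, ref) - sum(gains)) < 1e-12
```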

Description of the changes

  • Changed HSSP to calculate the hypervolume as the sum of contributions

Benchmark

  • master: 26.196007 sec
  • PR: 25.605701 sec

Benchmark script:
import optuna


def objective(trial: optuna.Trial) -> tuple[float, float, float]:
    x = trial.suggest_float("x", -5, 5)
    y = trial.suggest_float("y", -5, 5)
    return x**2 + y**2, (x - 2)**2 + (y - 2)**2, (x + 2)**2 + (y + 2)**2


sampler = optuna.samplers.TPESampler(seed=42)
study = optuna.create_study(sampler=sampler, directions=["minimize"]*3)
study.optimize(objective, n_trials=1000)
trials = study.trials
print((trials[-1].datetime_complete - trials[0].datetime_start).total_seconds())

@not522 added the enhancement label (change that does not break compatibility and does not affect public interfaces, but improves performance) on Jun 5, 2025
@nabenabe0928 (Contributor) left a comment


Could you change this to np.maximum(0.0, contribs) so that no contribs value ends up negative, and add an inline note about this?
Such a situation can happen due to numerical instability.

return contribs

@nabenabe0928 (Contributor) commented

@sawa3030 Could you check this PR?

@not522 (Member, Author) commented Jun 6, 2025

> Could you change here to np.maximum(0.0, contribs) so that any contribs will not be lower than max_contrib and add an inline note about this? Such a situation can happen due to numerical instability.
>
> return contribs

It's OK, but does that situation cause any errors?

@nabenabe0928 (Contributor) commented

>> Could you change here to np.maximum(0.0, contribs) so that any contribs will not be lower than max_contrib and add an inline note about this? Such a situation can happen due to numerical instability.
>>
>> return contribs
>
> It's OK, but does that situation cause any errors?

I am not sure how likely it is, but it may trigger some unwanted behavior.
For example, if all(contribs < 0) becomes True at some point, contribs will no longer be updated because of the check if contribs[i] < max_contrib: continue.
Yet there should (or can) still be a ranking among the contribs.
Without np.maximum, we may lose that information, and end up picking indices based on the contribs from right before all(contribs < 0) held.
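A minimal sketch of the suggested clamp (only the names from this discussion; the surrounding HSSP loop is not reproduced, and the contribs values are made up):

```python
import numpy as np

# Exclusive contributions can come out slightly negative due to
# floating-point error in the inclusion-exclusion arithmetic.
contribs = np.array([2.0e-16, -3.0e-17, 1.5e-1])

# Suggested fix: clamp to zero so a spurious all(contribs < 0) state,
# which would freeze further updates via the
# `if contribs[i] < max_contrib: continue` check, cannot occur.
contribs = np.maximum(0.0, contribs)
```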
