ksddescent.ksdd_gradient
- ksddescent.ksdd_gradient(x0, score, step, kernel='gaussian', max_iter=1000, bw=1, store=False, verbose=False, clamp=None, beta=0.2)
Kernel Stein Discrepancy descent with gradient descent
Perform Kernel Stein Discrepancy descent using plain gradient descent. Since gradient descent is used, a step size must be specified (see the usage sketch after the Returns section).
- Parameters
- x0 : torch.tensor, size n_samples x n_features
initial positions
- score : callable
function that computes the score
- step : float
step size
- max_iter : int
maximum number of iterations
- bw : float
bandwidth of the Stein kernel
- store : None or list of ints
whether to store the iterates at the indices in the list
- verbose : bool
whether to print the current loss
- clamp : None or tuple of floats
if not None, should be a tuple (a, b); the points x are then constrained to stay in [a, b]
- Returns
- x : torch.tensor, size n_samples x n_features
The final positions
- loss_list : list of floats
List of the loss values during iterations
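Examples
A minimal usage sketch (not taken from the library's documentation), assuming a standard Gaussian target whose score is grad log p(x) = -x; the particle count, dimension, step size, and iteration budget are illustrative choices.
>>> import torch
>>> from ksddescent import ksdd_gradient
>>> def score(x):
...     return -x  # score of a standard Gaussian target: grad log p(x) = -x
...
>>> x0 = torch.randn(50, 2)  # 50 illustrative starting particles in 2 dimensions
>>> x_final = ksdd_gradient(x0, score, step=0.1, max_iter=500, bw=1.0)  # final positions, see Returns above
Per the clamp parameter described above, passing clamp=(a, b) would additionally constrain the particles to stay in [a, b].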
References
A. Korba, P.-C. Aubin-Frankowski, S. Majewski, P. Ablin. Kernel Stein Discrepancy Descent. International Conference on Machine Learning, 2021.