@kyunghyuncho
Created August 9, 2023 21:27
softmax probability clipping at the logit level

Clip softmax probabilities to a floor of epsilon, but express the result back on the logit scale: convert the logits to probabilities, clip from below, and invert the softmax by adding back the log normalization constant and the subtracted maximum. With epsilon=0 the round trip reproduces the original logits exactly.
import numpy

s = numpy.random.randn(10)

def clip_probability_logits(logit, epsilon=0.):
    # Stable softmax: subtract the max before exponentiating.
    s_max = numpy.max(logit)
    s_transf = logit - s_max
    normalization_const = numpy.exp(s_transf).sum()
    p_transf = numpy.exp(s_transf) / normalization_const
    # Clip the probabilities from below at epsilon.
    p_transf = numpy.maximum(epsilon, p_transf)
    # Undo the normalization and the max shift to return to the logit scale.
    s_back = numpy.log(p_transf) + numpy.log(normalization_const) + s_max
    return s_back
# Total absolute change in the logits induced by clipping at epsilon=1e-3.
print(numpy.abs(clip_probability_logits(s, epsilon=1e-3) - s).sum())
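
A minimal usage sketch, not part of the original gist: pushing the clipped logits back through a softmax shows the floor taking effect. The softmax helper and the example logits s_demo below are assumptions for illustration. Note the floor is only approximate, because clipping makes the probabilities sum to slightly more than one and the final softmax renormalizes them.

def softmax(logit):
    # Hypothetical helper for this demo, not part of the original gist.
    e = numpy.exp(logit - numpy.max(logit))
    return e / e.sum()

s_demo = numpy.array([10., 0., 0., 0.])  # one dominant logit, tiny tail probabilities
p_before = softmax(s_demo)
p_after = softmax(clip_probability_logits(s_demo, epsilon=1e-3))
print(p_before.min())  # ~4.5e-5, well below the floor
print(p_after.min())   # ~1e-3, the requested floor (up to renormalization)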