softmax probability clipping at the logit level
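The snippet below clips softmax probabilities from below at a small epsilon while staying on the logit scale: it computes a numerically stable softmax, clips each probability at epsilon, and maps the clipped probabilities back to logits by inverting the softmax transformation (re-adding the log normalization constant and the subtracted maximum).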
import numpy

s = numpy.random.randn(10)

def clip_probability_logits(logit, epsilon=0.):
    # numerically stable softmax: subtract the maximum logit first
    s_max = numpy.max(logit)
    s_transf = logit - s_max
    normalization_const = numpy.exp(s_transf).sum()
    p_transf = numpy.exp(s_transf) / normalization_const
    # clip each probability from below at epsilon
    p_transf = numpy.maximum(epsilon, p_transf)
    # invert the softmax transformation to return to the logit scale
    s_back = numpy.log(p_transf) + numpy.log(normalization_const) + s_max
    return s_back

# with epsilon=0 the round trip recovers s exactly, so this measures
# how far the logits move when the probabilities are clipped at 1e-3
print(numpy.abs(clip_probability_logits(s, epsilon=1e-3) - s).sum())
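As a sanity check (not part of the original gist), one can verify that taking the softmax of the returned logits reproduces the epsilon-clipped, renormalized distribution: the additive log(Z) and max(s) terms cancel inside the softmax. A minimal sketch, assuming clip_probability_logits is defined as above:

import numpy

def softmax(x):
    # numerically stable softmax
    e = numpy.exp(x - numpy.max(x))
    return e / e.sum()

s = numpy.random.randn(10)
eps = 1e-3

# clip the probabilities directly, then renormalize
p_clipped = numpy.maximum(eps, softmax(s))
p_clipped = p_clipped / p_clipped.sum()

# softmax of the clipped logits matches the clipped, renormalized probabilities
print(numpy.abs(softmax(clip_probability_logits(s, epsilon=eps)) - p_clipped).sum())

The printed difference should be on the order of floating-point rounding error.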