Add epsilon to prevent NaNs with tiny attention weights
Bjarke Felbo committed Oct 18, 2017
1 parent ac27e25 commit 1aadc4f
Showing 1 changed file with 1 addition and 1 deletion.
deepmoji/attlayer.py

@@ -44,7 +44,7 @@ def call(self, x, mask=None):
         if mask is not None:
             mask = K.cast(mask, K.floatx())
             ai = ai * mask
-        att_weights = ai / K.sum(ai, axis=1, keepdims=True)
+        att_weights = ai / (K.sum(ai, axis=1, keepdims=True) + K.epsilon())
         weighted_input = x * K.expand_dims(att_weights)
         result = K.sum(weighted_input, axis=1)
         if self.return_attention:
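
Context for the fix (not part of the commit itself): `ai` holds exponentiated attention logits, and when every timestep of a sample is masked out (or the scores underflow to zero), `K.sum(ai, axis=1)` is zero, so the division produces NaN for that entire row. Adding `K.epsilon()` to the denominator keeps the result finite. Below is a minimal NumPy sketch of the failure mode; `EPSILON` and `attention_weights` are illustrative stand-ins, with `EPSILON` playing the role of Keras' `K.epsilon()` (1e-7 by default):

import numpy as np

EPSILON = 1e-7  # stands in for K.epsilon(), Keras' default fuzz factor

def attention_weights(ai, eps=0.0):
    # Normalize the (already exponentiated) scores over the timestep axis,
    # mirroring the att_weights line in attlayer.py.
    return ai / (np.sum(ai, axis=1, keepdims=True) + eps)

# One sample whose every timestep was masked out: the scores sum to zero.
ai = np.array([[0.0, 0.0, 0.0]])

print(attention_weights(ai))           # [[nan nan nan]] -- 0/0 before the fix
print(attention_weights(ai, EPSILON))  # [[0. 0. 0.]]    -- finite after the fix

The same epsilon trick applies to any softmax-style normalization where masking can zero out an entire row of scores.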
