NaN after softmax

Published 2019-04

```python
>>> import torch
>>> import torch.nn.functional as F
>>> F.softmax(torch.Tensor([0, float('-inf')]), -1)
tensor([ 1.0000, 0.0000])
>>> F.softmax(torch.Tensor([0, float('inf')]), -1)  # should give [0.0, 1.0]
tensor([ nan, nan])
>>> F.log_softmax(torch.Tensor([0, float('-inf')]), -1)
tensor([ 0.0000, -inf])
>>> F.log_softmax(torch.Tensor([0, float('inf')]), -1)
tensor([ nan, nan])
>>> F.softmax(torch.Tensor([float('-inf'), 0, float('-inf')]), -1)
tensor([ 0.0000, 1.0000, 0.0000])
>>> F.softmax(torch.Tensor([0, float('inf'), 0]), -1)  # should give [0.0, 1.0, 0.0]
tensor([ nan, nan, nan])
>>> F.softmax(torch.Tensor([float('-inf'), 0, float('inf')]), -1)  # should give [0.0, 0.0, 1.0]
tensor([ nan, nan, nan])
```

The `-inf` cases behave as expected (they simply get probability 0), but a single `+inf` entry poisons the entire row. The numerically stable softmax subtracts the row maximum before exponentiating, `exp(x - max(x)) / sum(exp(x - max(x)))`, so when `max(x)` is `inf`, the `+inf` entry itself evaluates `inf - inf = nan`, and the `nan` then propagates through the sum to every output.
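One way to work around this is to keep `+inf` out of the logits before calling softmax. The sketch below (a hypothetical helper, not from the original post, and `safe_softmax` is a name I made up) clamps `+inf` down to the largest finite value of the dtype, so the stable-softmax subtraction never computes `inf - inf`:

```python
import torch
import torch.nn.functional as F

def safe_softmax(logits: torch.Tensor, dim: int = -1) -> torch.Tensor:
    # Replace +inf with the largest finite value of the dtype.
    # -inf is harmless (it just yields probability 0), so it is left alone.
    finite_max = torch.finfo(logits.dtype).max
    clamped = logits.clamp(max=finite_max)
    return F.softmax(clamped, dim)

print(safe_softmax(torch.tensor([0.0, float('inf')])))       # tensor([0., 1.])
print(safe_softmax(torch.tensor([0.0, float('inf'), 0.0])))  # tensor([0., 1., 0.])
```

Note that if several entries are `+inf`, they all get clamped to the same finite value and split the probability mass evenly among themselves, which is arguably the sensible limit anyway.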