embedding

Posted on 2018-04 (Wed)

>>> import torch
>>> import torch.nn as nn
>>> from torch.autograd import Variable
>>> # an Embedding module containing 10 tensors of size 3
>>> embedding = nn.Embedding(10, 3)
>>> # a batch of 2 samples of 4 indices each
>>> input = Variable(torch.LongTensor([[1,2,4,5],[4,3,2,9]]))
>>> embedding(input)
Variable containing:
(0 ,.,.) =
 -1.0822  1.2522  0.2434
  0.8393 -0.6062 -0.3348
  0.6597  0.0350  0.0837
  0.5521  0.9447  0.0498

(1 ,.,.) =
  0.6597  0.0350  0.0837
 -0.1527  0.0877  0.4260
  0.8393 -0.6062 -0.3348
 -0.8738 -0.9054  0.4281
[torch.FloatTensor of size 2x4x3]

>>> # example with padding_idx
>>> embedding = nn.Embedding(10, 3, padding_idx=0)
>>> input = Variable(torch.LongTensor([[0,2,0,5]]))
>>> embedding(input)
Variable containing:
(0 ,.,.) =
  0.0000  0.0000  0.0000
  0.3452  0.4937 -0.9361
  0.0000  0.0000  0.0000
  0.0706 -2.1962 -0.6276
[torch.FloatTensor of size 1x4x3]
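The transcript above uses the pre-0.4 `Variable` API. As a sketch of the same behaviour in current PyTorch (assuming a recent `torch` install), tensors are passed to the module directly, and the `padding_idx` row stays fixed at zeros:

```python
import torch
import torch.nn as nn

# An Embedding module mapping 10 indices to 3-dimensional vectors.
embedding = nn.Embedding(10, 3)

# A batch of 2 samples with 4 indices each; output shape is (2, 4, 3).
inp = torch.LongTensor([[1, 2, 4, 5], [4, 3, 2, 9]])
out = embedding(inp)
print(out.shape)  # torch.Size([2, 4, 3])

# With padding_idx=0, the embedding row for index 0 is all zeros
# and receives no gradient updates during training.
embedding_pad = nn.Embedding(10, 3, padding_idx=0)
out_pad = embedding_pad(torch.LongTensor([[0, 2, 0, 5]]))
print(out_pad[0, 0])  # a zero vector of length 3
```

The zero rows correspond to the positions where the input index equals `padding_idx`, which is why positions 0 and 2 of the second example are all zeros above.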