Tutorial on word2vector using GloVe and Word2Vec

2018-05-04 10:02:53

 

Some Important Reference Pages First:   


Glove Project Page:  

Word2Vec Project Page:  

Pre-trained word2vec model:  

Gensim Tutorial:  

 

=================================== 

=====    For the Glove                 

===================================

1. Download one of the pre-trained models from the GloVe project page and unzip the files. 

1. Wikipedia 2014 + Gigaword 5 (6B tokens, 400K vocab, uncased, 50d, 100d, 200d, & 300d vectors, 822 MB download): glove.6B.zip
2. Common Crawl (42B tokens, 1.9M vocab, uncased, 300d vectors, 1.75 GB download): glove.42B.300d.zip
3. Common Crawl (840B tokens, 2.2M vocab, cased, 300d vectors, 2.03 GB download): glove.840B.300d.zip

 

2. Install the needed packages: 

pickle, numpy, re, collections, bcolz
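  Of these, pickle, re, and collections ship with the Python standard library, so usually only numpy and bcolz need to be installed. Assuming an environment similar to the one used in this post, something like the following should work:

  sudo pip install numpy bcolz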

 

3. Run the following demo to test the results (extract the feature of a given single word). 

Code: 

import numpy as np
import re, pickle, collections, bcolz

# with open('./glove.840B.300d.txt', 'r', encoding="utf8") as f:
with open('./glove.6B.200d.txt', 'r') as f:
    lines = [line.split() for line in f]

print('==>> begin to load Glove pre-trained models.')
glove_words = [elem[0] for elem in lines]
glove_words_idx = {elem: idx for idx, elem in enumerate(glove_words)}    # i.e. glove_words_idx[elem] == idx
glove_vecs = np.stack([np.array(elem[1:], dtype=np.float32) for elem in lines])

print('==>> save into .pkl files.')
pickle.dump(glove_words, open('./glove.6B.200d.txt'+'_glove_words.pkl', 'wb'))
pickle.dump(glove_words_idx, open('./glove.6B.200d.txt'+'_glove_words_idx.pkl', 'wb'))

## save the vector matrix as an on-disk bcolz carray
def save_array(fname, arr):
    c = bcolz.carray(arr, rootdir=fname, mode='w')
    c.flush()

save_array('./glove.6B.200d.txt'+'_glove_vecs'+'.dat', glove_vecs)

def load_glove(loc):
    return (bcolz.open(loc+'_glove_vecs.dat')[:],
        pickle.load(open(loc+'_glove_words.pkl', 'rb')),
        pickle.load(open(loc+'_glove_words_idx.pkl', 'rb')))


###############################################
print('==>> Loading the glove.6B.200d.txt files.')
en_vecs, en_wv_word, en_wv_idx = load_glove('./glove.6B.200d.txt')
en_w2v = {w: en_vecs[en_wv_idx[w]] for w in en_wv_word}
n_en_vec, dim_en_vec = en_vecs.shape

print('==>> shown one demo: "King"')
demo_vector = en_w2v['king']
print(demo_vector)
print("==>> Done !")

Results: 

wangxiao@AHU$ python tutorial_Glove_word2vec.py
==>> begin to load Glove pre-trained models.
==>> save into .pkl files.
==>> Loading the glove.6B.200d.txt files.
==>> shown one demo: "King"
[-0.49346    -0.14768     0.32166001  0.056899    0.052572    0.20192
 -0.13506    -0.030793    0.15614    -0.23004    -0.66376001 -0.27316001
  ...                     (200-d vector, remaining values omitted)     ]
==>> Done !

 

4. Extract the feature of the whole sentence.  

  Given one sentence, such as "I Love Natural Language Processing", we can translate this sentence into a matrix representation. Specifically, we represent each word of the sentence with a vector which lies in a continuous space (as shown above). You can also see this blog:  to further understand this process. 

  But the following questions remain:

  1. Sentences vary in length. The dimension of each word vector is fixed, but the total length of the sentence is not, so what should we do?

  The only real option is padding. But how, exactly, should the padding be done?

For a CNN architecture the input (for each sentence) is usually a vector of embedded words [GloVe(w0), GloVe(w1), ..., GloVe(wN)] of fixed length N. So commonly you preset the maximum sentence length and just pad the tail of the input vector with zero vectors (just as you did). Marking the beginning and the end of the sentence is more relevant for the RNN, where the sentence length is expected to be variable and processing is done sequentially. Having said that, you can add a new dimension to the GloVe vector, setting it to zero for normal words, and arbitrarily to (say) 0.5 for BEGIN, 1 for END, and a random negative value for OOV. As for unknown words, you should ;) encounter them very rarely; otherwise you might consider training the embeddings yourself.

In other words: a CNN processes grid-like data, so the sentence has to be padded into a complete matrix before it can be processed; an RNN/LSTM can process the words sequentially, one time step at a time, so in principle no padding is needed. A minimal padding sketch is shown below.
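The following is only a small sketch of this padding idea (it is not from the original scripts): each word is mapped to its GloVe vector and the matrix is zero-padded to a fixed number of rows. It assumes the en_w2v dictionary and dim_en_vec computed in the demo above; the max_len value is arbitrary.

import numpy as np

def sentence_to_matrix(sentence, w2v, dim, max_len=10):
    """Map each word to its GloVe vector and zero-pad up to max_len rows."""
    words = sentence.lower().split()[:max_len]           # truncate overly long sentences
    mat = np.zeros((max_len, dim), dtype=np.float32)     # all-zero (max_len, dim) matrix
    for i, w in enumerate(words):
        if w in w2v:                                      # skip out-of-vocabulary words
            mat[i] = w2v[w]
    return mat

# e.g., with the objects loaded above:
# feat = sentence_to_matrix("I Love Natural Language Processing", en_w2v, dim_en_vec)
# feat.shape == (10, 200) for glove.6B.200d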

 

  2.  Here is another tutorial on Word2Vec using deep learning tools --- Keras.  

from numpy import array
from keras.preprocessing.text import one_hot
from keras.preprocessing.sequence import pad_sequences
from keras.models import Sequential
from keras.layers import Dense
from keras.layers import Flatten
from keras.layers.embeddings import Embedding

# define documents
docs = ['Well done!',
        'Good work',
        'Great effort',
        'nice work',
        'Excellent!',
        'Weak',
        'Poor effort!',
        'not good',
        'poor work',
        'Could have done better.']
# define class labels
labels = array([1,1,1,1,1,0,0,0,0,0])
# integer encode the documents
vocab_size = 50
encoded_docs = [one_hot(d, vocab_size) for d in docs]
print("==>> encoded_docs: ")
print(encoded_docs)
# pad documents to a max length of 4 words
max_length = 4
padded_docs = pad_sequences(encoded_docs, maxlen=max_length, padding='post')
print("==>> padded_docs: ")
print(padded_docs)
# define the model
model = Sequential()
model.add(Embedding(vocab_size, 8, input_length=max_length))
model.add(Flatten())
model.add(Dense(1, activation='sigmoid'))
# compile the model
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['acc'])
# summarize the model
print("==>> model.summary()")
print(model.summary())
# fit the model
model.fit(padded_docs, labels, nb_epoch=100, verbose=0)
# evaluate the model
loss, accuracy = model.evaluate(padded_docs, labels, verbose=0)
print("==>> the final Accuracy: ")
print('Accuracy: %f' % (accuracy*100))

  The output is:    

Using TensorFlow backend.
==>> encoded_docs: 
[[26, 3], [18, 29], [39, 48], [34, 29], [23], [14], [3, 48], [40, 18], [3, 29], [40, 19, 3, 4]]
==>> padded_docs: 
[[26  3  0  0]
 [18 29  0  0]
 [39 48  0  0]
 [34 29  0  0]
 [23  0  0  0]
 [14  0  0  0]
 [ 3 48  0  0]
 [40 18  0  0]
 [ 3 29  0  0]
 [40 19  3  4]]
WARNING:tensorflow:From /usr/local/lib/python2.7/dist-packages/keras/backend/tensorflow_backend.py:1047: calling reduce_prod (from tensorflow.python.ops.math_ops) with keep_dims is deprecated and will be removed in a future version.
Instructions for updating:
keep_dims is deprecated, use keepdims instead
WARNING:tensorflow:From /usr/local/lib/python2.7/dist-packages/keras/backend/tensorflow_backend.py:1108: calling reduce_mean (from tensorflow.python.ops.math_ops) with keep_dims is deprecated and will be removed in a future version.
Instructions for updating:
keep_dims is deprecated, use keepdims instead
==>> model.summary()
____________________________________________________________________________________________________
Layer (type)                     Output Shape          Param #     Connected to
====================================================================================================
embedding_1 (Embedding)          (None, 4, 8)          400         embedding_input_1[0][0]
____________________________________________________________________________________________________
flatten_1 (Flatten)              (None, 32)            0           embedding_1[0][0]
____________________________________________________________________________________________________
dense_1 (Dense)                  (None, 1)              33          flatten_1[0][0]
====================================================================================================
Total params: 433
Trainable params: 433
Non-trainable params: 0
____________________________________________________________________________________________________
None
2018-05-08 14:39:59.053198: I tensorflow/core/platform/cpu_feature_guard.cc:140] Your CPU supports instructions that this TensorFlow binary was not compiled to use: AVX2 FMA
==>> the final Accuracy: 
Accuracy: 89.999998

   

  3. Translating the sentence into vectors using skip-thought vectors may be another good choice. You can also check this tutorial from: 

     

  4. But we still want to use GloVe to extract the vector of each word and concatenate them into a long vector. 

import os
import numpy as np
from collections import OrderedDict
import re, pickle, collections, bcolz
import pdb

seq_home = '/dataset/language-dataset/'
seqlist_path = '/dataset/language-train-video-list.txt'
output_path = 'data/language-train-video-list.pkl'

def load_glove(loc):
    return (bcolz.open(loc+'_glove_vecs.dat')[:],
        pickle.load(open(loc+'_glove_words.pkl', 'rb')),
        pickle.load(open(loc+'_glove_words_idx.pkl', 'rb')))

pre_trained_model_path = '/glove/glove.6B.200d.txt'

###############################################
print('==>> Loading the glove.6B.200d.txt files.')
en_vecs, en_wv_word, en_wv_idx = load_glove(pre_trained_model_path)
en_w2v = {w: en_vecs[en_wv_idx[w]] for w in en_wv_word}
n_en_vec, dim_en_vec = en_vecs.shape


with open(seqlist_path, 'r') as fp:
    seq_list = fp.read().splitlines()

data = {}
for i, seq in enumerate(seq_list):
    img_list = sorted([p for p in os.listdir(seq_home+seq+'/imgs/') if os.path.splitext(p)[1] == '.jpg'])   ## image list
    gt = np.loadtxt(seq_home+seq+'/groundtruth_rect.txt', delimiter=',')    ## gt files
    language_txt = open(seq_home+seq+'/language.txt', "r")     ## natural language description files

    line = language_txt.readline()
    print("==>> language: %s" % (line))

    gloveVector = []
    test_txtName = seq_home+seq+'/glove_vector.txt'
    f = open(test_txtName, "w")

    word_list = line.split(' ')
    for word_idx in range(len(word_list)):
        current_word = word_list[word_idx]
        current_GloVe_vector = en_w2v[current_word]     ## vector dimension is: 200-D

        gloveVector = np.concatenate((gloveVector, current_GloVe_vector), axis=0)
        f.write(str(current_GloVe_vector))
        f.write('\n')
    f.close()

    print(i)

    assert len(img_list) == len(gt), "Lengths do not match!!"

    if gt.shape[1] == 8:
        x_min = np.min(gt[:,[0,2,4,6]], axis=1)[:,None]
        y_min = np.min(gt[:,[1,3,5,7]], axis=1)[:,None]
        x_max = np.max(gt[:,[0,2,4,6]], axis=1)[:,None]
        y_max = np.max(gt[:,[1,3,5,7]], axis=1)[:,None]
        gt = np.concatenate((x_min, y_min, x_max-x_min, y_max-y_min), axis=1)

    data[seq] = {'images': img_list, 'gt': gt, 'gloveVector': gloveVector}

with open(output_path, 'wb') as fp:
    pickle.dump(data, fp, -1)

  The GloVe vectors can be saved into a txt file for each video, as shown in the following screenshot. 

[ 0.19495     0.60610002 -0.077356    0.017301   -0.51370001  0.22421999
 -0.80773002  0.022378    0.30256     1.06669998 -0.10918     0.57902998
  ...                                                                   ]
[  4.32489991e-01   5.34709990e-01  -1.83240008e-02   1.56369999e-01
   6.69689998e-02   7.63140023e-02  -7.16499984e-01   2.41520002e-01
  ...                                                                   ]
(one 200-d GloVe vector per word of the description; the remaining vectors of the file are omitted here)

=================================== 

=====    For the word2vec               

===================================

1. Download and unzip the pre-trained word2vec model (GoogleNews-vectors-negative300.bin).

2. Install the gensim tools:

  sudo pip install --upgrade gensim

3. Code for vector extraction from a given sentence (first, load the pre-trained model):

import gensim

print("==>> loading the pre-trained word2vec model: GoogleNews-vectors-negative300.bin")
dictFileName = './GoogleNews-vectors-negative300.bin'
wv = gensim.models.KeyedVectors.load_word2vec_format(dictFileName, binary=True)
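Once the model is loaded, looking up vectors is straightforward. A minimal sketch, assuming the `wv` object loaded above (the word "king" and the printed neighbours are only illustrative):

vec = wv['king']                        # numpy array of shape (300,)
print(vec.shape)
print(wv.most_similar('king', topn=3))  # nearest neighbours by cosine similarity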

The Output is: 

==>> loading the pre-trained word2vec model: GoogleNews-vectors-negative300.bin

INFO:gensim.models.utils_any2vec:loading projection weights from ./GoogleNews-vectors-negative300.bin
INFO:gensim.models.utils_any2vec:loaded (3000000, 300) matrix from ./GoogleNews-vectors-negative300.bin
INFO:root:Data statistic
INFO:root:train_labels:19337
INFO:root:test_labels:20632
INFO:root:train_sentences:19337
INFO:root:dev_sentences:2000
INFO:root:test_sentences:20632
INFO:root:dev_labels:2000
embed_size:300
vocab_size:3000000

import json

vocab_path = 'data/bilstm.vocab'
index_to_word = [key for key in wv.vocab]
word_to_index = {}
for index, word in enumerate(index_to_word):
    word_to_index[word] = index
with open(vocab_path, "w") as f:
    f.write(json.dumps(word_to_index))
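In later runs the vocabulary can be read back from this JSON file instead of rebuilding it from the .bin model. A small sketch using the path written above (the reload step itself is not part of the original post):

import json

with open('data/bilstm.vocab') as f:
    word_to_index = json.load(f)
index_to_word = sorted(word_to_index, key=word_to_index.get)   # words ordered by their index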

Let's take a deeper look at the bidirectional-LSTM-for-text-classification-master project.

# imports required by the class below
import pdb
import logging

import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.autograd import Variable
from torch.nn.utils import weight_norm


class BiLSTM(nn.Module):
    def __init__(self, embedding_matrix, hidden_size=150, num_layer=2, embedding_freeze=False):
        super(BiLSTM, self).__init__()

        # embedding layer, initialized from the pre-trained embedding matrix
        vocab_size = embedding_matrix.shape[0]
        embed_size = embedding_matrix.shape[1]
        self.hidden_size = hidden_size
        self.num_layer = num_layer
        self.embed = nn.Embedding(vocab_size, embed_size)
        self.embed.weight = nn.Parameter(torch.from_numpy(embedding_matrix).type(torch.FloatTensor),
                                         requires_grad=not embedding_freeze)
        self.embed_dropout = nn.Dropout(p=0.3)
        self.custom_params = []
        if embedding_freeze == False:
            self.custom_params.append(self.embed.weight)

        # the first bidirectional LSTM layer
        self.lstm1 = nn.LSTM(embed_size, self.hidden_size, num_layer, dropout=0.3, bidirectional=True)
        for param in self.lstm1.parameters():
            self.custom_params.append(param)
            if param.data.dim() > 1:
                nn.init.orthogonal(param)
            else:
                nn.init.normal(param)
        self.lstm1_dropout = nn.Dropout(p=0.3)

        # the second bidirectional LSTM layer
        self.lstm2 = nn.LSTM(2*self.hidden_size, self.hidden_size, num_layer, dropout=0.3, bidirectional=True)
        for param in self.lstm2.parameters():
            self.custom_params.append(param)
            if param.data.dim() > 1:
                nn.init.orthogonal(param)
            else:
                nn.init.normal(param)
        self.lstm2_dropout = nn.Dropout(p=0.3)

        # attention layer
        self.attention = nn.Linear(2*self.hidden_size, 1)
        self.attention_dropout = nn.Dropout(p=0.5)

        # fully-connected output layer (3 classes)
        self.fc = weight_norm(nn.Linear(2*self.hidden_size, 3))
        for param in self.fc.parameters():
            self.custom_params.append(param)
            if param.data.dim() > 1:
                nn.init.orthogonal(param)
            else:
                nn.init.normal(param)

        self.hidden1 = self.init_hidden()
        self.hidden2 = self.init_hidden()

    def init_hidden(self, batch_size=3):
        if torch.cuda.is_available():
            return (Variable(torch.zeros(self.num_layer*2, batch_size, self.hidden_size)).cuda(),
                    Variable(torch.zeros(self.num_layer*2, batch_size, self.hidden_size)).cuda())
        else:
            return (Variable(torch.zeros(self.num_layer*2, batch_size, self.hidden_size)),
                    Variable(torch.zeros(self.num_layer*2, batch_size, self.hidden_size)))

    def forward(self, sentences):
        print("==>> sentences: ", sentences)

        # unpack the input PackedSequence and look up the embedding vectors
        padded_sentences, lengths = torch.nn.utils.rnn.pad_packed_sequence(sentences, padding_value=int(0), batch_first=True)
        print("==>> padded_sentences: ", padded_sentences)

        embeds = self.embed(padded_sentences)
        print("==>> embeds: ", embeds)

        # pdb.set_trace()

        # add Gaussian noise to the embeddings, then apply dropout
        noise = Variable(torch.zeros(embeds.shape).cuda())
        noise.data.normal_(std=0.3)
        embeds += noise
        embeds = self.embed_dropout(embeds)

        packed_embeds = torch.nn.utils.rnn.pack_padded_sequence(embeds, lengths, batch_first=True)
        print("==>> packed_embeds: ", packed_embeds)

        # first LSTM layer; hidden state has shape (num_layers*num_directions, batch_size, hidden_size)
        packed_out_lstm1, self.hidden1 = self.lstm1(packed_embeds, self.hidden1)
        padded_out_lstm1, lengths = torch.nn.utils.rnn.pad_packed_sequence(packed_out_lstm1, padding_value=int(0))
        padded_out_lstm1 = self.lstm1_dropout(padded_out_lstm1)
        packed_out_lstm1 = torch.nn.utils.rnn.pack_padded_sequence(padded_out_lstm1, lengths)

        pdb.set_trace()

        # second LSTM layer
        packed_out_lstm2, self.hidden2 = self.lstm2(packed_out_lstm1, self.hidden2)
        padded_out_lstm2, lengths = torch.nn.utils.rnn.pad_packed_sequence(packed_out_lstm2, padding_value=int(0), batch_first=True)
        padded_out_lstm2 = self.lstm2_dropout(padded_out_lstm2)

        # attention: score each time step, normalize the weights, then take the weighted sum
        unnormalize_weight = F.tanh(torch.squeeze(self.attention(padded_out_lstm2), 2))
        unnormalize_weight = F.softmax(unnormalize_weight, dim=1)
        unnormalize_weight = torch.nn.utils.rnn.pack_padded_sequence(unnormalize_weight, lengths, batch_first=True)
        unnormalize_weight, lengths = torch.nn.utils.rnn.pad_packed_sequence(unnormalize_weight, padding_value=0.0, batch_first=True)
        logging.debug("unnormalize_weight size: %s" % (str(unnormalize_weight.size())))
        normalize_weight = torch.nn.functional.normalize(unnormalize_weight, p=1, dim=1)
        normalize_weight = normalize_weight.view(normalize_weight.size(0), 1, -1)
        weighted_sum = torch.squeeze(normalize_weight.bmm(padded_out_lstm2), 1)

        # fully-connected layer on the attention-weighted sentence representation
        output = self.fc(self.attention_dropout(weighted_sum))
        return output
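Before this class can be used, an embedding_matrix has to be built from the pre-trained word2vec vectors. A hedged sketch of how the pieces above might be wired together (the loop over `wv.vocab` and the construction below are assumptions, not code from the repository):

import numpy as np
import torch

# stack the gensim vectors into a (vocab_size, embed_size) numpy matrix
# note: for the full 3M-word vocabulary this matrix takes several GB of RAM
embedding_matrix = np.zeros((len(wv.vocab), wv.vector_size), dtype=np.float32)
for index, word in enumerate(wv.vocab):
    embedding_matrix[index] = wv[word]

model = BiLSTM(embedding_matrix, hidden_size=150, num_layer=2, embedding_freeze=False)
if torch.cuda.is_available():
    model = model.cuda()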

 

==>> Some Testing: 

(a). len(wv.vocab) = 3,000,000 

(b). What does wv.vocab contain? Something like this: 

{ ... , u'fivemonth': <gensim.models.keyedvectors.Vocab object at 0x7f90945bd810>,

u'retractable_roofs_Indians': <gensim.models.keyedvectors.Vocab object at 0x7f90785f5690>,

u'Dac_Lac_province': <gensim.models.keyedvectors.Vocab object at 0x7f908d8eda10>, 

u'Kenneth_Klinge': <gensim.models.keyedvectors.Vocab object at 0x7f9081563410>}

(c). index_to_word: the list of words collected from the pre-trained model. 

{ ... , u"Lina'la_Sin_Casino", u'fivemonth', u'retractable_roofs_Indians', u'Dac_Lac_province', u'Kenneth_Klinge'}

(d). word_to_index: assigns each word an index, as follows: 

{ ... , u'fivemonth': 2999996, u'Pidduck': 2999978, u'Dac_Lac_province': 2999998, u'Kenneth_Klinge': 2999999 } 

(e). the dataset is (each inner list is one sentence mapped to word indices): 

[ ... , [1837614, 1569052, 1837614, 1288695, 2221039, 2323218, 1837614, 1837614, 2029395, 1612781, 311032, 1524921, 1837614, 2973515, 2033866, 882731, 2462275, 2809106, 1479961, 826019, 73590, 953550, 1837614],

[1524921, 1113778, 1837614, 318169, 1837614, 1954969, 196613, 943118, 1837614, 2687790, 291413, 2774825, 2366038, 296869, 1468080, 856987, 1802099, 724308, 1207907, 2264894, 2206446, 812434],

[564298, 477983, 1837614, 1449153, 1837614, 211925, 2206446, 481834, 488597, 280760, 2072822, 1344872, 1791678, 2458776, 2965810, 2168205, 387112, 2656471, 1391, 1837614, 1801696, 2093846, 1210651],

[2493381, 133883, 2441902, 1014220, 1837614, 2597880, 1756105, 2651537, 1391, 2641114, 2517536, 1109601, 122269, 1782479, 2965835, 488597, 1767716, 753333, 564298, 2380935, 228060, 1837614, 371618],

[1837614, 1344872, 2458776, 2965810, 1837614, 2015408, 1837614, 528014, 1991322, 1837614, 908982, 1484130, 2349526, 988689, 753336, 1837614, 364492, 2183116, 826019, 73590, 953550, 1837614],

[1837614, 2673785, 1990947, 1219831, 2635341, 1247040, 1837614, 799543, 1990947, 1219831, 2722301, 1837614, 1427513, 969099, 2157673, 1430111, 1837614]]
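A hedged sketch (not from the original repository) of how each raw sentence is probably turned into one of these index lists: every token is looked up in word_to_index, and out-of-vocabulary tokens fall back to a default index (1837614 appears very often above, so it is presumably the unknown-word index):

UNK_INDEX = 1837614   # assumption: the index used for unknown words

def sentence_to_indices(tokens, word_to_index, unk_index=UNK_INDEX):
    return [word_to_index.get(tok, unk_index) for tok in tokens]

# `train_sentences` is assumed to be a list of raw sentence strings
dataset = [sentence_to_indices(sent.split(), word_to_index) for sent in train_sentences]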

 

 

(f). the variables printed during the training process: 

 

# ('==>> sentences: ', PackedSequence(data=tensor(

# [ 4.0230e+05, 1.8376e+06, 2.0185e+06, 1.8376e+06, 2.8157e+06,
# 1.8376e+06, 1.8376e+06, 1.0394e+06, 1.8376e+06, 2.9841e+06,
# 4.4713e+05, 1.1352e+06, 2.3532e+06, 1.8376e+06, 1.8376e+06,
# 1.8376e+06, 1.9550e+06, 3.8429e+04, 6.2537e+05, 2.3764e+05,
# 1.8376e+06, 1.5428e+06, 1.4214e+06], device='cuda:0'),
# batch_sizes=tensor([ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1])))

 

# ('==>> padded_sentences: ', tensor(
# [[ 4.0230e+05, 1.8376e+06, 2.0185e+06, 1.8376e+06, 2.8157e+06,
# 1.8376e+06, 1.8376e+06, 1.0394e+06, 1.8376e+06, 2.9841e+06,
# 4.4713e+05, 1.1352e+06, 2.3532e+06, 1.8376e+06, 1.8376e+06,
# 1.8376e+06, 1.9550e+06, 3.8429e+04, 6.2537e+05, 2.3764e+05,
# 1.8376e+06, 1.5428e+06, 1.4214e+06]], device='cuda:0'))

 

# length: 23

 

# ('==>> embeds: ', tensor([[
# [-0.0684, 0.1826, -0.1777, ..., 0.1904, -0.1021, 0.1729],
# [ 0.0801, 0.1050, 0.0498, ..., 0.0037, 0.0476, -0.0688],
# [-0.1982, -0.0693, 0.1230, ..., -0.1357, -0.0306, 0.1104],
# ...,
# [ 0.0801, 0.1050, 0.0498, ..., 0.0037, 0.0476, -0.0688],
# [-0.0518, -0.0299, 0.0415, ..., 0.0776, -0.1660, 0.1602],
# [-0.0532, -0.0004, 0.0337, ..., -0.2373, -0.1709, 0.0233]]], device='cuda:0'))

 

 

# ('==>> packed_embeds: ', PackedSequence(data=tensor([

# [-0.3647, 0.2966, -0.2359, ..., -0.0000, 0.2657, -0.4302],
# [ 1.1699, 0.0000, 0.3312, ..., 0.5714, 0.1930, -0.2267],
# [-0.0627, -1.0548, 0.4966, ..., -0.5135, -0.0150, -0.0000],
# ...,
# [-0.6065, -0.7562, 0.3320, ..., -0.5854, -0.2089, -0.5737],
# [-0.0000, 0.4390, 0.0000, ..., 0.6891, 0.0250, -0.0000],
# [ 0.6909, -0.0000, -0.0000, ..., 0.1867, 0.0594, -0.2385]], device='cuda:0'),
# batch_sizes=tensor([ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1])))

 

# packed_out_lstm1

# PackedSequence(data=tensor([
# [-0.0000, -0.0000, 0.1574, ..., 0.1864, -0.0901, -0.3205],
# [-0.0000, -0.3490, 0.1774, ..., 0.1677, -0.0000, -0.3688],
# [-0.3055, -0.0000, 0.2240, ..., 0.0000, -0.0927, -0.0000],
# ...,
# [-0.3188, -0.4134, 0.1339, ..., 0.3161, -0.0000, -0.3846],
# [-0.3355, -0.4365, 0.1575, ..., 0.2775, -0.0886, -0.4015],
# [-0.0000, -0.0000, 0.2452, ..., 0.1763, -0.0000, -0.2748]], device='cuda:0'),
# batch_sizes=tensor([ 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1])) 

 

## self.hidden1

# self.hidden1 is a tuple (h, c); with num_layer=2, bidirectional=True and batch_size=1,
# each tensor has shape (num_layer*2, batch_size, hidden_size) = (4, 1, 150).
# (The full numeric printout is omitted here.)

 

######## Padding functions used in PyTorch #########

1. torch.nn.utils.rnn.PackedSequence(*args) 

  Holds the data and list of batch_sizes of a packed sequence. 

  All RNN modules accept packed sequences as inputs. 

2.  torch.nn.utils.rnn.pack_padded_sequence(input, lengths, batch_first=False) 

  Packs a Tensor containing padded sequences of variable length.

  Input can be of size T x B x *, where T is the length of the longest sequence, B is the batch size, and * is any number of trailing dimensions.

  If batch_first is True, inputs of size B x T x * are expected. The sequences should be sorted by length in decreasing order.
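Because of that requirement, a batch is usually sorted by length before packing. A small illustrative sketch (tensor names and sizes are made up, not taken from the code above):

import torch

lengths = torch.tensor([3, 5, 2])                      # true length of each padded sequence
padded = torch.randn(3, 5, 8)                          # (batch, max_len, embed_size), zero-padded

sorted_lengths, perm = lengths.sort(descending=True)   # reorder the batch by decreasing length
sorted_padded = padded[perm]
# ... pack, run the RNN, unpack ...
_, inv_perm = perm.sort()                              # inverse permutation to restore the original order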

3. torch.nn.utils.rnn.pad_packed_sequence(sequence, batch_first=False, padding_value=0.0, total_length=None)

  Pads a packed batch of variable length sequences.

  It is an inverse operation to pack_padded_sequence(). 

  Batch elements will be ordered decreasingly by their length. 

  Note: total_length is useful for implementing the pack sequence -> RNN -> unpack sequence pattern when the output must be padded back to a fixed length.  

  Returns: a tuple of the tensor containing the padded sequences and a tensor containing the lengths of each sequence in the batch.

4. torch.nn.utils.rnn.pad_sequence(sequences, batch_first=False, padding_value=0)

  Pads a list of variable length Tensors with padding_value (zero by default).

 

5. torch.nn.utils.rnn.pack_sequence(sequences)

  Packs a list of variable length Tensors. 
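A tiny example of pack_sequence (the values are made up) to show how the data and batch_sizes fields are laid out:

import torch
from torch.nn.utils.rnn import pack_sequence

a = torch.tensor([1, 2, 3])
b = torch.tensor([4, 5])
c = torch.tensor([6])
packed = pack_sequence([a, b, c])   # sequences must be given in decreasing length order
# packed.data        -> tensor([1, 4, 6, 2, 5, 3])  (time-major interleaving)
# packed.batch_sizes -> tensor([3, 2, 1])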

6. Tutorials on these functions.  

  (1). https://zhuanlan.zhihu.com/p/34418001

  (2). https://zhuanlan.zhihu.com/p/28472545

 

To summarize: when using a recurrent neural network on variable-length sentence sequences, the two functions are used as a pair:

  torch.nn.utils.rnn.pack_padded_sequence()   to pack the (already padded) sentences of a mini-batch before feeding them to the RNN;

  torch.nn.utils.rnn.pad_packed_sequence()   to unpack the RNN output afterwards, so that the padding does not affect the sentence representations.
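A minimal, self-contained sketch of this pack -> RNN -> unpack pattern (names and sizes are illustrative only):

import torch
import torch.nn as nn
from torch.nn.utils.rnn import pack_padded_sequence, pad_packed_sequence

batch_size, max_len, embed_size, hidden_size = 3, 5, 8, 6
lengths = torch.tensor([5, 3, 2])                       # already sorted in decreasing order
padded = torch.randn(batch_size, max_len, embed_size)   # zero-padded inputs

lstm = nn.LSTM(embed_size, hidden_size, batch_first=True, bidirectional=True)

packed = pack_padded_sequence(padded, lengths, batch_first=True)
packed_out, (h_n, c_n) = lstm(packed)
out, out_lengths = pad_packed_sequence(packed_out, batch_first=True, padding_value=0.0)

print(out.shape)       # (3, 5, 12): padded time steps are filled with padding_value, not RNN garbage
print(out_lengths)     # tensor([5, 3, 2])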


===

Reposted from: https://www.cnblogs.com/wangxiaocvpr/p/8969137.html

Repost source: https://blog.csdn.net/a1424262219/article/details/102148589
