<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>06_Encoder_Decoder</title>
<link rel="stylesheet" href="https://stackedit.io/style.css" />
</head>
<body class="stackedit">
<div class="stackedit__html"><h1 id="encoder-decoder">06 Encoder Decoder</h1>
<h2 id="assignment">Assignment</h2>
<ol>
<li>Take the last code (+ tweet dataset) and convert it in such a way that:
<ol>
<li><em>encoder:</em> an RNN/LSTM layer takes the words in a sentence one by one and finally converts them into a single vector. <strong>VERY IMPORTANT TO MAKE THIS SINGLE VECTOR</strong></li>
<li>this single vector is then sent to another RNN/LSTM that also takes the last prediction as its second input. Then we take the final vector from this Cell</li>
<li>and send this final vector to a Linear Layer and make the final prediction.</li>
<li>This is how it will look:
<ol>
<li>embedding</li>
<li><em>word from a sentence + last hidden vector -></em> encoder <em>-> single vector</em></li>
<li><em>single vector + last hidden vector -> decoder -> single vector</em></li>
<li><em>single vector -> FC layer -> Prediction</em></li>
</ol>
</li>
</ol>
</li>
<li>Your code will be checked for plagiarism, and if we find that you have copied from the internet, then -100%.</li>
<li>The code needs to look as simple as possible, the focus is on making encoder/decoder classes and how to link objects together</li>
<li>Getting good accuracy is NOT the target, but you must achieve at least <strong>45%</strong> or more</li>
<li>Once the model is trained, take one sentence, “print the outputs” of the encoder for each step and “print the outputs” for each step of the decoder. ← <strong>THIS IS THE ACTUAL ASSIGNMENT</strong></li>
</ol>
<h2 id="solution">Solution</h2>
<table>
<thead>
<tr>
<th>Dataset</th>
<th>Model</th>
</tr>
</thead>
<tbody>
<tr>
<td><a href="https://github.com/satyajitghana/TSAI-DeepNLP-END2.0/blob/main/06_Encoder_Decoder/Tweets_Dataset.ipynb">Github</a></td>
<td><a href="https://github.com/satyajitghana/TSAI-DeepNLP-END2.0/blob/main/06_Encoder_Decoder/Tweets_Model.ipynb">Github</a></td>
</tr>
<tr>
<td><a href="https://githubtocolab.com/satyajitghana/TSAI-DeepNLP-END2.0/blob/main/06_Encoder_Decoder/Tweets_Dataset.ipynb">Colab</a></td>
<td><a href="https://github.com/satyajitghana/TSAI-DeepNLP-END2.0/blob/main/06_Encoder_Decoder/Tweets_Model.ipynb">Colab</a></td>
</tr>
</tbody>
</table><p>The dataset consists of <code>1341</code> (cleaned) tweets, each labelled as one of <code>[Negative, Positive, Neutral]</code>:</p>
<pre><code>Negative: 931, Positive: 352, Neutral: 81
</code></pre>
<p>About 68% of the tweets are Negative, so the model has to beat at least that majority-class baseline before I can claim it is learning anything.</p>
<p><img src="https://y.yarn.co/4ffcc7a9-5e92-4811-a0cb-f5fd684cff05_text.gif" alt="bar too low"></p>
<p>The real question, though, is how low we can go on the number of parameters.</p>
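<p>One way to keep score on that, once a model object exists, is the usual PyTorch one-liner (a generic idiom, not something taken from the notebook):</p>
<pre class=" language-python"><code class="prism language-python">import torch

# Count the trainable parameters of any torch.nn.Module.
def count_parameters(model: torch.nn.Module) -> int:
    return sum(p.numel() for p in model.parameters() if p.requires_grad)
</code></pre>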
<hr>
<p>Highest Test Accuracy: <code>81.33%</code><br>
Epochs: <code>5</code></p>
<p>TensorBoard experiment logs: <a href="https://tensorboard.dev/experiment/HaR2fvGGSn6gWznIGb645Q/#scalars">https://tensorboard.dev/experiment/HaR2fvGGSn6gWznIGb645Q/#scalars</a></p>
<p><img src="https://github.com/satyajitghana/TSAI-DeepNLP-END2.0/blob/main/06_Encoder_Decoder/val_accuracy.png?raw=true" alt="val accuracy"></p>
<h2 id="model">Model</h2>
<h3 id="the-encoder">The Encoder</h3>
<pre class=" language-python"><code class="prism language-python"><span class="token keyword">class</span> <span class="token class-name">Encoder</span><span class="token punctuation">(</span>nn<span class="token punctuation">.</span>Module<span class="token punctuation">)</span><span class="token punctuation">:</span>
<span class="token keyword">def</span> <span class="token function">__init__</span><span class="token punctuation">(</span>self<span class="token punctuation">,</span> input_dim<span class="token operator">=</span><span class="token number">300</span><span class="token punctuation">,</span> hidden_dim<span class="token operator">=</span><span class="token number">16</span><span class="token punctuation">,</span> proj_dim<span class="token operator">=</span><span class="token number">64</span><span class="token punctuation">)</span><span class="token punctuation">:</span>
<span class="token builtin">super</span><span class="token punctuation">(</span>self<span class="token punctuation">.</span>__class__<span class="token punctuation">,</span> self<span class="token punctuation">)</span><span class="token punctuation">.</span>__init__<span class="token punctuation">(</span><span class="token punctuation">)</span>
self<span class="token punctuation">.</span>input_dim <span class="token operator">=</span> input_dim
self<span class="token punctuation">.</span>hidden_dim <span class="token operator">=</span> hidden_dim
self<span class="token punctuation">.</span>proj_dim <span class="token operator">=</span> proj_dim
self<span class="token punctuation">.</span>encode_lstm <span class="token operator">=</span> nn<span class="token punctuation">.</span>LSTMCell<span class="token punctuation">(</span>self<span class="token punctuation">.</span>input_dim<span class="token punctuation">,</span> self<span class="token punctuation">.</span>hidden_dim<span class="token punctuation">,</span> bias<span class="token operator">=</span><span class="token boolean">False</span><span class="token punctuation">)</span>
self<span class="token punctuation">.</span>encoder_proj <span class="token operator">=</span> nn<span class="token punctuation">.</span>Linear<span class="token punctuation">(</span>self<span class="token punctuation">.</span>hidden_dim<span class="token punctuation">,</span> self<span class="token punctuation">.</span>proj_dim<span class="token punctuation">,</span> bias<span class="token operator">=</span><span class="token boolean">False</span><span class="token punctuation">)</span>
<span class="token keyword">def</span> <span class="token function">init_hidden</span><span class="token punctuation">(</span>self<span class="token punctuation">,</span> device<span class="token punctuation">,</span> batch_size<span class="token punctuation">)</span><span class="token punctuation">:</span>
zeros <span class="token operator">=</span> torch<span class="token punctuation">.</span>zeros<span class="token punctuation">(</span>batch_size<span class="token punctuation">,</span> self<span class="token punctuation">.</span>hidden_dim<span class="token punctuation">,</span> device<span class="token operator">=</span>device<span class="token punctuation">)</span>
<span class="token keyword">return</span> <span class="token punctuation">(</span>zeros<span class="token punctuation">,</span> zeros<span class="token punctuation">)</span>
<span class="token keyword">def</span> <span class="token function">forward</span><span class="token punctuation">(</span>self<span class="token punctuation">,</span> sequences<span class="token punctuation">,</span> lengths<span class="token punctuation">,</span> hidden_state<span class="token punctuation">,</span> debug<span class="token operator">=</span><span class="token boolean">False</span><span class="token punctuation">)</span><span class="token punctuation">:</span>
<span class="token punctuation">(</span>hh<span class="token punctuation">,</span> cc<span class="token punctuation">)</span> <span class="token operator">=</span> hidden_state
<span class="token keyword">for</span> idx <span class="token keyword">in</span> <span class="token builtin">range</span><span class="token punctuation">(</span>lengths<span class="token punctuation">[</span><span class="token number">0</span><span class="token punctuation">]</span><span class="token punctuation">)</span><span class="token punctuation">:</span>
<span class="token punctuation">(</span>hh<span class="token punctuation">,</span> cc<span class="token punctuation">)</span> <span class="token operator">=</span> self<span class="token punctuation">.</span>encode_lstm<span class="token punctuation">(</span>sequences<span class="token punctuation">[</span><span class="token number">0</span><span class="token punctuation">]</span><span class="token punctuation">[</span>idx<span class="token punctuation">]</span><span class="token punctuation">.</span>unsqueeze<span class="token punctuation">(</span><span class="token number">0</span><span class="token punctuation">)</span><span class="token punctuation">,</span> <span class="token punctuation">(</span>hh<span class="token punctuation">,</span> cc<span class="token punctuation">)</span><span class="token punctuation">)</span>
<span class="token comment"># print(hx[0][0].numpy())</span>
<span class="token keyword">if</span> debug<span class="token punctuation">:</span>
sns<span class="token punctuation">.</span>heatmap<span class="token punctuation">(</span>hh<span class="token punctuation">[</span><span class="token number">0</span><span class="token punctuation">]</span><span class="token punctuation">.</span>detach<span class="token punctuation">(</span><span class="token punctuation">)</span><span class="token punctuation">.</span>numpy<span class="token punctuation">(</span><span class="token punctuation">)</span><span class="token punctuation">.</span>reshape<span class="token punctuation">(</span><span class="token operator">-</span><span class="token number">1</span><span class="token punctuation">,</span> <span class="token number">4</span><span class="token punctuation">)</span><span class="token punctuation">,</span> fmt<span class="token operator">=</span><span class="token string">".2f"</span><span class="token punctuation">,</span> vmin<span class="token operator">=</span><span class="token operator">-</span><span class="token number">1</span><span class="token punctuation">,</span> vmax<span class="token operator">=</span><span class="token number">1</span><span class="token punctuation">,</span> annot<span class="token operator">=</span><span class="token boolean">True</span><span class="token punctuation">,</span> cmap<span class="token operator">=</span><span class="token string">"YlGnBu"</span><span class="token punctuation">)</span><span class="token punctuation">.</span><span class="token builtin">set</span><span class="token punctuation">(</span>title<span class="token operator">=</span>f<span class="token string">"Encoder Hidden State, step={idx}"</span><span class="token punctuation">)</span>
plt<span class="token punctuation">.</span>show<span class="token punctuation">(</span><span class="token punctuation">)</span>
encoder_sv <span class="token operator">=</span> self<span class="token punctuation">.</span>encoder_proj<span class="token punctuation">(</span>hh<span class="token punctuation">)</span>
<span class="token keyword">if</span> debug<span class="token punctuation">:</span>
sns<span class="token punctuation">.</span>heatmap<span class="token punctuation">(</span>encoder_sv<span class="token punctuation">[</span><span class="token number">0</span><span class="token punctuation">]</span><span class="token punctuation">.</span>detach<span class="token punctuation">(</span><span class="token punctuation">)</span><span class="token punctuation">.</span>numpy<span class="token punctuation">(</span><span class="token punctuation">)</span><span class="token punctuation">.</span>reshape<span class="token punctuation">(</span><span class="token operator">-</span><span class="token number">1</span><span class="token punctuation">,</span> <span class="token number">1</span><span class="token punctuation">)</span><span class="token punctuation">,</span> fmt<span class="token operator">=</span><span class="token string">".2f"</span><span class="token punctuation">,</span> vmin<span class="token operator">=</span><span class="token operator">-</span><span class="token number">1</span><span class="token punctuation">,</span> vmax<span class="token operator">=</span><span class="token number">1</span><span class="token punctuation">,</span> annot<span class="token operator">=</span><span class="token boolean">True</span><span class="token punctuation">,</span> cmap<span class="token operator">=</span><span class="token string">"YlGnBu"</span><span class="token punctuation">)</span><span class="token punctuation">.</span><span class="token builtin">set</span><span class="token punctuation">(</span>title<span class="token operator">=</span>f<span class="token string">"Encoder Single Vector"</span><span class="token punctuation">)</span>
plt<span class="token punctuation">.</span>show<span class="token punctuation">(</span><span class="token punctuation">)</span>
<span class="token keyword">return</span> encoder_sv<span class="token punctuation">,</span> <span class="token punctuation">(</span>hh<span class="token punctuation">,</span> cc<span class="token punctuation">)</span>
</code></pre>
<h3 id="the-decoder">The Decoder</h3>
<pre class=" language-python"><code class="prism language-python"><span class="token keyword">class</span> <span class="token class-name">Decoder</span><span class="token punctuation">(</span>nn<span class="token punctuation">.</span>Module<span class="token punctuation">)</span><span class="token punctuation">:</span>
<span class="token keyword">def</span> <span class="token function">__init__</span><span class="token punctuation">(</span>self<span class="token punctuation">,</span> input_dim<span class="token operator">=</span><span class="token number">64</span><span class="token punctuation">,</span> hidden_dim<span class="token operator">=</span><span class="token number">16</span><span class="token punctuation">,</span> proj_dim<span class="token operator">=</span><span class="token number">64</span><span class="token punctuation">)</span><span class="token punctuation">:</span>
<span class="token builtin">super</span><span class="token punctuation">(</span>self<span class="token punctuation">.</span>__class__<span class="token punctuation">,</span> self<span class="token punctuation">)</span><span class="token punctuation">.</span>__init__<span class="token punctuation">(</span><span class="token punctuation">)</span>
self<span class="token punctuation">.</span>input_dim <span class="token operator">=</span> input_dim
self<span class="token punctuation">.</span>hidden_dim <span class="token operator">=</span> hidden_dim
self<span class="token punctuation">.</span>proj_dim <span class="token operator">=</span> proj_dim
self<span class="token punctuation">.</span>decode_lstm <span class="token operator">=</span> nn<span class="token punctuation">.</span>LSTMCell<span class="token punctuation">(</span>self<span class="token punctuation">.</span>input_dim<span class="token punctuation">,</span> self<span class="token punctuation">.</span>hidden_dim<span class="token punctuation">,</span> bias<span class="token operator">=</span><span class="token boolean">False</span><span class="token punctuation">)</span>
self<span class="token punctuation">.</span>decoder_proj <span class="token operator">=</span> nn<span class="token punctuation">.</span>Linear<span class="token punctuation">(</span>self<span class="token punctuation">.</span>hidden_dim<span class="token punctuation">,</span> self<span class="token punctuation">.</span>proj_dim<span class="token punctuation">,</span> bias<span class="token operator">=</span><span class="token boolean">False</span><span class="token punctuation">)</span>
<span class="token keyword">def</span> <span class="token function">forward</span><span class="token punctuation">(</span>self<span class="token punctuation">,</span> encoder_inp<span class="token punctuation">,</span> hidden_state<span class="token punctuation">,</span> max_steps<span class="token operator">=</span><span class="token number">5</span><span class="token punctuation">,</span> debug<span class="token operator">=</span><span class="token boolean">False</span><span class="token punctuation">)</span><span class="token punctuation">:</span>
<span class="token punctuation">(</span>hh<span class="token punctuation">,</span> cc<span class="token punctuation">)</span> <span class="token operator">=</span> hidden_state
<span class="token keyword">for</span> idx <span class="token keyword">in</span> <span class="token builtin">range</span><span class="token punctuation">(</span>max_steps<span class="token punctuation">)</span><span class="token punctuation">:</span>
<span class="token punctuation">(</span>hh<span class="token punctuation">,</span> cc<span class="token punctuation">)</span> <span class="token operator">=</span> self<span class="token punctuation">.</span>decode_lstm<span class="token punctuation">(</span>encoder_inp<span class="token punctuation">,</span> <span class="token punctuation">(</span>hh<span class="token punctuation">,</span> cc<span class="token punctuation">)</span><span class="token punctuation">)</span>
<span class="token keyword">if</span> debug<span class="token punctuation">:</span>
sns<span class="token punctuation">.</span>heatmap<span class="token punctuation">(</span>hh<span class="token punctuation">[</span><span class="token number">0</span><span class="token punctuation">]</span><span class="token punctuation">.</span>detach<span class="token punctuation">(</span><span class="token punctuation">)</span><span class="token punctuation">.</span>numpy<span class="token punctuation">(</span><span class="token punctuation">)</span><span class="token punctuation">.</span>reshape<span class="token punctuation">(</span><span class="token operator">-</span><span class="token number">1</span><span class="token punctuation">,</span> <span class="token number">4</span><span class="token punctuation">)</span><span class="token punctuation">,</span> fmt<span class="token operator">=</span><span class="token string">".2f"</span><span class="token punctuation">,</span> vmin<span class="token operator">=</span><span class="token operator">-</span><span class="token number">1</span><span class="token punctuation">,</span> vmax<span class="token operator">=</span><span class="token number">1</span><span class="token punctuation">,</span> annot<span class="token operator">=</span><span class="token boolean">True</span><span class="token punctuation">,</span> cmap<span class="token operator">=</span><span class="token string">"YlGnBu"</span><span class="token punctuation">)</span><span class="token punctuation">.</span><span class="token builtin">set</span><span class="token punctuation">(</span>title<span class="token operator">=</span>f<span class="token string">"Decoder Hidden State, step={idx}"</span><span class="token punctuation">)</span>
plt<span class="token punctuation">.</span>show<span class="token punctuation">(</span><span class="token punctuation">)</span>
decoder_sv <span class="token operator">=</span> self<span class="token punctuation">.</span>decoder_proj<span class="token punctuation">(</span>hh<span class="token punctuation">)</span>
<span class="token keyword">if</span> debug<span class="token punctuation">:</span>
sns<span class="token punctuation">.</span>heatmap<span class="token punctuation">(</span>decoder_sv<span class="token punctuation">[</span><span class="token number">0</span><span class="token punctuation">]</span><span class="token punctuation">.</span>detach<span class="token punctuation">(</span><span class="token punctuation">)</span><span class="token punctuation">.</span>numpy<span class="token punctuation">(</span><span class="token punctuation">)</span><span class="token punctuation">.</span>reshape<span class="token punctuation">(</span><span class="token operator">-</span><span class="token number">1</span><span class="token punctuation">,</span> <span class="token number">1</span><span class="token punctuation">)</span><span class="token punctuation">,</span> fmt<span class="token operator">=</span><span class="token string">".2f"</span><span class="token punctuation">,</span> vmin<span class="token operator">=</span><span class="token operator">-</span><span class="token number">1</span><span class="token punctuation">,</span> vmax<span class="token operator">=</span><span class="token number">1</span><span class="token punctuation">,</span> annot<span class="token operator">=</span><span class="token boolean">True</span><span class="token punctuation">,</span> cmap<span class="token operator">=</span><span class="token string">"YlGnBu"</span><span class="token punctuation">)</span><span class="token punctuation">.</span><span class="token builtin">set</span><span class="token punctuation">(</span>title<span class="token operator">=</span>f<span class="token string">"Decoder Single Vector"</span><span class="token punctuation">)</span>
plt<span class="token punctuation">.</span>show<span class="token punctuation">(</span><span class="token punctuation">)</span>
<span class="token keyword">return</span> decoder_sv
</code></pre>
<p>This is as simple as it gets!</p>
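<p>For completeness, here is a minimal sketch of how the two classes could be linked with the final Linear layer. The class name <code>TweetClassifier</code>, the <code>nn.Linear(64, 3)</code> output layer and the choice of seeding the decoder with the encoder's final state are my assumptions, not code from the notebook:</p>
<pre class=" language-python"><code class="prism language-python"># Hypothetical wrapper around the Encoder and Decoder above (names/shapes are assumptions).
import torch.nn as nn

class TweetClassifier(nn.Module):
    def __init__(self, num_classes=3):
        super().__init__()
        self.encoder = Encoder(input_dim=300, hidden_dim=16, proj_dim=64)
        self.decoder = Decoder(input_dim=64, hidden_dim=16, proj_dim=64)
        self.fc = nn.Linear(64, num_classes, bias=False)  # decoder single vector -> class scores

    def forward(self, sequences, lengths, debug=False):
        # sequences: [1, seq_len, 300] GloVe-embedded tweet, lengths: [seq_len]
        hidden = self.encoder.init_hidden(sequences.device, batch_size=1)
        encoder_sv, (hh, cc) = self.encoder(sequences, lengths, hidden, debug=debug)
        # seeding the decoder with the encoder's final state is one plausible choice (the dims match)
        decoder_sv = self.decoder(encoder_sv, (hh, cc), debug=debug)
        return self.fc(decoder_sv)
</code></pre>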
<h2 id="experiments">Experiments</h2>
<table>
<thead>
<tr>
<th>Encoder <code>[hidden_dim]</code></th>
<th>Decoder <code>[hidden_dim]</code></th>
<th>Augmentation</th>
<th>Epochs</th>
<th>Test Accuracy</th>
<th>Remark</th>
</tr>
</thead>
<tbody>
<tr>
<td>64</td>
<td>64</td>
<td>none</td>
<td>5</td>
<td><strong>81.3</strong></td>
<td>Overfits; can be better</td>
</tr>
<tr>
<td>32</td>
<td>32</td>
<td>none</td>
<td>5</td>
<td>80.6</td>
<td>Reduce <code>hidden_dim</code></td>
</tr>
<tr>
<td><strong>16</strong></td>
<td><strong>16</strong></td>
<td><strong>none</strong></td>
<td><strong>5</strong></td>
<td>80.6</td>
<td>Reduce <code>hidden_dim</code>; still good with just 16 dims!</td>
</tr>
</tbody>
</table><h3 id="conclusions-and-monologue">Conclusions and Monologue</h3>
<ul>
<li>The accuracy was really high, even with just 16 dims. But how? Why?</li>
<li>This time I used pretrained <code>GloVe</code> vectors <code>o(* ̄▽ ̄*)ブ</code></li>
<li><code>( ̄y▽ ̄)╭ Ohohoho.....</code></li>
<li>The dataset is relatively small, so I am not too surprised: it's just ~1300 tweets, split into roughly 1000 for training and 300 for testing, and there is a huge class imbalance.</li>
<li>The embeddings were trained on 6B tokens and have a dimension of <code>300</code>, which takes away much of the network's pain. The main component doing the work now is the encoder LSTM, which has to extract the information the decoder needs in order to understand what is happening and pass a judgement.</li>
<li>I've also created an issue in <code>torchtext</code> on how to use <code>Vectors</code> (a minimal loading sketch follows below): <a href="https://github.com/pytorch/text/issues/1323">https://github.com/pytorch/text/issues/1323</a></li>
</ul>
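<p>For reference, a minimal sketch of pulling the pretrained vectors through <code>torchtext</code>; the exact loading code in the notebook may differ (that is what the issue above is about):</p>
<pre class=" language-python"><code class="prism language-python"># Sketch: load the 300-d "6B" GloVe vectors via torchtext and embed a tokenized tweet.
from torchtext.vocab import GloVe

glove = GloVe(name="6B", dim=300)            # downloads/caches the pretrained vectors
tokens = ["obama", "released", "his", "tax", "returns"]
embedded = glove.get_vecs_by_tokens(tokens)  # tensor of shape [len(tokens), 300]
</code></pre>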
<h2 id="model-debug">Model Debug</h2>
<p>Raw Tweet: <code>RT @WhatTheFFacts: In his teen years, Obama has been known to use marijuana and cocaine.</code></p>
<p>Input Text: <code>In his teen years, Obama has been known to use marijuana and cocaine.</code></p>
<p>Label: <code>Negative</code></p>
<p>Predicted: <code>Negative</code></p>
<p>The states are just beautiful to watch; look at how the information is encoded!</p>
<center>
<iframe src="https://giphy.com/embed/d1USMTfNFsvrG" width="480" height="333" class="giphy-embed" allowfullscreen=""></iframe><p><a href="https://giphy.com/gifs/cat-animal-surise-d1USMTfNFsvrG"></a></p>
</center>
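<p>The plots in the next sections came from a pass along these lines. This is a hedged sketch: the tokenisation, the <code>glove</code> and <code>model</code> objects from the sketches above, and the label order are my assumptions, not the notebook's exact code:</p>
<pre class=" language-python"><code class="prism language-python">import torch

text = "In his teen years, Obama has been known to use marijuana and cocaine."
tokens = text.lower().split()                              # the notebook's cleaning may differ
sequences = glove.get_vecs_by_tokens(tokens).unsqueeze(0)  # [1, seq_len, 300]
lengths = [len(tokens)]

model.eval()
with torch.no_grad():
    scores = model(sequences, lengths, debug=True)  # debug=True plots every encoder/decoder step
print(scores.argmax(dim=-1))  # index into [Negative, Positive, Neutral] (assumed label order)
</code></pre>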
<h3 id="encoder-states">Encoder States</h3>
<p><img src="https://github.com/satyajitghana/TSAI-DeepNLP-END2.0/blob/main/06_Encoder_Decoder/encoder_outputs.png?raw=true" alt="encoder outputs"></p>
<h3 id="decoder-states">Decoder States</h3>
<p><img src="https://github.com/satyajitghana/TSAI-DeepNLP-END2.0/blob/main/06_Encoder_Decoder/decoder_outputs.png?raw=true" alt="decoder states"></p>
<h3 id="final-proj-layer-state">Final Proj Layer State</h3>
<p><img src="https://github.com/satyajitghana/TSAI-DeepNLP-END2.0/blob/main/06_Encoder_Decoder/final_output.png?raw=true" alt="proj layer state"></p>
<h2 id="misclassified">Misclassified</h2>
<pre><code>Tweet: Obama entrando en Skateboard y que si es real http://t.co/Sm9s2o9i LIKE A BOSS
Tweet Cleaned: Obama entrando en Skateboard y que si es real LIKE A BOSS
Label: Positive
Predicted: Negative
Tweet: The Writer's strike gave him the opportunity to go work for Obama's campaign
Tweet Cleaned: The Writer's strike gave him the opportunity to go work for Obama's campaign
Label: Positive
Predicted: Negative
Tweet: Great photo!! LLAP!! #trekkers RT @GuyKawasaki: Barack Obama: Vulcan? http://t.co/UejtlmGt
Tweet Cleaned: Great photo!! LLAP!! Barack Obama Vulcan?
Label: Positive
Predicted: Negative
Tweet: RT @marymauldin: OBAMA HONORS the wkwnd of Holy Passover 4 Jews & Holy Passion of Jesus by hosting MUSLIM BROTHERHOOD who wld murder both Jews & Christians
Tweet Cleaned: OBAMA HONORS the wkwnd of Holy Passover 4 Jews & Holy Passion of Jesus by hosting MUSLIM BROTHERHOOD who wld murder both Jews & Christians
Label: Neutral
Predicted: Negative
Tweet: .@Notintheface1 I thought it was more, describe Obama, but instead of using Obama's qualities, list his own. Then hope nobody noticed.
Tweet Cleaned: . I thought it was more, describe Obama, but instead of using Obama's qualities, list his own. Then hope nobody noticed.
Label: Positive
Predicted: Negative
Tweet: Obama: If all the blacks line up for me, I promise I will triple your entitlements & give you all Escalades http://t.co/f8b7HLaf
Tweet Cleaned: Obama If all the blacks line up for me, I promise I will triple your entitlements & give you all Escalades
Label: Neutral
Predicted: Positive
Tweet: RT @GatorNation41: gas was $1.92 when Obama took office...I guess he did promise he would change things http://t.co/TlfMmi0G
Tweet Cleaned: gas was $1.92 when Obama took office...I guess he did promise he would change things
Label: Positive
Predicted: Negative
Tweet: RT @1Dlover_carrots: @Harry_Styles on a scale of 1-10 how attractive is this?...and don't say michelle Obama. http://t.co/YiFq4PKT
Tweet Cleaned: on a scale of 1-10 how attractive is this?...and don't say michelle Obama.
Label: Positive
Predicted: Negative
Tweet: Liberal #SteveJobs: Obama's business killing regulations forces Apple to build in #China #cnn - http://t.co/Yd8jzwoV
Tweet Cleaned: Liberal Obama's business killing regulations forces Apple to build in -
Label: Positive
Predicted: Negative
Tweet: Saul says Pres Obama "will do anything" to distract Americans from his "failed" economic record incl unemployment & higher gas prices."
Tweet Cleaned: Saul says Pres Obama "will do anything" to distract Americans from his "failed" economic record incl unemployment & higher gas prices."
Label: Positive
Predicted: Neutral
</code></pre>
<h2 id="correct-classified">Correct Classified</h2>
<pre><code>Tweet: RT @WhatTheFFacts: In his teen years, Obama has been known to use marijuana and cocaine.
Tweet Cleaned: In his teen years, Obama has been known to use marijuana and cocaine.
Label: Negative
Predicted: Negative
Tweet: RT @Drudge_Report: Obama setting up Supreme Court as campaign issue... http://t.co/1IiLN01H
Tweet Cleaned: Obama setting up Supreme Court as campaign issue...
Label: Positive
Predicted: Positive
Tweet: RT @NatlWOW: @edshow Pres. Obama understands right from wrong! And doesn't need to flip flop around to get votes! #UniteWomen #edshow
Tweet Cleaned: Pres. Obama understands right from wrong! And doesn't need to flip flop around to get votes!
Label: Negative
Predicted: Negative
Tweet: #WhatsRomneyHiding HE WONDERING.. WHATS OBAMA HIDING????? remember the most transparent adm in history.. LMBO
Tweet Cleaned: HE WONDERING.. WHATS OBAMA HIDING????? remember the most transparent adm in history.. LMBO
Label: Negative
Predicted: Negative
Tweet: President Obama * Lindsay Lohan * 1989 RUMORS business 19 TH & M ST NW DC met field agent = multi connector to FFX VA covert overt zone.
Tweet Cleaned: President Obama * Lindsay Lohan * 1989 RUMORS business 19 TH & M ST NW DC met field agent = multi connector to FFX VA covert overt zone.
Label: Negative
Predicted: Negative
Tweet: Romney and Obama agree that Augusta National should allow women to be members? Unthinkable...and bad news for the green coats.
Tweet Cleaned: Romney and Obama agree that Augusta National should allow women to be members? Unthinkable...and bad news for the green coats.
Label: Negative
Predicted: Negative
Tweet: #WhatsRomneyHiding Obama released his tax returns since 2000, where are Romney's?
Tweet Cleaned: Obama released his tax returns since 2000, where are Romney's?
Label: Negative
Predicted: Negative
Tweet: #newbedon 4/6/2012 4:25:20 AM Obama Wins Landslide Presidential Election...With Online Gamers http://t.co/JGbJwE9Z
Tweet Cleaned: 4/6/2012 42520 AM Obama Wins Landslide Presidential Election...With Online Gamers
Label: Negative
Predicted: Negative
Tweet: Obama says knock you out -- http://t.co/PUZRq7HU #screwytees
Tweet Cleaned: Obama says knock you out --
Label: Negative
Predicted: Negative
Tweet: Top Secret Obama 2012 World War 3 Illuminati Antichrist Conspiracy!: http://t.co/iqg1xarL via @youtube
Tweet Cleaned: Top Secret Obama 2012 World War 3 Illuminati Antichrist Conspiracy! via
Label: Negative
Predicted: Negative
</code></pre>
<h2 id="thats-it-folks-">That’s it Folks !</h2>
<center>
<iframe src="https://giphy.com/embed/GypVyX5Nw0R2g" width="480" height="362" class="giphy-embed" allowfullscreen=""></iframe><p><a href="https://giphy.com/gifs/cat-funny-GypVyX5Nw0R2g"></a></p>
</center>
</div>
</body>
</html>