Recurrent Neural Networks¶

For 2D data such as images, convolutional layers exploit the inherent local correlations to improve performance and remove dependence on absolute location.

Sequential data is also often of interest: time series, text, audio.

For these, recurrent neural networks have proven to be an effective technique.

In a recurrent neural network, the input to a recurrent layer comes both from the layer below and from the layer's own internal state, carried over from previous passes.
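As a minimal sketch of this update (illustrative numpy code; the weight names and sizes are made up, not from the notebook):

import numpy as np

def rnn_step(x_t, h_prev, W_x, W_h, b):
    # The new hidden state mixes the current input with the state
    # carried over from the previous pass.
    return np.tanh(x_t @ W_x + h_prev @ W_h + b)

rng = np.random.default_rng(0)
W_x = rng.normal(size=(8, 16))   # 8 input features -> 16 hidden units
W_h = rng.normal(size=(16, 16))  # hidden-to-hidden recurrence
b = np.zeros(16)

h = np.zeros(16)                       # initial internal state
for x_t in rng.normal(size=(5, 8)):    # a sequence of 5 observations
    h = rnn_step(x_t, h, W_x, W_h, b)  # state persists across steps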


Through the loop structure, an RNN is able to incorporate memory. As the network traverses sequential data, it can draw on previous observations to influence how it handles the next observation.

RNNs can be difficult to train: gradients through the unrolled network tend to vanish or explode, and the network can lose track of long-distance dependencies.

A commonly used variant is the Long Short-Term Memory (LSTM) layer, which incorporates

  • A running internal cell state carried along the sequence
  • Gates that selectively forget, update, and expose parts of that state, depending on the input (see the sketch below)
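A minimal sketch of one LSTM step (illustrative numpy code, not the actual layer implementation; the parameter names and the stacked layout are assumptions):

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x_t, h_prev, c_prev, W, U, b):
    # W, U, b stack the parameters of the four internal transforms:
    # forget gate, input gate, output gate, and candidate state.
    f, i, o, g = np.split(x_t @ W + h_prev @ U + b, 4)
    f, i, o = sigmoid(f), sigmoid(i), sigmoid(o)  # gate values in (0, 1)
    c = f * c_prev + i * np.tanh(g)  # selectively keep / overwrite state
    h = o * np.tanh(c)               # selectively expose state as output
    return h, c

n_in, n_h = 8, 16
rng = np.random.default_rng(1)
W = rng.normal(size=(n_in, 4 * n_h))
U = rng.normal(size=(n_h, 4 * n_h))
b = np.zeros(4 * n_h)
h, c = lstm_step(rng.normal(size=n_in), np.zeros(n_h), np.zeros(n_h), W, U, b)

The forget gate is what lets the layer selectively ignore parts of its state, and the additive update of the cell state is what helps information and gradients survive over long distances.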

LSTM¶

In [3]:
plot(data["Temp"])
Out[3]:
(line plot of the temperature series in data["Temp"])

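The cell above plots a temperature series from a DataFrame data. As a sketch of how an LSTM could be trained on such a series with tf.keras (an illustration under that assumption, not necessarily the notebook's actual model; the window size and layer widths are made up):

import numpy as np
import tensorflow as tf

temps = data["Temp"].to_numpy(dtype="float32")

window = 30  # predict the next value from the last 30
X = np.stack([temps[i:i + window] for i in range(len(temps) - window)])
y = temps[window:]
X = X[..., None]  # shape (samples, timesteps, 1 feature)

model = tf.keras.Sequential([
    tf.keras.layers.LSTM(32, input_shape=(window, 1)),
    tf.keras.layers.Dense(1),  # regression on the next value
])
model.compile(optimizer="adam", loss="mse")
model.fit(X, y, epochs=5, batch_size=64)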
Use case: Translation¶

In a sequence-to-sequence setup, an encoder RNN reads the source sentence into an internal representation, from which a decoder RNN generates the sentence in the target language.

Use case: Text generation¶

Train a network to predict the next character from a sequence of recent characters. Depending on the training corpus, very different behaviours can be created.
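A minimal sketch of such a character-level model in tf.keras (illustrative; the corpus string text, the window length, and the layer sizes are assumptions):

import numpy as np
import tensorflow as tf

chars = sorted(set(text))              # character vocabulary of the corpus
idx = {c: i for i, c in enumerate(chars)}
seq_len = 40                           # context of 40 recent characters

# Training pairs: seq_len recent characters -> the following character.
X = np.array([[idx[c] for c in text[i:i + seq_len]]
              for i in range(len(text) - seq_len)])
y = np.array([idx[text[i + seq_len]] for i in range(len(text) - seq_len)])

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(len(chars), 32),
    tf.keras.layers.LSTM(128),
    tf.keras.layers.Dense(len(chars), activation="softmax"),
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")
model.fit(X, y, epochs=10, batch_size=128)

Repeatedly sampling a character from the predicted distribution and feeding it back in produces text in the style of the corpus, for example: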

Shakespeare:

PANDARUS:
Alas, I think he shall be come approached and the day
When little srain would be attain'd into being never fed,
And who is but a chain and subjects of his death,
I should not sleep.

Second Senator:
They are away this miseries, produced upon my soul,
Breaking and strongly should be buried, when I perish
The earth and thoughts of many states.

DUKE VINCENTIO:
Well, your wit is in the care of side and that.

Second Lord:
They would be ruled after this chamber, and
my fair nues begun out of the fact, to be conveyed,
Whose noble souls I'll have the heart of the wars.

Clown:
Come, sir, I will make did behold your worship.

VIOLA:
I'll drink it.


Wikipedia:

Naturalism and decision for the majority of Arab countries' capitalide was grounded by the Irish language by [[John Clair]], [[An Imperial Japanese Revolt]], associated with Guangzham's sovereignty. His generals were the powerful ruler of the Portugal in the [[Protestant Immineners]], which could be said to be directly in Cantonese Communication, which followed a ceremony and set inspired prison, training. The emperor travelled back to [[Antioch, Perth, October 25|21]] to note, the Kingdom of Costa Rica, unsuccessful fashioned the [[Thrales]], [[Cynth's Dajoard]], known in western [[Scotland]], near Italy to the conquest of India with the conflict. Copyright was the succession of independence in the slop of Syrian influence that was a famous German movement based on a more popular servicious, non-doctrinal and sexual power post. Many governments recognize the military housing of the [[Civil Liberalization and Infantry Resolution 265 National Party in Hungary]], that is sympathetic to be to the [[Punjab Resolution]] (PJS)[http://www.humah.yahoo.com/guardian.cfm/7754800786d17551963s89.htm Official economics Adjoint for the Nazism, Montgomery was swear to advance to the resources for those Socialism's rule, was starting to signing a major tripad of aid exile.]]


(almost) valid LaTeX: the generated sample is not reproduced here.


Linux Source Code:

/*
 * If this error is set, we will need anything right after that BSD.
 */
static void action_new_function(struct s_stat_info *wb)
{
  unsigned long flags;
  int lel_idx_bit = e->edd, *sys & ~((unsigned long) *FIRST_COMPAT);
  buf[0] = 0xFFFFFFFF & (bit << 4);
  min(inc, slist->bytes);
  printk(KERN_WARNING "Memory allocated %02x/%02x, "
    "original MLL instead\n"),
    min(min(multi_run - s->len, max) * num_data_in),
    frame_pos, sz + first_seg);
  div_u64_w(val, inb_p);
  spin_unlock(&disk->queue_lock);
  mutex_unlock(&s->sock->mutex);
  mutex_unlock(&func->mutex);
  return disassemble(info->pending_bh);
}

Use case: Text generation¶

It can be enlightening to sample from intermediate stages of training. Here are samples from a network trained on Tolstoy's War and Peace:
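Such samples can be produced with a simple loop (a sketch reusing the model, chars, idx, and seq_len from the earlier sketch; checkpoints saved at different epochs are sampled the same way):

import numpy as np

def sample_text(model, seed, n=200):
    # Generate n characters by predicting the next one and feeding
    # it back into the context window. seed is a non-empty string.
    out = seed
    for _ in range(n):
        context = np.array([[idx[c] for c in out[-seq_len:]]])
        probs = model.predict(context, verbose=0)[0].astype("float64")
        probs /= probs.sum()  # renormalize against float error
        out += chars[np.random.choice(len(chars), p=probs)]
    return out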

Epoch 100:

tyntd-iafhatawiaoihrdemot lytdws e ,tfti, astai f ogoh eoase rrranbyne 'nhthnee e plia tklrgd t o idoe ns,smtt h ne etie h,hregtrs nigtike,aoaenns lng

Epoch 300:

"Tmont thithey" fomesscerliund Keushey. Thom here sheulke, anmerenith ol sivh I lalterthend Bleipile shuwy fil on aseterlome coaniogennc Phe lism thond hon at. MeiDimorotion in ther thize."

Epoch 500:

we counter. He stutn co des. His stanted out one ofler that concossions and was to gearang reay Jotrets and with fre colt otf paitt thin wall. Which das stimn


Epoch 700:

Aftair fall unsuch that the hall for Prince Velzonski's that me of her hearly, and behs to so arwage fiving were to it beloge, pavu say falling misfort how, and Gogition is so overelical and ofter.

Epoch 1200:

"Kite vouch!" he repeated by her door. "But I would be done and quarts, feeling, then, son is people...."

Epoch 2000:

"Why do what that day," replied Natasha, and wishing to himself the fact the princess, Princess Mary was easier, fed in had oftened him. Pierre aking his soul came to the packs and drove up his father-in-law women.