Performance comparison of LSTM with and without cuDNN(v5) in Chainer

Mar 15, 2017

We compare the performance of an LSTM network with and without cuDNN in Chainer. The NVIDIA CUDA® Deep Neural Network library (cuDNN) is a GPU-accelerated library of primitives for deep neural networks; it provides highly tuned implementations of standard routines such as LSTMs and CNNs.
In Chainer, the LSTM implementation is configurable to run with or without cuDNN. The LSTM (NStepLSTM) implementation can be found here:
https://github.com/pfnet/chainer/blob/master/chainer/functions/connection/n_step_lstm.py
NVIDIA’s official blog has a post related to this one; please check it as well.
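For context, here is a minimal sketch of how this switch is exposed in the Chainer v1-era API used in these experiments (the use_cudnn flag on NStepLSTM); the shapes and values below are illustrative:

import numpy as np
import chainer
import chainer.links as L

xp = chainer.cuda.cupy  # requires a CUDA-enabled GPU

# n_layers=1, in_size=128, out_size=128, dropout=0.0;
# use_cudnn selects between the cuDNN kernel and the plain CuPy path.
lstm_cudnn = L.NStepLSTM(1, 128, 128, dropout=0.0, use_cudnn=True).to_gpu()
lstm_plain = L.NStepLSTM(1, 128, 128, dropout=0.0, use_cudnn=False).to_gpu()

# NStepLSTM takes a list of (length, in_size) arrays sorted by decreasing
# length, so variable-length mini-batches need no padding.
xs = [chainer.Variable(xp.random.randn(l, 128).astype(np.float32))
      for l in (25, 20, 15)]
hx = chainer.Variable(xp.zeros((1, len(xs), 128), dtype=np.float32))
cx = chainer.Variable(xp.zeros((1, len(xs), 128), dtype=np.float32))
hy, cy, ys = lstm_cudnn(hx, cx, xs, train=True)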
SUMMARY:
Through our experiments, we found the following: we should use cuDNN
if the model is large,
if the sequence data is long,
if the sequence data has variable lengths.

We conducted experiments from the following viewpoints:
The effect of mini-batch size
The effect of the number of LSTM layers and sequence length of data
The effect of the number of LSTM layers and random sequence length of data
The effect of the number of LSTM layers and the input unit size
The effect of the number of LSTM layers and the hidden unit size
The effect of the number of LSTM layers and dropout rate
When is the difference between the cuDNN and no-cuDNN settings large?

EXPERIMENTAL RESULTS:

The effect of mini-batch size
Parameters: batchsize = {127, 128, 200, 255, 256}
In all results, batchsize 128 is faster than 127, and batchsize 256 is faster than 255, even though 127 and 255 are the smaller batch sizes. Using batchsize = 2^n provides the best performance. Note that the number of iterations per epoch is the same (39 iterations) for batchsize = {255, 256}.
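The iteration counts follow from the 10,000-sample dataset when the last partial mini-batch is dropped (an assumption consistent with the 39 iterations noted above); a quick check in Python:

# Iterations per epoch for 10,000 samples, dropping the last partial batch.
data_size = 10000
for batchsize in (127, 128, 200, 255, 256):
    print(batchsize, data_size // batchsize)
# 127 -> 78, 128 -> 78, 200 -> 50, 255 -> 39, 256 -> 39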

Comparing the settings with and without cuDNN, the cuDNN version is about 2 to 2.8 times faster in forward time, and about 1.6 times faster in backward time.


[Figure: The effect of mini-batch size]

The effect of the number of LSTM layers and sequence length
Parameters: length = {5, 25, 50}, layer = {1, 2, 3}
As the sequence length and the number of layers increase, the performance benefit from the cuDNN implementation increases.

When we use a large LSTM setting (layer=3, length=50), cuDNN is about 7 times faster in forward time and about 4 times faster in backward time.
When we use a small LSTM setting (layer=1, length=5), cuDNN is about 1.5 times faster in forward time and about 1.04 times faster in backward time.
[Figure: The effect of the number of LSTM layers and sequence length]
The effect of the number of LSTM layers and random sequence lengths
Parameters: layer = {1, 2, 3}, random = {True, False}
In this setting, we compare fixed-length sequences with variable-length sequences (maximum length 25). When cuDNN is used, the performance impact of random sequence lengths is small.
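For illustration, the two regimes can be generated like this (a sketch; we assume random lengths are drawn uniformly up to the maximum of 25):

import numpy as np

max_len, in_size, batchsize = 25, 128, 128
fixed_lengths = [max_len] * batchsize                          # random=False
random_lengths = np.random.randint(1, max_len + 1, batchsize)  # random=True

# One mini-batch as NStepLSTM consumes it: a list of per-sequence arrays
# sorted by decreasing length, so no padding is required.
xs = [np.random.randn(l, in_size).astype(np.float32)
      for l in sorted(random_lengths, reverse=True)]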

[Figure: The effect of the number of LSTM layers and random sequence lengths]

The effect of the number of LSTM layers and the input unit size
Parameters: layer = {1, 2, 3}, input = {128, 256}
As the number of layers increases, the difference between the cuDNN and no-cuDNN settings grows (about 5 times faster in forward time, and about 2.7 times faster in backward time).

[Figure: The effect of the number of LSTM layers and the input unit size]

The effect of the number of LSTM layers and the hidden unit size
Parameters: layer = {1, 2, 3}, hidden = {128, 256}
As the number of layers increases, the difference between the cuDNN and no-cuDNN settings grows (the same trend as in the input unit size experiment). However, as the hidden unit size increases, the difference between cuDNN and no-cuDNN shrinks.

[Figure: The effect of the number of LSTM layers and the hidden unit size]
The effect of the number of LSTM layers and the dropout rate
Parameters: layer = {1, 2, 3}, dropout = {0.5, 0.0}
With cuDNN, enabling dropout (rate 0.5) makes training slightly slower, but the difference is very small.

[Figure: The effect of the number of LSTM layers and the dropout rate]

When is the difference between the cuDNN and no-cuDNN settings large?

When the batch size is small (batchsize=128), the sequence length is long (length=50), and the number of layers is large (layer=3), the difference is large (cuDNN is much faster than the no-cuDNN setting).
With the largest LSTM setting in our experiments, the performance benefit of cuDNN is greatest: 7.8 times faster in forward time and 4.0 times faster in backward time.
EXPERIMENTAL ENVIRONMENT:
GPU: GeForce GTX 970
Chainer (v1.21)
cuDNN v5.1 (CUDA v8)

EXPERIMENTAL SETTING:
Data: Random artificial sequence data (data size: 10,000)
Training epochs: 10
We compare the average time per epoch.
Performance time of:
forward time (for train data)
forward time (for test data)
backward time
Default experiment setting:
batchsize: 128
sequence length: 25
random length: 0 (fixed length)
layer size: 1
input unit size: 128
hidden unit size: 128
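For reference, a minimal sketch of the measurement loop (the helper name time_epoch is illustrative; the actual script is in the repository linked below):

import time

def time_epoch(model, optimizer, batches):
    # Accumulate forward and backward wall-clock time over one epoch.
    # For precise GPU timing, the device should be synchronized before
    # each time.time() call.
    forward_total = backward_total = 0.0
    for xs, ts in batches:
        start = time.time()
        loss = model(xs, ts)          # forward pass
        forward_total += time.time() - start

        start = time.time()
        model.zerograds()
        loss.backward()               # backward pass
        optimizer.update()
        backward_total += time.time() - start
    return forward_total, backward_total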

The code for our experiments:
https://github.com/aonotas/test-chainer-performance
