I keep servers in an IDC running my app servers, and a spare box recently freed up (the catch: a RAM driver issue keeps it from booting anything past CentOS 6.4). I've been thinking about sticking a GPU in it and using it as my research server. Since it's a 1U server, fitting a GPU inside the case is out of the question, so I figured I'd use a riser (a PCIe extension cable) to mount the GPU outside the cramped chassis. Looking around, PCIe risers do exist, but nearly all of them are the x16-to-x1 type made for mining Bitcoin and other cryptocurrencies. The server does have an x16 lane I wanted to use, but I'd heard that the GPGPU performance gap between x1 and x16 isn't that big, so I decided to test it myself.
First, my machine's specs:
CPU: Intel i7-5820K
RAM: DDR4 16GB
GPU: GTX970
HDD: SATA2 Seagate 320GB (probably manufactured around 2008)
PSU: SuperFlower SF-600R14SE Silver Green FX
OS: Ubuntu 16.04.1
Kernel: Linux 4.4.0-34-generic
Software: neural-style, Torch7, CUDA 7.5, cuDNN (I don't remember the version)
For reference, during the tests I had three monitors on (DVI x 2, DP x 1), Chrome open, and three terminal windows: one idle (just there for saving logs), one running neural-style, and one monitoring the GPU with the command below.
watch -n 1 nvidia-smi
* The riser (x1) test was run around 12 PM and the no-riser (x16) test around 8 PM. nvidia-smi showed the GPU throttling; during the tests below, GPU utilization hovered around 95% with the riser and 97–100% without it. My room has no air conditioning, only a fan, so the ambient temperature shouldn't have changed much between the two sessions, and I don't think it significantly affected the results.
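If you want to record utilization instead of eyeballing `watch`, nvidia-smi's CSV query output is easy to parse. A minimal sketch; the sample lines below are illustrative, not captured from my actual runs:

```python
# Parse CSV lines as produced by, e.g.:
#   nvidia-smi --query-gpu=utilization.gpu,temperature.gpu --format=csv,noheader -l 1
def parse_smi_line(line):
    """Return (utilization %, temperature °C) from one CSV line."""
    util, temp = [f.strip() for f in line.split(",")]
    return int(util.rstrip(" %")), int(temp)

# Illustrative samples (made up for this sketch)
samples = ["95 %, 74", "96 %, 75", "95 %, 75"]
utils = [parse_smi_line(s)[0] for s in samples]
print(sum(utils) / len(utils))  # average utilization over the sampled lines
```

Redirecting that query's output to a file gives a log you can average over a whole run.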
The riser I used for the comparison is the one in the link below. I actually bought it to try Ethereum mining, but I never got around to mining, so the plan was: if it performs well in this comparison, it goes into the IDC server. (Impressively, the riser arrived before the earphones I had ordered from AliExpress a full week earlier.)
The riser installed, with the graphics card placed outside the case.
Then I ran the test. As mentioned above, it runs the Torch7-based neural-style on the GPU, with the following command:
time th neural_style.lua -style_image examples/inputs/picasso_selfport1907.jpg -content_image examples/inputs/brad_pitt.jpg -gpu 0
I ran the test three times per configuration; you can check the full logs under the "more" section at the bottom. Oddly, running Torch right after boot has a slight delay, presumably from initialization, so in both configurations I did one quick warm-up run first. (That warm-up run is excluded; only the runs after it are reflected in the table below.)
The results are as follows. I measured the real time reported by the time command.
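The real times can be pulled straight from the logs at the bottom. A quick sketch of converting time's `XmY.ZZZs` format to seconds and averaging; the three values are the real times from the first set of logged runs:

```python
import re

def to_seconds(t):
    """Convert a time(1) real value like '4m31.962s' to seconds."""
    m, s = re.match(r"(\d+)m([\d.]+)s", t).groups()
    return int(m) * 60 + float(s)

# real times from the first three logged runs below
runs = ["4m31.962s", "4m22.650s", "4m39.655s"]
avg = sum(to_seconds(r) for r in runs) / len(runs)
print(round(avg, 1))  # average real time in seconds
```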
Hmm... just as I'd heard, the performance difference between x16 and x1 really isn't big. Unless I can get hold of an x16 riser, I'll just put the one I have into the IDC server and use it as my research server. Works for me.
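A rough back-of-the-envelope suggests why the gap is small: neural-style pushes the model and images across the bus essentially once, then stays compute-bound on the GPU. Assuming roughly 500 MB/s of effective bandwidth per lane (a PCIe 2.0-class figure; mining risers typically don't sustain PCIe 3.0 speeds, and these numbers are approximations, not measurements):

```python
# Size of the VGG-19 caffemodel, taken from the libprotobuf line in the logs
model_bytes = 574_671_192

# Approximate effective per-lane bandwidth (PCIe 2.0-class; an assumption)
lane_bytes_s = 500e6
x1_s = model_bytes / (1 * lane_bytes_s)
x16_s = model_bytes / (16 * lane_bytes_s)
print(f"x1: ~{x1_s:.2f}s, x16: ~{x16_s:.2f}s")  # one-time transfer cost
```

Roughly a one-second difference on a one-time transfer, against runs of about four and a half minutes, which is consistent with the measurements above.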
P.S. I wanted a nice-looking chart, so I looked around and ended up using the Google Chart module. Making the chart with it ate up about an hour and a half of the time spent writing this post...
#1
taewoo@WORKSTATION-linux:~/Git/neural-style$ time th neural_style.lua -style_image examples/inputs/picasso_selfport1907.jpg -content_image examples/inputs/brad_pitt.jpg -gpu 0
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:537] Reading dangerously large protocol message. If the message turns out to be larger than 1073741824 bytes, parsing will be halted for security reasons. To increase the limit (or to disable these warnings), see CodedInputStream::SetTotalBytesLimit() in google/protobuf/io/coded_stream.h.
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:78] The total number of bytes read was 574671192
Successfully loaded models/VGG_ILSVRC_19_layers.caffemodel
conv1_1: 64 3 3 3
conv1_2: 64 64 3 3
conv2_1: 128 64 3 3
conv2_2: 128 128 3 3
conv3_1: 256 128 3 3
conv3_2: 256 256 3 3
conv3_3: 256 256 3 3
conv3_4: 256 256 3 3
conv4_1: 512 256 3 3
conv4_2: 512 512 3 3
conv4_3: 512 512 3 3
conv4_4: 512 512 3 3
conv5_1: 512 512 3 3
conv5_2: 512 512 3 3
conv5_3: 512 512 3 3
conv5_4: 512 512 3 3
fc6: 1 1 25088 4096
fc7: 1 1 4096 4096
fc8: 1 1 4096 1000
Setting up style layer 2 : relu1_1
Setting up style layer 7 : relu2_1
Setting up style layer 12 : relu3_1
Setting up style layer 21 : relu4_1
Setting up content layer 23 : relu4_2
Setting up style layer 30 : relu5_1
WARNING: Skipping content loss
Running optimization with L-BFGS
<optim.lbfgs> creating recyclable direction/step/history buffers
Iteration 50 / 1000
Content 1 loss: 1804182.343750
Style 1 loss: 26072.180176
Style 2 loss: 297269.848633
Style 3 loss: 118508.483887
Style 4 loss: 586565.478516
Style 5 loss: 1485.986900
Total loss: 2834084.321861
Iteration 100 / 1000
Content 1 loss: 1198984.609375
Style 1 loss: 11620.404053
Style 2 loss: 70801.934814
Style 3 loss: 39886.474609
Style 4 loss: 258139.355469
Style 5 loss: 1673.295021
Total loss: 1581106.073341
Iteration 150 / 1000
Content 1 loss: 1024316.640625
Style 1 loss: 6056.498337
Style 2 loss: 39176.739502
Style 3 loss: 23663.417053
Style 4 loss: 217640.747070
Style 5 loss: 1777.299309
Total loss: 1312631.341896
Iteration 200 / 1000
Content 1 loss: 948938.750000
Style 1 loss: 4011.000824
Style 2 loss: 28392.517090
Style 3 loss: 17255.686951
Style 4 loss: 205114.721680
Style 5 loss: 1836.327171
Total loss: 1205549.003716
Iteration 250 / 1000
Content 1 loss: 904334.687500
Style 1 loss: 3091.413689
Style 2 loss: 22002.755737
Style 3 loss: 13823.744202
Style 4 loss: 201076.892090
Style 5 loss: 1896.784592
Total loss: 1146226.277809
Iteration 300 / 1000
Content 1 loss: 876206.640625
Style 1 loss: 2508.151627
Style 2 loss: 17643.127441
Style 3 loss: 11919.272614
Style 4 loss: 199310.803223
Style 5 loss: 1944.652939
Total loss: 1109532.648468
Iteration 350 / 1000
Content 1 loss: 858182.031250
Style 1 loss: 2093.920326
Style 2 loss: 15001.098633
Style 3 loss: 10783.903503
Style 4 loss: 197883.654785
Style 5 loss: 1967.825508
Total loss: 1085912.434006
Iteration 400 / 1000
Content 1 loss: 845832.187500
Style 1 loss: 1751.835632
Style 2 loss: 13327.618408
Style 3 loss: 10033.928680
Style 4 loss: 197321.850586
Style 5 loss: 1989.537621
Total loss: 1070256.958427
Iteration 450 / 1000
Content 1 loss: 836965.781250
Style 1 loss: 1450.771332
Style 2 loss: 12292.948151
Style 3 loss: 9567.194366
Style 4 loss: 197154.028320
Style 5 loss: 1993.361092
Total loss: 1059424.084511
Iteration 500 / 1000
Content 1 loss: 831347.734375
Style 1 loss: 1195.457649
Style 2 loss: 11631.801605
Style 3 loss: 9257.281494
Style 4 loss: 196513.488770
Style 5 loss: 1997.183609
Total loss: 1051942.947502
Iteration 550 / 1000
Content 1 loss: 826463.984375
Style 1 loss: 985.547638
Style 2 loss: 11294.330597
Style 3 loss: 9000.524139
Style 4 loss: 196460.961914
Style 5 loss: 2000.449944
Total loss: 1046205.798607
Iteration 600 / 1000
Content 1 loss: 823371.171875
Style 1 loss: 829.011536
Style 2 loss: 11031.163788
Style 3 loss: 8864.753723
Style 4 loss: 196003.063965
Style 5 loss: 2005.696106
Total loss: 1042104.860992
Iteration 650 / 1000
Content 1 loss: 821019.140625
Style 1 loss: 705.420876
Style 2 loss: 10862.005615
Style 3 loss: 8740.238190
Style 4 loss: 195652.783203
Style 5 loss: 2002.590561
Total loss: 1038982.179070
Iteration 700 / 1000
Content 1 loss: 819127.109375
Style 1 loss: 614.577723
Style 2 loss: 10756.799316
Style 3 loss: 8657.824707
Style 4 loss: 195484.399414
Style 5 loss: 2004.880142
Total loss: 1036645.590677
Iteration 750 / 1000
Content 1 loss: 817841.328125
Style 1 loss: 551.129532
Style 2 loss: 10750.648499
Style 3 loss: 8591.318512
Style 4 loss: 195288.854980
Style 5 loss: 2004.403877
Total loss: 1035027.683525
Iteration 800 / 1000
Content 1 loss: 816612.890625
Style 1 loss: 496.374846
Style 2 loss: 10756.934357
Style 3 loss: 8553.766632
Style 4 loss: 195206.445312
Style 5 loss: 2003.044891
Total loss: 1033629.456663
Iteration 850 / 1000
Content 1 loss: 815736.093750
Style 1 loss: 455.542135
Style 2 loss: 10798.479462
Style 3 loss: 8505.958557
Style 4 loss: 195011.621094
Style 5 loss: 2000.911903
Total loss: 1032508.606901
Iteration 900 / 1000
Content 1 loss: 814893.750000
Style 1 loss: 423.602390
Style 2 loss: 10798.946381
Style 3 loss: 8478.382111
Style 4 loss: 194969.238281
Style 5 loss: 2001.748657
Total loss: 1031565.667820
Iteration 950 / 1000
Content 1 loss: 814202.187500
Style 1 loss: 399.035096
Style 2 loss: 10813.613129
Style 3 loss: 8460.771179
Style 4 loss: 194879.394531
Style 5 loss: 2004.592323
Total loss: 1030759.593759
Iteration 1000 / 1000
Content 1 loss: 813637.656250
Style 1 loss: 378.980017
Style 2 loss: 10813.954926
Style 3 loss: 8443.388367
Style 4 loss: 194776.232910
Style 5 loss: 2003.505325
Total loss: 1030053.717794
<optim.lbfgs> reached max number of iterations
real 4m31.962s
user 1m28.960s
sys 3m4.224s
#2
taewoo@WORKSTATION-linux:~/Git/neural-style$ time th neural_style.lua -style_image examples/inputs/picasso_selfport1907.jpg -content_image examples/inputs/brad_pitt.jpg -gpu 0
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:537] Reading dangerously large protocol message. If the message turns out to be larger than 1073741824 bytes, parsing will be halted for security reasons. To increase the limit (or to disable these warnings), see CodedInputStream::SetTotalBytesLimit() in google/protobuf/io/coded_stream.h.
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:78] The total number of bytes read was 574671192
Successfully loaded models/VGG_ILSVRC_19_layers.caffemodel
conv1_1: 64 3 3 3
conv1_2: 64 64 3 3
conv2_1: 128 64 3 3
conv2_2: 128 128 3 3
conv3_1: 256 128 3 3
conv3_2: 256 256 3 3
conv3_3: 256 256 3 3
conv3_4: 256 256 3 3
conv4_1: 512 256 3 3
conv4_2: 512 512 3 3
conv4_3: 512 512 3 3
conv4_4: 512 512 3 3
conv5_1: 512 512 3 3
conv5_2: 512 512 3 3
conv5_3: 512 512 3 3
conv5_4: 512 512 3 3
fc6: 1 1 25088 4096
fc7: 1 1 4096 4096
fc8: 1 1 4096 1000
Setting up style layer 2 : relu1_1
Setting up style layer 7 : relu2_1
Setting up style layer 12 : relu3_1
Setting up style layer 21 : relu4_1
Setting up content layer 23 : relu4_2
Setting up style layer 30 : relu5_1
WARNING: Skipping content loss
Running optimization with L-BFGS
<optim.lbfgs> creating recyclable direction/step/history buffers
Iteration 50 / 1000
Content 1 loss: 1774874.531250
Style 1 loss: 25180.638123
Style 2 loss: 280689.013672
Style 3 loss: 116541.613770
Style 4 loss: 550859.960938
Style 5 loss: 1543.097687
Total loss: 2749688.855438
Iteration 100 / 1000
Content 1 loss: 1194524.609375
Style 1 loss: 11343.657684
Style 2 loss: 70086.804199
Style 3 loss: 38493.414307
Style 4 loss: 253914.013672
Style 5 loss: 1817.040634
Total loss: 1570179.539871
Iteration 150 / 1000
Content 1 loss: 1033792.265625
Style 1 loss: 5993.065262
Style 2 loss: 39630.352783
Style 3 loss: 22694.697571
Style 4 loss: 214628.369141
Style 5 loss: 1874.523163
Total loss: 1318613.273544
Iteration 200 / 1000
Content 1 loss: 953875.625000
Style 1 loss: 3985.498047
Style 2 loss: 28834.487915
Style 3 loss: 16518.402100
Style 4 loss: 205678.906250
Style 5 loss: 1913.185692
Total loss: 1210806.105003
Iteration 250 / 1000
Content 1 loss: 907314.218750
Style 1 loss: 3047.982025
Style 2 loss: 21769.166565
Style 3 loss: 13612.619019
Style 4 loss: 202037.744141
Style 5 loss: 1941.698074
Total loss: 1149723.428574
Iteration 300 / 1000
Content 1 loss: 880541.093750
Style 1 loss: 2506.102371
Style 2 loss: 17648.365784
Style 3 loss: 11860.801697
Style 4 loss: 200030.920410
Style 5 loss: 1961.605453
Total loss: 1114548.889465
Iteration 350 / 1000
Content 1 loss: 862408.906250
Style 1 loss: 2111.388016
Style 2 loss: 15024.301147
Style 3 loss: 10732.914734
Style 4 loss: 199479.638672
Style 5 loss: 1963.183784
Total loss: 1091720.332603
Iteration 400 / 1000
Content 1 loss: 851321.250000
Style 1 loss: 1773.161697
Style 2 loss: 13186.045837
Style 3 loss: 10083.725739
Style 4 loss: 197640.869141
Style 5 loss: 1968.438721
Total loss: 1075973.491135
Iteration 450 / 1000
Content 1 loss: 841806.250000
Style 1 loss: 1487.854862
Style 2 loss: 12160.719299
Style 3 loss: 9616.334534
Style 4 loss: 197616.931152
Style 5 loss: 1979.572296
Total loss: 1064667.662144
Iteration 500 / 1000
Content 1 loss: 835579.218750
Style 1 loss: 1247.486687
Style 2 loss: 11483.181000
Style 3 loss: 9304.618073
Style 4 loss: 196834.777832
Style 5 loss: 1985.949135
Total loss: 1056435.231476
Iteration 550 / 1000
Content 1 loss: 830368.593750
Style 1 loss: 1037.822342
Style 2 loss: 11031.617737
Style 3 loss: 9059.342194
Style 4 loss: 196492.834473
Style 5 loss: 1988.527107
Total loss: 1049978.737602
Iteration 600 / 1000
Content 1 loss: 826952.421875
Style 1 loss: 879.428196
Style 2 loss: 10732.587433
Style 3 loss: 8900.198364
Style 4 loss: 196125.878906
Style 5 loss: 1988.574028
Total loss: 1045579.088802
Iteration 650 / 1000
Content 1 loss: 824132.968750
Style 1 loss: 762.875557
Style 2 loss: 10613.999176
Style 3 loss: 8772.058868
Style 4 loss: 195842.285156
Style 5 loss: 1996.232414
Total loss: 1042120.419922
Iteration 700 / 1000
Content 1 loss: 821954.687500
Style 1 loss: 664.649487
Style 2 loss: 10498.056793
Style 3 loss: 8673.876953
Style 4 loss: 195554.553223
Style 5 loss: 1997.237206
Total loss: 1039343.061161
Iteration 750 / 1000
Content 1 loss: 820281.328125
Style 1 loss: 594.624805
Style 2 loss: 10469.007874
Style 3 loss: 8618.489075
Style 4 loss: 195379.443359
Style 5 loss: 1999.764442
Total loss: 1037342.657681
Iteration 800 / 1000
Content 1 loss: 819193.359375
Style 1 loss: 538.523626
Style 2 loss: 10455.603027
Style 3 loss: 8550.291443
Style 4 loss: 195016.577148
Style 5 loss: 2003.457642
Total loss: 1035757.812262
Iteration 850 / 1000
Content 1 loss: 818018.984375
Style 1 loss: 499.629784
Style 2 loss: 10483.760071
Style 3 loss: 8481.781769
Style 4 loss: 195068.542480
Style 5 loss: 2004.603577
Total loss: 1034557.302055
Iteration 900 / 1000
Content 1 loss: 817074.375000
Style 1 loss: 465.078545
Style 2 loss: 10468.334198
Style 3 loss: 8442.778778
Style 4 loss: 194996.276855
Style 5 loss: 2003.766632
Total loss: 1033450.610008
Iteration 950 / 1000
Content 1 loss: 816341.406250
Style 1 loss: 439.767027
Style 2 loss: 10492.218018
Style 3 loss: 8406.150818
Style 4 loss: 194885.900879
Style 5 loss: 2007.225037
Total loss: 1032572.668028
Iteration 1000 / 1000
Content 1 loss: 815634.296875
Style 1 loss: 419.553661
Style 2 loss: 10514.499664
Style 3 loss: 8378.218842
Style 4 loss: 194869.873047
Style 5 loss: 2008.002472
Total loss: 1031824.444561
<optim.lbfgs> reached max number of iterations
real 4m22.650s
user 1m24.372s
sys 3m0.056s
#3
taewoo@WORKSTATION-linux:~/Git/neural-style$ time th neural_style.lua -style_image examples/inputs/picasso_selfport1907.jpg -content_image examples/inputs/brad_pitt.jpg -gpu 0
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:537] Reading dangerously large protocol message. If the message turns out to be larger than 1073741824 bytes, parsing will be halted for security reasons. To increase the limit (or to disable these warnings), see CodedInputStream::SetTotalBytesLimit() in google/protobuf/io/coded_stream.h.
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:78] The total number of bytes read was 574671192
Successfully loaded models/VGG_ILSVRC_19_layers.caffemodel
conv1_1: 64 3 3 3
conv1_2: 64 64 3 3
conv2_1: 128 64 3 3
conv2_2: 128 128 3 3
conv3_1: 256 128 3 3
conv3_2: 256 256 3 3
conv3_3: 256 256 3 3
conv3_4: 256 256 3 3
conv4_1: 512 256 3 3
conv4_2: 512 512 3 3
conv4_3: 512 512 3 3
conv4_4: 512 512 3 3
conv5_1: 512 512 3 3
conv5_2: 512 512 3 3
conv5_3: 512 512 3 3
conv5_4: 512 512 3 3
fc6: 1 1 25088 4096
fc7: 1 1 4096 4096
fc8: 1 1 4096 1000
Setting up style layer 2 : relu1_1
Setting up style layer 7 : relu2_1
Setting up style layer 12 : relu3_1
Setting up style layer 21 : relu4_1
Setting up content layer 23 : relu4_2
Setting up style layer 30 : relu5_1
WARNING: Skipping content loss
Running optimization with L-BFGS
<optim.lbfgs> creating recyclable direction/step/history buffers
Iteration 50 / 1000
Content 1 loss: 1761753.906250
Style 1 loss: 25287.608337
Style 2 loss: 279328.173828
Style 3 loss: 118774.487305
Style 4 loss: 556809.277344
Style 5 loss: 1574.809742
Total loss: 2743528.262806
Iteration 100 / 1000
Content 1 loss: 1182324.296875
Style 1 loss: 11402.853394
Style 2 loss: 69650.585938
Style 3 loss: 39340.750122
Style 4 loss: 259656.225586
Style 5 loss: 1887.931824
Total loss: 1564262.643738
Iteration 150 / 1000
Content 1 loss: 1019987.734375
Style 1 loss: 6093.116760
Style 2 loss: 39888.647461
Style 3 loss: 23587.405396
Style 4 loss: 217462.792969
Style 5 loss: 2015.901184
Total loss: 1309035.598145
Iteration 200 / 1000
Content 1 loss: 945157.343750
Style 1 loss: 3989.292526
Style 2 loss: 28231.042480
Style 3 loss: 16980.914307
Style 4 loss: 204685.412598
Style 5 loss: 1995.962906
Total loss: 1201039.968567
Iteration 250 / 1000
Content 1 loss: 904805.312500
Style 1 loss: 3090.190125
Style 2 loss: 21773.545837
Style 3 loss: 14132.044983
Style 4 loss: 200520.361328
Style 5 loss: 2002.595520
Total loss: 1146324.050293
Iteration 300 / 1000
Content 1 loss: 879279.921875
Style 1 loss: 2552.487946
Style 2 loss: 17650.587463
Style 3 loss: 12324.729156
Style 4 loss: 198691.101074
Style 5 loss: 1998.982620
Total loss: 1112497.810135
Iteration 350 / 1000
Content 1 loss: 862677.812500
Style 1 loss: 2128.585434
Style 2 loss: 14920.870972
Style 3 loss: 11168.123627
Style 4 loss: 197176.892090
Style 5 loss: 2019.308090
Total loss: 1090091.592712
Iteration 400 / 1000
Content 1 loss: 850425.234375
Style 1 loss: 1778.481483
Style 2 loss: 13225.325012
Style 3 loss: 10348.283386
Style 4 loss: 196933.532715
Style 5 loss: 2023.924065
Total loss: 1074734.781036
Iteration 450 / 1000
Content 1 loss: 841917.734375
Style 1 loss: 1473.486519
Style 2 loss: 11935.565948
Style 3 loss: 9877.096558
Style 4 loss: 196288.562012
Style 5 loss: 2025.212097
Total loss: 1063517.657509
Iteration 500 / 1000
Content 1 loss: 836202.812500
Style 1 loss: 1229.069328
Style 2 loss: 11319.752502
Style 3 loss: 9557.929993
Style 4 loss: 195492.712402
Style 5 loss: 2033.366776
Total loss: 1055835.643501
Iteration 550 / 1000
Content 1 loss: 832063.750000
Style 1 loss: 1050.545311
Style 2 loss: 10994.984436
Style 3 loss: 9315.908051
Style 4 loss: 195270.397949
Style 5 loss: 2032.663155
Total loss: 1050728.248901
Iteration 600 / 1000
Content 1 loss: 828718.359375
Style 1 loss: 902.553940
Style 2 loss: 10808.340454
Style 3 loss: 9189.046478
Style 4 loss: 195338.623047
Style 5 loss: 2035.297012
Total loss: 1046992.220306
Iteration 650 / 1000
Content 1 loss: 826589.609375
Style 1 loss: 787.669420
Style 2 loss: 10693.334961
Style 3 loss: 9089.991760
Style 4 loss: 195035.144043
Style 5 loss: 2037.417030
Total loss: 1044233.166590
Iteration 700 / 1000
Content 1 loss: 824686.796875
Style 1 loss: 693.629313
Style 2 loss: 10608.457947
Style 3 loss: 9021.148682
Style 4 loss: 194985.095215
Style 5 loss: 2039.324951
Total loss: 1042034.452982
Iteration 750 / 1000
Content 1 loss: 822703.515625
Style 1 loss: 621.417093
Style 2 loss: 10597.530365
Style 3 loss: 8947.353363
Style 4 loss: 195422.351074
Style 5 loss: 2044.232941
Total loss: 1040336.400461
Iteration 800 / 1000
Content 1 loss: 821585.078125
Style 1 loss: 565.353537
Style 2 loss: 10590.690613
Style 3 loss: 8891.757202
Style 4 loss: 195093.957520
Style 5 loss: 2044.330597
Total loss: 1038771.167593
Iteration 850 / 1000
Content 1 loss: 820346.484375
Style 1 loss: 520.988417
Style 2 loss: 10593.524933
Style 3 loss: 8851.898193
Style 4 loss: 195220.495605
Style 5 loss: 2046.004295
Total loss: 1037579.395819
Iteration 900 / 1000
Content 1 loss: 819439.296875
Style 1 loss: 489.326048
Style 2 loss: 10619.567108
Style 3 loss: 8821.213531
Style 4 loss: 195218.457031
Style 5 loss: 2045.465469
Total loss: 1036633.326063
Iteration 950 / 1000
Content 1 loss: 818755.859375
Style 1 loss: 464.089680
Style 2 loss: 10625.397491
Style 3 loss: 8791.822052
Style 4 loss: 195127.233887
Style 5 loss: 2044.708824
Total loss: 1035809.111309
Iteration 1000 / 1000
Content 1 loss: 818057.812500
Style 1 loss: 444.171810
Style 2 loss: 10650.716400
Style 3 loss: 8766.759491
Style 4 loss: 195127.233887
Style 5 loss: 2045.172310
Total loss: 1035091.866398
<optim.lbfgs> reached max number of iterations
real 4m39.655s
user 1m8.500s
sys 3m34.000s
#1
taewoo@WORKSTATION-linux:~/Git/neural-style$ time th neural_style.lua -style_image examples/inputs/picasso_selfport1907.jpg -content_image examples/inputs/brad_pitt.jpg -gpu 0
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:537] Reading dangerously large protocol message. If the message turns out to be larger than 1073741824 bytes, parsing will be halted for security reasons. To increase the limit (or to disable these warnings), see CodedInputStream::SetTotalBytesLimit() in google/protobuf/io/coded_stream.h.
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:78] The total number of bytes read was 574671192
Successfully loaded models/VGG_ILSVRC_19_layers.caffemodel
conv1_1: 64 3 3 3
conv1_2: 64 64 3 3
conv2_1: 128 64 3 3
conv2_2: 128 128 3 3
conv3_1: 256 128 3 3
conv3_2: 256 256 3 3
conv3_3: 256 256 3 3
conv3_4: 256 256 3 3
conv4_1: 512 256 3 3
conv4_2: 512 512 3 3
conv4_3: 512 512 3 3
conv4_4: 512 512 3 3
conv5_1: 512 512 3 3
conv5_2: 512 512 3 3
conv5_3: 512 512 3 3
conv5_4: 512 512 3 3
fc6: 1 1 25088 4096
fc7: 1 1 4096 4096
fc8: 1 1 4096 1000
Setting up style layer 2 : relu1_1
Setting up style layer 7 : relu2_1
Setting up style layer 12 : relu3_1
Setting up style layer 21 : relu4_1
Setting up content layer 23 : relu4_2
Setting up style layer 30 : relu5_1
WARNING: Skipping content loss
Running optimization with L-BFGS
<optim.lbfgs> creating recyclable direction/step/history buffers
Iteration 50 / 1000
Content 1 loss: 1770470.000000
Style 1 loss: 25200.851440
Style 2 loss: 279319.580078
Style 3 loss: 114366.149902
Style 4 loss: 566877.636719
Style 5 loss: 1667.632866
Total loss: 2757901.851006
Iteration 100 / 1000
Content 1 loss: 1183926.562500
Style 1 loss: 11637.490845
Style 2 loss: 70894.787598
Style 3 loss: 39278.530884
Style 4 loss: 259315.722656
Style 5 loss: 1830.713844
Total loss: 1566883.808327
Iteration 150 / 1000
Content 1 loss: 1022770.078125
Style 1 loss: 6135.875702
Style 2 loss: 40126.321411
Style 3 loss: 23872.772217
Style 4 loss: 215668.579102
Style 5 loss: 1925.718880
Total loss: 1310499.345436
Iteration 200 / 1000
Content 1 loss: 948589.218750
Style 1 loss: 4054.824066
Style 2 loss: 29272.415161
Style 3 loss: 17253.109741
Style 4 loss: 205020.629883
Style 5 loss: 1973.143959
Total loss: 1206163.341560
Iteration 250 / 1000
Content 1 loss: 908381.406250
Style 1 loss: 3100.628281
Style 2 loss: 22342.042542
Style 3 loss: 14063.865662
Style 4 loss: 200462.548828
Style 5 loss: 1986.809731
Total loss: 1150337.301292
Iteration 300 / 1000
Content 1 loss: 881643.515625
Style 1 loss: 2533.939743
Style 2 loss: 17899.694824
Style 3 loss: 12262.798309
Style 4 loss: 199025.854492
Style 5 loss: 1996.909714
Total loss: 1115362.712708
Iteration 350 / 1000
Content 1 loss: 864568.828125
Style 1 loss: 2149.227524
Style 2 loss: 15181.250000
Style 3 loss: 11136.645508
Style 4 loss: 197624.707031
Style 5 loss: 2000.788498
Total loss: 1092661.446686
Iteration 400 / 1000
Content 1 loss: 851826.250000
Style 1 loss: 1816.370010
Style 2 loss: 13477.555847
Style 3 loss: 10388.988495
Style 4 loss: 197668.273926
Style 5 loss: 2016.307831
Total loss: 1077193.746109
Iteration 450 / 1000
Content 1 loss: 842961.875000
Style 1 loss: 1509.063244
Style 2 loss: 12325.123596
Style 3 loss: 9865.729523
Style 4 loss: 197026.635742
Style 5 loss: 2014.598083
Total loss: 1065703.025188
Iteration 500 / 1000
Content 1 loss: 836541.250000
Style 1 loss: 1240.675163
Style 2 loss: 11597.870636
Style 3 loss: 9516.201782
Style 4 loss: 196186.853027
Style 5 loss: 2017.001724
Total loss: 1057099.852333
Iteration 550 / 1000
Content 1 loss: 831430.625000
Style 1 loss: 1024.967384
Style 2 loss: 11160.164642
Style 3 loss: 9218.138885
Style 4 loss: 195863.586426
Style 5 loss: 2025.366592
Total loss: 1050722.848930
Iteration 600 / 1000
Content 1 loss: 827929.218750
Style 1 loss: 847.535706
Style 2 loss: 10932.852936
Style 3 loss: 9005.509186
Style 4 loss: 195480.139160
Style 5 loss: 2030.476189
Total loss: 1046225.731926
Iteration 650 / 1000
Content 1 loss: 824603.515625
Style 1 loss: 722.611713
Style 2 loss: 10826.539612
Style 3 loss: 8873.562622
Style 4 loss: 195730.249023
Style 5 loss: 2033.355904
Total loss: 1042789.834499
Iteration 700 / 1000
Content 1 loss: 822086.640625
Style 1 loss: 627.222776
Style 2 loss: 10770.568085
Style 3 loss: 8798.066711
Style 4 loss: 195724.450684
Style 5 loss: 2040.531158
Total loss: 1040047.480040
Iteration 750 / 1000
Content 1 loss: 819990.625000
Style 1 loss: 556.353378
Style 2 loss: 10736.837006
Style 3 loss: 8734.281921
Style 4 loss: 195790.539551
Style 5 loss: 2037.804413
Total loss: 1037846.441269
Iteration 800 / 1000
Content 1 loss: 818772.578125
Style 1 loss: 503.467751
Style 2 loss: 10738.347626
Style 3 loss: 8658.325958
Style 4 loss: 195604.064941
Style 5 loss: 2040.077782
Total loss: 1036316.862183
Iteration 850 / 1000
Content 1 loss: 817643.906250
Style 1 loss: 459.562540
Style 2 loss: 10741.659546
Style 3 loss: 8618.347931
Style 4 loss: 195547.070312
Style 5 loss: 2040.391159
Total loss: 1035050.937738
Iteration 900 / 1000
Content 1 loss: 816812.421875
Style 1 loss: 424.837399
Style 2 loss: 10740.884399
Style 3 loss: 8583.791351
Style 4 loss: 195388.598633
Style 5 loss: 2039.632416
Total loss: 1033990.166073
Iteration 950 / 1000
Content 1 loss: 816022.109375
Style 1 loss: 398.780870
Style 2 loss: 10740.989685
Style 3 loss: 8566.586304
Style 4 loss: 195350.878906
Style 5 loss: 2039.472771
Total loss: 1033118.817911
Iteration 1000 / 1000
Content 1 loss: 815321.953125
Style 1 loss: 377.406740
Style 2 loss: 10754.589081
Style 3 loss: 8542.569733
Style 4 loss: 195367.919922
Style 5 loss: 2040.253639
Total loss: 1032404.692240
<optim.lbfgs> reached max number of iterations
real 4m25.287s
user 1m28.848s
sys 2m57.688s
#2
taewoo@WORKSTATION-linux:~/Git/neural-style$ time th neural_style.lua -style_image examples/inputs/picasso_selfport1907.jpg -content_image examples/inputs/brad_pitt.jpg -gpu 0
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:537] Reading dangerously large protocol message. If the message turns out to be larger than 1073741824 bytes, parsing will be halted for security reasons. To increase the limit (or to disable these warnings), see CodedInputStream::SetTotalBytesLimit() in google/protobuf/io/coded_stream.h.
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:78] The total number of bytes read was 574671192
Successfully loaded models/VGG_ILSVRC_19_layers.caffemodel
conv1_1: 64 3 3 3
conv1_2: 64 64 3 3
conv2_1: 128 64 3 3
conv2_2: 128 128 3 3
conv3_1: 256 128 3 3
conv3_2: 256 256 3 3
conv3_3: 256 256 3 3
conv3_4: 256 256 3 3
conv4_1: 512 256 3 3
conv4_2: 512 512 3 3
conv4_3: 512 512 3 3
conv4_4: 512 512 3 3
conv5_1: 512 512 3 3
conv5_2: 512 512 3 3
conv5_3: 512 512 3 3
conv5_4: 512 512 3 3
fc6: 1 1 25088 4096
fc7: 1 1 4096 4096
fc8: 1 1 4096 1000
Setting up style layer 2 : relu1_1
Setting up style layer 7 : relu2_1
Setting up style layer 12 : relu3_1
Setting up style layer 21 : relu4_1
Setting up content layer 23 : relu4_2
Setting up style layer 30 : relu5_1
WARNING: Skipping content loss
Running optimization with L-BFGS
<optim.lbfgs> creating recyclable direction/step/history buffers
Iteration 50 / 1000
Content 1 loss: 1772370.781250
Style 1 loss: 25639.126587
Style 2 loss: 288977.197266
Style 3 loss: 121097.363281
Style 4 loss: 584248.876953
Style 5 loss: 1483.149242
Total loss: 2793816.494579
Iteration 100 / 1000
Content 1 loss: 1182193.046875
Style 1 loss: 11808.596802
Style 2 loss: 71484.185791
Style 3 loss: 37775.152588
Style 4 loss: 260741.650391
Style 5 loss: 1718.350220
Total loss: 1565720.982666
Iteration 150 / 1000
Content 1 loss: 1012518.046875
Style 1 loss: 6095.630264
Style 2 loss: 39620.596313
Style 3 loss: 22384.872437
Style 4 loss: 218369.042969
Style 5 loss: 1843.844414
Total loss: 1300832.033272
Iteration 200 / 1000
Content 1 loss: 936346.875000
Style 1 loss: 3955.977631
Style 2 loss: 28240.701294
Style 3 loss: 16076.272583
Style 4 loss: 206783.447266
Style 5 loss: 1913.689423
Total loss: 1193316.963196
Iteration 250 / 1000
Content 1 loss: 894767.421875
Style 1 loss: 3071.393776
Style 2 loss: 21802.029419
Style 3 loss: 13125.479126
Style 4 loss: 203000.610352
Style 5 loss: 1954.009628
Total loss: 1137720.944176
Iteration 300 / 1000
Content 1 loss: 871530.625000
Style 1 loss: 2533.951569
Style 2 loss: 17694.816589
Style 3 loss: 11609.832764
Style 4 loss: 199420.776367
Style 5 loss: 1958.725739
Total loss: 1104748.728027
Iteration 350 / 1000
Content 1 loss: 855184.921875
Style 1 loss: 2145.102119
Style 2 loss: 15071.836853
Style 3 loss: 10637.272644
Style 4 loss: 198276.794434
Style 5 loss: 1963.727379
Total loss: 1083279.655304
Iteration 400 / 1000
Content 1 loss: 844113.515625
Style 1 loss: 1811.988449
Style 2 loss: 13447.627258
Style 3 loss: 9967.781830
Style 4 loss: 197559.680176
Style 5 loss: 1970.382690
Total loss: 1068870.976028
Iteration 450 / 1000
Content 1 loss: 836581.250000
Style 1 loss: 1494.517899
Style 2 loss: 12317.330170
Style 3 loss: 9534.007263
Style 4 loss: 196697.412109
Style 5 loss: 1977.483559
Total loss: 1058602.000999
Iteration 500 / 1000
Content 1 loss: 830795.703125
Style 1 loss: 1218.155003
Style 2 loss: 11651.380157
Style 3 loss: 9245.436859
Style 4 loss: 196263.305664
Style 5 loss: 1990.524101
Total loss: 1051164.504910
Iteration 550 / 1000
Content 1 loss: 826486.640625
Style 1 loss: 992.049217
Style 2 loss: 11268.521881
Style 3 loss: 9048.507690
Style 4 loss: 195994.799805
Style 5 loss: 2001.094627
Total loss: 1045791.613846
Iteration 600 / 1000
Content 1 loss: 823684.921875
Style 1 loss: 826.040077
Style 2 loss: 11003.535461
Style 3 loss: 8869.055939
Style 4 loss: 195558.911133
Style 5 loss: 2004.430771
Total loss: 1041946.895256
Iteration 650 / 1000
Content 1 loss: 821524.921875
Style 1 loss: 698.187351
Style 2 loss: 10883.559418
Style 3 loss: 8742.881012
Style 4 loss: 195234.680176
Style 5 loss: 2009.388924
Total loss: 1039093.618755
Iteration 700 / 1000
Content 1 loss: 819682.968750
Style 1 loss: 597.130919
Style 2 loss: 10796.788025
Style 3 loss: 8657.701111
Style 4 loss: 195100.488281
Style 5 loss: 2012.996483
Total loss: 1036848.073568
Iteration 750 / 1000
Content 1 loss: 818287.890625
Style 1 loss: 523.622227
Style 2 loss: 10764.929962
Style 3 loss: 8591.511536
Style 4 loss: 194971.813965
Style 5 loss: 2017.439079
Total loss: 1035157.207394
Iteration 800 / 1000
Content 1 loss: 816979.062500
Style 1 loss: 468.699408
Style 2 loss: 10756.433105
Style 3 loss: 8553.033447
Style 4 loss: 194977.722168
Style 5 loss: 2018.948174
Total loss: 1033753.898802
Iteration 850 / 1000
Content 1 loss: 815905.625000
Style 1 loss: 428.766632
Style 2 loss: 10724.550629
Style 3 loss: 8512.917328
Style 4 loss: 195035.717773
Style 5 loss: 2019.717598
Total loss: 1032627.294960
Iteration 900 / 1000
Content 1 loss: 815093.906250
Style 1 loss: 397.440267
Style 2 loss: 10703.311157
Style 3 loss: 8484.098053
Style 4 loss: 194969.592285
Style 5 loss: 2019.787979
Total loss: 1031668.135991
Iteration 950 / 1000
Content 1 loss: 814480.703125
Style 1 loss: 374.551177
Style 2 loss: 10693.264008
Style 3 loss: 8445.094299
Style 4 loss: 194855.603027
Style 5 loss: 2022.425270
Total loss: 1030871.640906
Iteration 1000 / 1000
Content 1 loss: 813861.093750
Style 1 loss: 355.271554
Style 2 loss: 10717.465973
Style 3 loss: 8427.762604
Style 4 loss: 194778.356934
Style 5 loss: 2022.392082
Total loss: 1030162.342896
<optim.lbfgs> reached max number of iterations
real 4m40.817s
user 1m34.204s
sys 3m8.640s
#3
taewoo@WORKSTATION-linux:~/Git/neural-style$ time th neural_style.lua -style_image examples/inputs/picasso_selfport1907.jpg -content_image examples/inputs/brad_pitt.jpg -gpu 0
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:537] Reading dangerously large protocol message. If the message turns out to be larger than 1073741824 bytes, parsing will be halted for security reasons. To increase the limit (or to disable these warnings), see CodedInputStream::SetTotalBytesLimit() in google/protobuf/io/coded_stream.h.
[libprotobuf WARNING google/protobuf/io/coded_stream.cc:78] The total number of bytes read was 574671192
Successfully loaded models/VGG_ILSVRC_19_layers.caffemodel
conv1_1: 64 3 3 3
conv1_2: 64 64 3 3
conv2_1: 128 64 3 3
conv2_2: 128 128 3 3
conv3_1: 256 128 3 3
conv3_2: 256 256 3 3
conv3_3: 256 256 3 3
conv3_4: 256 256 3 3
conv4_1: 512 256 3 3
conv4_2: 512 512 3 3
conv4_3: 512 512 3 3
conv4_4: 512 512 3 3
conv5_1: 512 512 3 3
conv5_2: 512 512 3 3
conv5_3: 512 512 3 3
conv5_4: 512 512 3 3
fc6: 1 1 25088 4096
fc7: 1 1 4096 4096
fc8: 1 1 4096 1000
Setting up style layer 2 : relu1_1
Setting up style layer 7 : relu2_1
Setting up style layer 12 : relu3_1
Setting up style layer 21 : relu4_1
Setting up content layer 23 : relu4_2
Setting up style layer 30 : relu5_1
WARNING: Skipping content loss
Running optimization with L-BFGS
<optim.lbfgs> creating recyclable direction/step/history buffers
Iteration 50 / 1000
Content 1 loss: 1727502.031250
Style 1 loss: 25181.008911
Style 2 loss: 279645.605469
Style 3 loss: 115658.410645
Style 4 loss: 537293.261719
Style 5 loss: 1567.115211
Total loss: 2686847.433205
Iteration 100 / 1000
Content 1 loss: 1149662.812500
Style 1 loss: 11313.766479
Style 2 loss: 66863.574219
Style 3 loss: 37709.082031
Style 4 loss: 255507.153320
Style 5 loss: 1823.103142
Total loss: 1522879.491692
Iteration 150 / 1000
Content 1 loss: 996355.390625
Style 1 loss: 5862.733078
Style 2 loss: 37343.096924
Style 3 loss: 21865.435791
Style 4 loss: 215474.414062
Style 5 loss: 1882.864380
Total loss: 1278783.934860
Iteration 200 / 1000
Content 1 loss: 929966.015625
Style 1 loss: 3869.946289
Style 2 loss: 27235.559082
Style 3 loss: 16045.399475
Style 4 loss: 204308.947754
Style 5 loss: 1940.538406
Total loss: 1183366.406631
Iteration 250 / 1000
Content 1 loss: 894266.796875
Style 1 loss: 3033.085251
Style 2 loss: 21651.263428
Style 3 loss: 13380.670166
Style 4 loss: 200595.568848
Style 5 loss: 1956.622887
Total loss: 1134884.007454
Iteration 300 / 1000
Content 1 loss: 872245.468750
Style 1 loss: 2481.245613
Style 2 loss: 17448.698425
Style 3 loss: 11821.946716
Style 4 loss: 197789.428711
Style 5 loss: 1963.530159
Total loss: 1103750.318375
Iteration 350 / 1000
Content 1 loss: 856956.406250
Style 1 loss: 2110.918427
Style 2 loss: 15038.397217
Style 3 loss: 10839.212799
Style 4 loss: 197184.240723
Style 5 loss: 1958.585930
Total loss: 1084087.761345
Iteration 400 / 1000
Content 1 loss: 846861.875000
Style 1 loss: 1775.813293
Style 2 loss: 13281.933594
Style 3 loss: 10184.021759
Style 4 loss: 196428.918457
Style 5 loss: 1955.384827
Total loss: 1070487.946930
Iteration 450 / 1000
Content 1 loss: 839153.203125
Style 1 loss: 1477.078056
Style 2 loss: 12096.578217
Style 3 loss: 9744.568634
Style 4 loss: 196074.755859
Style 5 loss: 1967.421341
Total loss: 1060513.605232
Iteration 500 / 1000
Content 1 loss: 833483.203125
Style 1 loss: 1216.745472
Style 2 loss: 11397.233582
Style 3 loss: 9411.297607
Style 4 loss: 195777.624512
Style 5 loss: 1972.847366
Total loss: 1053258.951664
Iteration 550 / 1000
Content 1 loss: 829485.703125
Style 1 loss: 1003.780365
Style 2 loss: 10984.004211
Style 3 loss: 9186.926270
Style 4 loss: 195408.129883
Style 5 loss: 1975.588989
Total loss: 1048044.132843
Iteration 600 / 1000
Content 1 loss: 826736.484375
Style 1 loss: 831.589508
Style 2 loss: 10672.335052
Style 3 loss: 9043.427277
Style 4 loss: 194725.012207
Style 5 loss: 1973.835945
Total loss: 1043982.684364
Iteration 650 / 1000
Content 1 loss: 823919.609375
Style 1 loss: 717.177391
Style 2 loss: 10605.742645
Style 3 loss: 8961.063385
Style 4 loss: 195016.979980
Style 5 loss: 1976.427460
Total loss: 1041197.000237
Iteration 700 / 1000
Content 1 loss: 821942.734375
Style 1 loss: 619.871664
Style 2 loss: 10538.509369
Style 3 loss: 8869.702911
Style 4 loss: 194853.723145
Style 5 loss: 1977.566719
Total loss: 1038802.108183
Iteration 750 / 1000
Content 1 loss: 820429.453125
Style 1 loss: 550.355339
Style 2 loss: 10497.028351
Style 3 loss: 8795.985413
Style 4 loss: 194766.992188
Style 5 loss: 1979.179192
Total loss: 1037018.993607
Iteration 800 / 1000
Content 1 loss: 819407.812500
Style 1 loss: 490.980673
Style 2 loss: 10444.064331
Style 3 loss: 8718.345642
Style 4 loss: 194412.792969
Style 5 loss: 1981.171417
Total loss: 1035455.167532
Iteration 850 / 1000
Content 1 loss: 818385.625000
Style 1 loss: 450.058937
Style 2 loss: 10476.403809
Style 3 loss: 8667.263794
Style 4 loss: 194400.549316
Style 5 loss: 1982.979774
Total loss: 1034362.880630
Iteration 900 / 1000
Content 1 loss: 817489.140625
Style 1 loss: 419.271374
Style 2 loss: 10532.171631
Style 3 loss: 8636.356354
Style 4 loss: 194408.105469
Style 5 loss: 1984.999084
Total loss: 1033470.044537
Iteration 950 / 1000
Content 1 loss: 816714.921875
Style 1 loss: 391.808033
Style 2 loss: 10521.137238
Style 3 loss: 8617.547607
Style 4 loss: 194365.051270
Style 5 loss: 1985.528564
Total loss: 1032595.994587
Iteration 1000 / 1000
Content 1 loss: 816158.359375
Style 1 loss: 371.645737
Style 2 loss: 10549.217987
Style 3 loss: 8597.760773
Style 4 loss: 194247.155762
Style 5 loss: 1986.596870
Total loss: 1031910.736504
<optim.lbfgs> reached max number of iterations
real 4m41.463s
user 1m33.336s
sys 3m9.500s
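Since comparing the `real` lines across runs by eye gets tedious, here is a small convenience sketch (my addition, not part of the original test setup — the log file name `bench_x16.log` is hypothetical): it pulls every `real` line out of a saved `time(1)` log, converts the `4m40.817s`-style value into plain seconds, and prints the per-run times plus their mean.

```shell
# Sketch: parse "real 4m40.817s" lines from a saved time(1) log
# (bench_x16.log is an assumed file name) and report seconds + mean.
awk '/^real/ {
    split($2, t, /[ms]/)          # "4m40.817s" -> t[1]=4, t[2]=40.817
    s = t[1] * 60 + t[2]          # minutes*60 + seconds
    sum += s; n++
    printf "run %d: %.3f s\n", n, s
}
END { if (n) printf "mean: %.3f s\n", sum / n }' bench_x16.log
```

Redirecting each run with `{ time th neural_style.lua ... ; } 2>> bench_x16.log` collects the timings in one file, so the averages in the table above can be reproduced with this one-liner.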
Copyright © 2018 Codict. All Rights Reserved.