@EnverOsmanov · Created February 20, 2017 12:03

akka-http vs play-netty-server

The 2k req/s run was used as a warmup.

Projects used:

https://github.com/schmitch/performance
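
The handlers actually benchmarked live in the repository above. For context, a plaintext akka-http 2.4.x endpoint of the kind such load tests hit looks roughly like this (a minimal sketch, not the project's code; object and route names are made up):

```scala
import akka.actor.ActorSystem
import akka.http.scaladsl.Http
import akka.http.scaladsl.server.Directives._
import akka.stream.ActorMaterializer

// Hypothetical minimal server: a single GET route returning plain text,
// bound on port 3000 to match the akka-http runs below.
object BenchServer extends App {
  implicit val system       = ActorSystem("bench")
  implicit val materializer = ActorMaterializer()

  val route = get {
    complete("Hello, World!")
  }

  Http().bindAndHandle(route, "0.0.0.0", 3000)
}
```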

Hardware Client: MacBook Pro 11,3 - Intel(R) Core(TM) i7-4850HQ CPU @ 2.30GHz with a Broadcom NetXtreme Gigabit Ethernet Controller
Network Switch: Consumer grade 5 Port
Hardware Server: Intel(R) Xeon(R) CPU E5-2670 0 @ 2.60GHz (HT enabled)

Both servers were run with the following JVM flags: -J-Xmx2G -J-Xms2G -J-server -J-XX:+UseNUMA -J-XX:+UseG1GC -J-XX:+AlwaysPreTouch -J-XX:+PerfDisableSharedMem -J-XX:+ParallelRefProcEnabled
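
The -J flags are forwarded to the JVM by the start scripts that sbt-native-packager generates (Play's dist uses the same mechanism). As an alternative to passing them on every launch, they can be baked into the build; a sketch, assuming sbt-native-packager is on the build (Play enables it by default):

```scala
// build.sbt (sketch): bake the same JVM options into the generated start script.
javaOptions in Universal ++= Seq(
  "-J-Xmx2G",
  "-J-Xms2G",
  "-J-server",
  "-J-XX:+UseNUMA",
  "-J-XX:+UseG1GC",
  "-J-XX:+AlwaysPreTouch",
  "-J-XX:+PerfDisableSharedMem",
  "-J-XX:+ParallelRefProcEnabled"
)
```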

akka-http 2.4.10

schmitch@deployster:~/projects/schmitch/wrk2$ ./wrk -t2 -c100 -d300s -R2k http://192.168.179.157:3000
Running 5m test @ http://192.168.179.157:3000
  2 threads and 100 connections
  Thread calibration: mean lat.: 5.300ms, rate sampling interval: 10ms
  Thread calibration: mean lat.: 5.234ms, rate sampling interval: 10ms
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.54ms    1.21ms 102.02ms   97.20%
    Req/Sec     1.05k   190.70     5.22k    82.02%
  599802 requests in 5.00m, 101.69MB read
Requests/sec:   1999.33
Transfer/sec:    347.11KB

CPU was below 5%

schmitch@deployster:~/projects/schmitch/wrk2$ ./wrk -t2 -c100 -d300s -R20k http://192.168.179.157:3000
Running 5m test @ http://192.168.179.157:3000
  2 threads and 100 connections
  Thread calibration: mean lat.: 3.031ms, rate sampling interval: 10ms
  Thread calibration: mean lat.: 3.022ms, rate sampling interval: 10ms
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.67ms    2.68ms  79.87ms   96.83%
    Req/Sec    10.55k     2.48k   60.00k    94.45%
  5997552 requests in 5.00m, 1.00GB read
Requests/sec:  19991.80
Transfer/sec:      3.41MB

CPU was between 20% - 30%

schmitch@deployster:~/projects/schmitch/wrk2$ ./wrk -t2 -c100 -d300s -R120k http://192.168.179.157:3000
Running 5m test @ http://192.168.179.157:3000
  2 threads and 100 connections
  Thread calibration: mean lat.: 787.360ms, rate sampling interval: 2975ms
  Thread calibration: mean lat.: 585.613ms, rate sampling interval: 2473ms
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    30.11s    22.42s    1.58m    62.48%
    Req/Sec    44.77k     4.77k   54.28k    58.88%
  26888534 requests in 5.00m, 4.50GB read
Requests/sec:  89628.49
Transfer/sec:     15.37MB

CPU was between 65% - 70%

Play on Netty (no native)

schmitch@deployster:~/projects/schmitch/wrk2$ ./wrk -t2 -c100 -d300s -R2k http://192.168.179.157:9000
Running 5m test @ http://192.168.179.157:9000
  2 threads and 100 connections
  Thread calibration: mean lat.: 1.453ms, rate sampling interval: 10ms
  Thread calibration: mean lat.: 1.482ms, rate sampling interval: 10ms
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.34ms  517.30us  18.19ms   70.46%
    Req/Sec     1.05k   145.06     2.78k    75.98%
  599802 requests in 5.00m, 86.84MB read
Requests/sec:   1999.33
Transfer/sec:    296.40KB

CPU was below 5%

schmitch@deployster:~/projects/schmitch/wrk2$ ./wrk -t2 -c100 -d300s -R20k http://192.168.179.157:9000
Running 5m test @ http://192.168.179.157:9000
  2 threads and 100 connections
  Thread calibration: mean lat.: 1.520ms, rate sampling interval: 10ms
  Thread calibration: mean lat.: 1.540ms, rate sampling interval: 10ms
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.48ms    2.04ms  70.40ms   98.85%
    Req/Sec    10.54k     1.54k   45.00k    83.84%
  5997554 requests in 5.00m, 0.85GB read
Requests/sec:  19991.74
Transfer/sec:      2.92MB

CPU was between 10% and 30% (it mostly stayed at 10%-20%, with only occasional spikes to 20%-30%)

schmitch@deployster:~/projects/schmitch/wrk2$ ./wrk -t2 -c100 -d300s -R120k http://192.168.179.157:9000
Running 5m test @ http://192.168.179.157:9000
  2 threads and 100 connections
  Thread calibration: mean lat.: 1314.033ms, rate sampling interval: 5185ms
  Thread calibration: mean lat.: 1473.175ms, rate sampling interval: 5271ms
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    41.34s    20.05s    1.53m    61.79%
    Req/Sec    44.84k     3.11k   51.08k    68.81%
  26778534 requests in 5.00m, 3.84GB read
Requests/sec:  89261.83
Transfer/sec:     13.10MB

CPU was at 98% for the first two minutes and then dropped to 40%-70%; it looks like the initial spike put extreme pressure on Netty

Play on Netty native (play.server.netty.transport = "native")
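
A sketch of one way to switch the transport without editing application.conf: Play reads configuration from system properties, and the dist start script forwards -D arguments, so the flag can also be baked into the build (assumes sbt-native-packager, as above):

```scala
// build.sbt (sketch): select Netty's native transport via a system property.
javaOptions in Universal += "-Dplay.server.netty.transport=native"
```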

schmitch@deployster:~/projects/schmitch/wrk2$ ./wrk -t2 -c100 -d300s -R2k http://192.168.179.157:9000
Running 5m test @ http://192.168.179.157:9000
  2 threads and 100 connections
  Thread calibration: mean lat.: 15.750ms, rate sampling interval: 10ms
  Thread calibration: mean lat.: 15.704ms, rate sampling interval: 10ms
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.39ms  577.39us  23.25ms   73.82%
    Req/Sec     1.05k   158.04     3.11k    79.41%
  599802 requests in 5.00m, 86.83MB read
Requests/sec:   1999.33
Transfer/sec:    296.39KB

CPU was below 3%

schmitch@deployster:~/projects/schmitch/wrk2$ ./wrk -t2 -c100 -d300s -R20k http://192.168.179.157:9000
Running 5m test @ http://192.168.179.157:9000
  2 threads and 100 connections
  Thread calibration: mean lat.: 2.028ms, rate sampling interval: 10ms
  Thread calibration: mean lat.: 2.078ms, rate sampling interval: 10ms
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.66ms    3.10ms 117.95ms   97.44%
    Req/Sec    10.54k     1.89k   46.44k    92.02%
  5997552 requests in 5.00m, 0.85GB read
Requests/sec:  19991.73
Transfer/sec:      2.92MB

CPU was below 20%

schmitch@deployster:~/projects/schmitch/wrk2$ ./wrk -t2 -c100 -d300s -R120k http://192.168.179.157:9000
Running 5m test @ http://192.168.179.157:9000
  2 threads and 100 connections
  Thread calibration: mean lat.: 625.068ms, rate sampling interval: 2504ms
  Thread calibration: mean lat.: 696.276ms, rate sampling interval: 2562ms
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency    28.14s    18.49s    1.32m    61.39%
    Req/Sec    46.78k     3.23k   51.52k    52.63%
  28079997 requests in 5.00m, 4.02GB read
Requests/sec:  93600.05
Transfer/sec:     13.74MB

CPU was between 50%-70%

On Netty native, the real CPU threads (non-HT threads) were under noticeably more pressure (≈20%)

// Personal Conclusion: Push akka-http on Play to gain good performance as well.
