Hi all,
we had an innovation-and-discovery day last Friday. One subtopic was the eternal battle around RPCs, that is, how a client calls data on a server (Remote Procedure Call). REST or not REST, that is the question! It is indeed possible to use other libraries for this purpose, rather than building a RESTful API with one of the many frameworks available.

I already know there are hundreds of protocols to study for communicating over TCP (Hessian, for example… it should be integrated in Spring, if I am not wrong), but the discussion was about a plain Netty client/server versus gRPC (Protocol Buffers). I was not happy with the conclusion we reached, so I tested it myself.

First of all, I needed network latency, so I had to find a server to host part of my application. AWS was not a good idea (I want to activate the free account only when I really need it), and for Google Cloud I didn't have my credit card at hand ( 😛 ), so I chose Heroku, a host for web apps and Java apps. Putting the server there would have been easier, but unfortunately I ran into several problems, and I suspect some tricks are needed to open a port to external traffic. No problem: I host the server myself and put the client on Heroku.

I set up different tests. First, a few short messages (25 messages of 100 characters each), with this result:

Statistic             Netty   gRPC
Messages sent         25      25
Success               22      25
Failure               3       0
Minimum time (ms)     100     98
Maximum time (ms)     999     552
Total time (ms)       10507   5972
Average time (ms)     420     238
90th percentile (ms)  941     512
99th percentile (ms)  999     552
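As a reference for how the figures above are derived: min, max, total, average, and nearest-rank percentiles over the per-message round-trip times. This is my own sketch (class and method names are illustrative, not taken from the repo):

```java
import java.util.Arrays;

// Sketch of deriving the table's statistics from raw per-message
// round-trip times. Names here are my own, not from the repo.
public class Stats {

    // Nearest-rank percentile: p in (0, 100]; samples need not be sorted.
    static long percentile(long[] samples, int p) {
        long[] sorted = samples.clone();
        Arrays.sort(sorted);
        int rank = (int) Math.ceil(p / 100.0 * sorted.length); // 1-based rank
        return sorted[Math.max(rank, 1) - 1];
    }

    public static void main(String[] args) {
        long[] latencies = {100, 941, 120, 999, 250, 130, 400, 310, 220, 180};
        System.out.println("min   = " + Arrays.stream(latencies).min().getAsLong());
        System.out.println("max   = " + Arrays.stream(latencies).max().getAsLong());
        System.out.println("total = " + Arrays.stream(latencies).sum());
        System.out.println("avg   = " + Arrays.stream(latencies).sum() / latencies.length);
        System.out.println("p90   = " + percentile(latencies, 90));
        System.out.println("p99   = " + percentile(latencies, 99));
    }
}
```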

That looks strange: for example, Netty has problems matching the messages the client sends against the ones it receives back (the server echoes every message, so what is received should be identical to what was sent).
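My best guess at the cause (an assumption, not verified against the repo code): TCP is a byte stream, so a single write can be delivered as several reads, and a handler that compares each individual read against the whole sent message will report a failure even though all the bytes eventually arrive. A stdlib-only simulation of the effect:

```java
import java.util.ArrayList;
import java.util.List;

// A guess at why echo comparisons fail more often as messages grow:
// TCP is a byte stream, so one write() can arrive as several reads.
// Simulates a 5000-char message delivered in arbitrary chunks.
public class StreamFragmentation {

    // Split a payload into chunks, as the network may do.
    static List<String> deliverInChunks(String payload, int chunkSize) {
        List<String> chunks = new ArrayList<>();
        for (int i = 0; i < payload.length(); i += chunkSize) {
            chunks.add(payload.substring(i, Math.min(i + chunkSize, payload.length())));
        }
        return chunks;
    }

    public static void main(String[] args) {
        String sent = "x".repeat(5000);
        List<String> reads = deliverInChunks(sent, 1460); // typical TCP segment payload

        // Naive handler: compare the first read against the full message -> fails.
        System.out.println("first read matches full message: " + reads.get(0).equals(sent)); // false

        // Correct handler: accumulate reads until the full message has arrived.
        StringBuilder buffer = new StringBuilder();
        for (String read : reads) buffer.append(read);
        System.out.println("reassembled matches: " + buffer.toString().equals(sent)); // true
    }
}
```

Netty's usual answer to this is a framing codec in the pipeline (for example LengthFieldBasedFrameDecoder paired with LengthFieldPrepender), so the handler only ever sees whole messages.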
I increased the number of messages sent, and the results were:

Statistic             Netty    gRPC
Messages sent         250      250
Success               222      250
Failure               28       0
Minimum time (ms)     100      98
Maximum time (ms)     6443     1484
Total time (ms)       136870   48113
Average time (ms)     547      192
90th percentile (ms)  1006     462
99th percentile (ms)  2854     577

This shows that gRPC seems to be quite fast in terms of transmission. Note that the message payloads are generated before the test starts, so content creation is not included in the timings.
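For completeness, this is roughly how payloads can be generated up front so that string construction never lands inside the timed section (a sketch with illustrative names, not the repo code):

```java
import java.util.Random;

// Precompute all test payloads before the timer starts, so that
// string construction is excluded from the measured round-trip times.
// Names are illustrative, not from the repo.
public class Payloads {

    static String[] build(int count, int length, long seed) {
        Random rnd = new Random(seed);
        String[] messages = new String[count];
        for (int i = 0; i < count; i++) {
            StringBuilder sb = new StringBuilder(length);
            for (int j = 0; j < length; j++) {
                sb.append((char) ('a' + rnd.nextInt(26))); // printable ASCII only
            }
            messages[i] = sb.toString();
        }
        return messages;
    }

    public static void main(String[] args) {
        String[] batch = build(25, 100, 42L); // 25 messages of 100 characters each
        System.out.println(batch.length + " messages of " + batch[0].length() + " chars each");
    }
}
```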
I decided to run another test, with medium-sized messages (5000 characters), and this was the result:

Statistic             Netty   gRPC
Messages sent         25      25
Success               0       25
Failure               25      0
Minimum time (ms)     107     147
Maximum time (ms)     1282    410
Total time (ms)       13040   5412
Average time (ms)     521     216
90th percentile (ms)  984     378
99th percentile (ms)  1282    410

That was amazing: Netty failed to match ALL the messages it sent. Even allowing that I may have done something stupid in the code (like messing up the encoding charset), the gRPC average time is less than half of Netty's. I redid the test with a bigger number of messages:

Statistic             Netty    gRPC
Messages sent         500      500
Success               3        500
Failure               497      0
Minimum time (ms)     103      146
Maximum time (ms)     8397     4438
Total time (ms)       297231   264479
Average time (ms)     594      528
90th percentile (ms)  1111     1028
99th percentile (ms)  2095     1865

This was strange: gRPC still wins, but the result is different from the previous ones: only a 10-15% gain. So I went for the huge test: 10 messages of one million characters each.

Statistic             Netty   gRPC
Messages sent         10      10
Success               0       10
Failure               10      0
Minimum time (ms)     2072    10472
Maximum time (ms)     6115    14924
Total time (ms)       32054   121836
Average time (ms)     3205    12183
90th percentile (ms)  615     14924

And here Netty won against gRPC! I ran this test 3 or 4 times with the same result. Apparently, with big messages, something makes Netty faster than gRPC. So I did the final test: sending the Bible as a single message. gRPC throws an exception on the server side:

  2017 9:13:15 PM io.grpc.netty.NettyServerStream$TransportState deframeFailed
  WARNING: Exception processing message
  io.grpc.StatusRuntimeException: INTERNAL: io.grpc.netty.NettyServerStream$TransportState: Frame size 4525281 exceeds maximum: 4194304.
    at io.grpc.Status.asRuntimeException(Status.java:531)
    at io.grpc.internal.MessageDeframer.processHeader(MessageDeframer.java:348)

First of all, notice the io.grpc.netty.NettyServerStream$TransportState deframeFailed: it is there because gRPC's Java implementation is itself built on Netty. The 4194304-byte limit is gRPC's default maximum inbound message size (4 MiB). So I truncated the Bible from 4525276 characters down to 4194000 and got the following results:

Statistic           Netty   gRPC
Successfully sent?  No      Yes
Total time (ms)     3044    9405
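By the way, that 4 MiB ceiling is configurable. A sketch of how it can be raised on both sides with grpc-java (builder method names as in recent grpc-java releases; older versions used maxMessageSize and usePlaintext(true), and EchoServiceImpl is a placeholder for the actual service):

```java
import io.grpc.ManagedChannel;
import io.grpc.ManagedChannelBuilder;
import io.grpc.Server;
import io.grpc.netty.NettyServerBuilder;

// Raising gRPC's default 4194304-byte (4 MiB) inbound message limit.
// Sketch only: EchoServiceImpl is a placeholder, not from the repo.
public class Limits {
    static final int MAX_MSG = 16 * 1024 * 1024; // 16 MiB, enough for the whole Bible

    static Server server(int port) {
        return NettyServerBuilder.forPort(port)
                .maxInboundMessageSize(MAX_MSG)       // server side
                // .addService(new EchoServiceImpl()) // placeholder service
                .build();
    }

    static ManagedChannel channel(String host, int port) {
        return ManagedChannelBuilder.forAddress(host, port)
                .maxInboundMessageSize(MAX_MSG)       // client side
                .usePlaintext()
                .build();
    }
}
```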

If you want to test it again, I leave you my repository. Feel free to run it on another cloud service (but remember Heroku for your future tests). You can find my repo here.
Stay tuned!
