
Nethermind errors encountered in xdai RPC node: Unable to rent an instance of IEthRpcModule. Too many concurrent requests

Hello,

We are running a relay server for Gnosis Safe transactions that leverages the official POA-maintained xDai RPC nodes: https://rpc.xdaichain.com/

Every now and then we get this error when sending a transaction to the RPC node from our relay server, which results in a failed Gnosis Safe transaction:

Nethermind.JsonRpc.Modules.ModuleRentalTimeoutException: Unable to rent an instance of IEthRpcModule. Too many concurrent requests.
   at Nethermind.JsonRpc.Modules.BoundedModulePool`1.SlowPath() in /src/Nethermind/Nethermind.JsonRpc/Modules/BoundedModulePool.cs:line 58
   at Nethermind.JsonRpc.Modules.RpcModuleProvider.<>c__DisplayClass15_0`1.<<Register>b__0>d.MoveNext() in /src/Nethermind/Nethermind.JsonRpc/Modules/RpcModuleProvider.cs:line 74
--- End of stack trace from previous location ---
   at Nethermind.JsonRpc.JsonRpcService.ExecuteAsync(JsonRpcRequest request, String methodName, ValueTuple`2 method, JsonRpcContext context) in /src/Nethermind/Nethermind.JsonRpc/JsonRpcService.cs:line 161
   at Nethermind.JsonRpc.JsonRpcService.ExecuteRequestAsync(JsonRpcRequest rpcRequest, JsonRpcContext context) in /src/Nethermind/Nethermind.JsonRpc/JsonRpcService.cs:line 114
   at Nethermind.JsonRpc.JsonRpcService.SendRequestAsync(JsonRpcRequest rpcRequest, JsonRpcContext context) in /src/Nethermind/Nethermind.JsonRpc/JsonRpcService.cs:line 104

We’re not sure what we should do about this, since it seems largely out of our control. I’m guessing it is the result of many clients issuing requests to the RPC node simultaneously. When we have seen this error, our own relay server has been under very little load, so I think the message reflects our client competing with other clients for the xDai RPC node. What kind of SLAs should we expect for the RPC nodes that POA maintains?

thanks!
-Hassan

Would you share your Relay server setup, please? What is the base, who can access and what are the rate limits, if any?

Sure, we are using a lightly forked version of the Gnosis Safe relay server. It is a Django/Python app hosted on AWS as a single t2.small EC2 instance. It is currently being used as part of our private beta (about 40 users have been invited), although we seldom see concurrent usage. Our relay server does not currently enforce any rate limits.

Do you have a link to the repo or code, please? Thanks!

Sure! Here you go: https://github.com/cardstack/card-protocol-relay-service. Actually, this is private; can you send me your GitHub username and I’ll add you to the repo?

You should retry on that error. It’s expected behavior for the public RPC nodes, which can’t accept more requests than they are already processing at the time of the request. At the moment there are around 40 nodes, each with 16 dedicated CPUs and the NETHERMIND_JSONRPCCONFIG_EthModuleConcurrentInstances: 32 setting. If a node has more than 32 concurrent requests at a time, it returns this error, and the client should retry.
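A client-side retry could look like the following minimal sketch, using only the Python standard library. The function names, backoff parameters, and the assumption that the saturation error surfaces in the JSON-RPC error message are mine, not part of the official relay server code:

```python
import json
import time
import urllib.request

# Public xDai endpoint discussed in this thread.
RPC_URL = "https://rpc.xdaichain.com/"


def is_saturated(body):
    """True when the node rejected the call because all EthModule
    instances are rented out ("Too many concurrent requests")."""
    error = body.get("error") or {}
    return "Too many concurrent requests" in str(error.get("message", ""))


def rpc_call_with_retry(method, params, max_retries=5, base_delay=0.5):
    """Send a JSON-RPC request, backing off and retrying on saturation."""
    payload = json.dumps(
        {"jsonrpc": "2.0", "id": 1, "method": method, "params": params}
    ).encode()
    for attempt in range(max_retries):
        req = urllib.request.Request(
            RPC_URL, data=payload, headers={"Content-Type": "application/json"}
        )
        with urllib.request.urlopen(req, timeout=30) as resp:
            body = json.load(resp)
        if is_saturated(body):
            # Exponential backoff before competing for the node again.
            time.sleep(base_delay * 2 ** attempt)
            continue
        return body
    raise RuntimeError(f"{method} still rejected after {max_retries} retries")
```

Depending on how the node surfaces the rejection (JSON-RPC error object vs. an HTTP error status), the check may need to inspect the HTTP response code as well.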

Here are stats of time required for different requests on one of the nodes

ethereum    | 2021-11-16 07:04:18.8760|***** JSON RPC report *****
ethereum    | -----------------------------------------------------------------------------------------------------------------------------------------
ethereum    | method                                  | successes |  avg time |  max time |    errors |  avg time |  max time | avg size | total size |
ethereum    | -----------------------------------------------------------------------------------------------------------------------------------------
ethereum    | # collection serialization #            |         1 | 320007948 | 320007948 |         0 |         0 |         0 |      108 |        108 |
ethereum    | admin_peers                             |         0 |         0 |         0 |         9 |       169 |       188 |      172 |       1548 |
ethereum    | eth_accounts                            |         2 |       260 |       294 |         0 |         0 |         0 |       82 |        164 |
ethereum    | eth_blockNumber                         |      1398 |       170 |       817 |         0 |         0 |         0 |       51 |      70763 |
ethereum    | eth_call                                |       209 |    182293 |  18664213 |        43 |   8372776 |  20002405 |      890 |     224218 |
ethereum    | eth_chainId                             |        42 |       165 |       565 |         0 |         0 |         0 |       43 |       1818 |
ethereum    | eth_coinbase                            |         1 |       291 |       291 |         0 |         0 |         0 |       78 |         78 |
ethereum    | eth_estimateGas                         |         3 |      2875 |      5173 |         3 |      2880 |      7715 |       89 |        535 |
ethereum    | eth_feeHistory                          |         0 |         0 |         0 |         5 |       415 |       466 |       78 |        389 |
ethereum    | eth_gasPrice                            |         4 |       165 |       184 |         0 |         0 |         0 |       52 |        209 |
ethereum    | eth_getBalance                          |        95 |     10456 |    952559 |         0 |         0 |         0 |       58 |       5539 |
ethereum    | eth_getBlockByNumber                    |        44 |       696 |      4760 |         0 |         0 |         0 |     4719 |     207637 |
ethereum    | eth_getCode                             |         3 |       365 |       443 |         0 |         0 |         0 |      400 |       1201 |
ethereum    | eth_getLogs                             |       221 |       969 |     69530 |         2 |  19999880 |  19999931 |       82 |      18368 |
ethereum    | eth_getTransactionByHash                |         4 |       781 |      2036 |         0 |         0 |         0 |     1129 |       4517 |
ethereum    | eth_getTransactionCount                 |        27 |     29265 |    766359 |         0 |         0 |         0 |       43 |       1170 |
ethereum    | eth_getTransactionReceipt               |        30 |       488 |      1925 |         0 |         0 |         0 |     1987 |      59617 |
ethereum    | eth_sendRawTransaction                  |         3 |      3198 |      4084 |         4 |       807 |      2041 |       93 |        649 |
ethereum    | eth_subscribe                           |        74 |       781 |      8528 |         1 |       195 |       195 |       75 |       5595 |
ethereum    | miner_setEtherbase                      |         0 |         0 |         0 |         5 |       234 |       264 |      103 |        515 |
ethereum    | miner_start                             |         0 |         0 |         0 |        30 |       262 |       675 |       96 |       2880 |
ethereum    | net_version                             |        32 |     64022 |    181695 |         0 |         0 |         0 |       43 |       1390 |
ethereum    | parity_netPeers                         |        10 |      3642 |      4800 |         0 |         0 |         0 |    23623 |     236228 |
ethereum    | web3_clientVersion                      |         6 |       267 |       358 |         0 |         0 |         0 |       91 |        547 |
ethereum    | -----------------------------------------------------------------------------------------------------------------------------------------
ethereum    | TOTAL                                   |      2209 |    164130 | 320007948 |       102 |   3922096 |  20002405 |      366 |     845683 |
ethereum    | -----------------------------------------------------------------------------------------------------------------------------------------

You can see that eth_call and eth_blockNumber times can vary widely on a node.

I’m guessing this is a result of many clients issuing requests to the RPC node simultaneously.

The public RPC endpoint is under heavy load: about 339 million requests every 86,400 seconds (one day), which is roughly 3,900 requests per second.

What kind of SLA’s should we be able to expect for the RPC nodes that POA maintains?

You can expect 100 − 283,000 × 100 / 339,000,000 ≈ 99.9165% of requests to succeed (roughly 283,000 failures out of 339 million daily requests).
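The traffic and availability arithmetic above can be checked in a couple of lines (the daily failure count of ~283,000 is taken from the figures quoted in this thread):

```python
# ~283,000 failed requests out of ~339,000,000 daily requests.
daily_requests = 339_000_000
daily_failures = 283_000

requests_per_second = daily_requests / 86_400  # seconds in a day
success_rate = 100 - daily_failures * 100 / daily_requests

print(f"{requests_per_second:.0f} req/s, {success_rate:.5f}% success")
```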

To get better results we recommend:
