Set up a high-TPS test xDai chain

#1

Hi POA community!

I’m Nikolaj from the integrations team at Maker. We are experimenting with creating a solution that provides a higher TPS throughput of ERC20 tokens than is currently available on the Ethereum mainnet, in order to meet partner demand for high volumes of Dai transactions.

The xDai chain already offers higher TPS than mainnet while remaining easy to build services on top of; however, we are interested in seeing whether we can push the boundaries even further. We would therefore like to set up a test xDai chain with a larger block gas limit, ideally allowing for 1500+ TPS of ERC20 token transfers.

My proposal: set up a new test xDai chain with the following parameters (a rough chain-spec sketch follows the list):

  • Increase the block gas limit to at least 133M gas
  • Keep the block time at 5 s
  • Whitelisting (to ensure specified actors get the necessary throughput)
  • Goal: 1500+ TPS of ERC20 token transfers
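
For concreteness, here is a rough sketch of the chain-spec fields such a test chain would touch (Parity/OpenEthereum AuRa format; the validator addresses and network ID are placeholders, and this is an untested sketch rather than a finished configuration):

```python
import json

# Hypothetical excerpt of a Parity/OpenEthereum AuRa chain spec for the test chain.
# All addresses and IDs below are placeholders.
spec = {
    "name": "xdai-tps-test",
    "engine": {
        "authorityRound": {
            "params": {
                "stepDuration": "5",                 # keep the 5 s block time
                "validators": {
                    "list": [                        # placeholder validator addresses
                        "0x0000000000000000000000000000000000000001",
                        "0x0000000000000000000000000000000000000002",
                        "0x0000000000000000000000000000000000000003",
                    ]
                },
            }
        }
    },
    "params": {
        "networkID": "0x12345",                      # placeholder network id
        "gasLimitBoundDivisor": "0x400",             # how fast the limit may drift per block
        "minGasLimit": hex(133_000_000),             # keep the limit from dropping below target
    },
    "genesis": {
        "seal": {"authorityRound": {"step": "0x0", "signature": "0x" + "00" * 65}},
        "difficulty": "0x20000",
        "gasLimit": hex(133_000_000),                # ~133M gas target from the proposal
    },
    "accounts": {},                                  # balances / whitelist contracts would go here
}

print(json.dumps(spec, indent=2))
```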

We are investigating creating a Gitcoin bounty to research the ideal chain specification to reach the goal.

2 Likes
#2

Such a high gas limit may cause serious block propagation issues. Have you had a chance to test that setup in the wild?

You might be interested in combining a higher gas limit with batching of ERC20 transfers. See this research paper I wrote on the topic a while back. On its own, batching would yield roughly a 2x throughput increase.

My calculations show that a block with 7,500 ERC20 transfers would consume 277-389 million gas without batching and 132-245 million gas if all transactions are batched. The spread is due to different SSTORE costs: the lower bound assumes every recipient already has a token balance, and the upper bound assumes all writes hit empty storage slots.
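
Roughly where those bounds come from (the SSTORE costs are the pre-EIP-1283 schedule; the per-transfer overhead is an assumed round number, so the totals only land near the figures above rather than matching them exactly):

```python
# Back-of-the-envelope gas bounds for 7,500 ERC20 transfers per block.
TX_BASE       = 21_000   # intrinsic cost of a standalone transaction
SSTORE_UPDATE = 5_000    # write to an already non-zero storage slot
SSTORE_NEW    = 20_000   # write to an empty (zero) storage slot
OVERHEAD      = 6_000    # assumed calldata + token-contract execution per transfer

TRANSFERS = 7_500

def erc20_transfer_gas(recipient_slot_empty: bool, batched: bool) -> int:
    """Rough gas for one ERC20 transfer: two balance writes plus overhead.
    Standalone transfers also pay the 21k intrinsic transaction cost."""
    recipient_write = SSTORE_NEW if recipient_slot_empty else SSTORE_UPDATE
    return (0 if batched else TX_BASE) + SSTORE_UPDATE + recipient_write + OVERHEAD

for batched in (False, True):
    lower = TRANSFERS * erc20_transfer_gas(recipient_slot_empty=False, batched=batched)
    upper = TRANSFERS * erc20_transfer_gas(recipient_slot_empty=True, batched=batched)
    print(f"{'batched' if batched else 'unbatched'}: "
          f"{lower / 1e6:.0f}M - {upper / 1e6:.0f}M gas per block")
```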

The difference is significant enough to consider building a relayer that batches the transactions. Another option to look into is the abandoned EIP-1283, which would have reduced the requirement to 49-209 million gas per block. It is also worth profiling the actual computation time and carefully adjusting the opcode costs accordingly; this would allow keeping the gas limit lower while increasing throughput.
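
A minimal sketch of what such a batching relayer could look like (the multisend-style contract it would submit to is assumed, not shown; this illustrates the flow rather than a finished design):

```python
from dataclasses import dataclass
from typing import List

@dataclass
class Transfer:
    recipient: str   # hex address
    amount: int      # token units as an integer

class BatchingRelayer:
    """Collects individual transfers and submits them in chunks, so the 21k
    intrinsic transaction cost is paid once per batch instead of once per transfer."""

    def __init__(self, batch_size: int = 100):
        self.batch_size = batch_size
        self.pending: List[Transfer] = []

    def enqueue(self, transfer: Transfer) -> None:
        self.pending.append(transfer)
        if len(self.pending) >= self.batch_size:
            self.flush()

    def flush(self) -> None:
        if not self.pending:
            return
        batch, self.pending = self.pending, []
        recipients = [t.recipient for t in batch]
        amounts = [t.amount for t in batch]
        # Here the relayer would send a single transaction calling a hypothetical
        # multisend contract, e.g. batchTransfer(token, recipients, amounts).
        print(f"submitting batch of {len(batch)} transfers")

relayer = BatchingRelayer(batch_size=100)
for i in range(250):
    relayer.enqueue(Transfer(recipient=f"0x{i:040x}", amount=10**18))
relayer.flush()  # submit the remainder
```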

3 Likes
#3

Hello, I’m Zak from Whiteblock. We’re a blockchain testing company. My team and I could easily run these tests, and we have run similar ones in the past. Here is some of our prior research in a related area.

We also have experience working with POA, of course. Let me know if you’re interested in working with us on this or have any questions! I’d gladly hop on a call to show you a demo of our testing framework and tooling.

Cheers!

1 Like
#4

Such a high gas limit may cause serious block propagation issues. Have you had a chance to test that setup in the wild?

GoChain has run with a 135M gas limit for over a year and it works for them. https://stats.gochain.io/
We could test on their network.

#5

We have not tested this setup ourselves yet - we wanted to check whether anyone in the community had insights on a similar setup first. However, as stated above, we might launch a Gitcoin bounty for someone to do the research.
Thanks for the insights, I will take a look at the paper and investigate the relayer option. :slight_smile:

#6

Hi Zak, thanks for the insights. It would be great to set up a call to discuss options for collaborating on this.

#7

How many validating nodes do you think should be enough for the test? 2?

1 Like
#8

Good question. Minimal viable governance is 3 nodes now, although 2 nodes will be enough to test block propagation and play with block speed. @lollike what do you think?

#9

You will need at least 3 Validators to be able to test a vote.

#10

This can be downloaded during the network init. I think one validator is the minimum, but with two you can also observe network slowdowns and test consensus faults.

#11

This is where I need someone else’s expertise :slight_smile:
What are the implications of increasing the number of nodes? Slower block propagation, and therefore a less responsive network = lower TPS? For a test setup two should do, but the closer we can get to a realistic setup, the better. The live xDai chain has 4 validator nodes, right?

#12

We should also run a test where at least one of the validators runs two nodes (with the same key).

#13

Could you please explain your use case? 1,500 transactions per second, sustained, is a huge amount:
  • 1,500 transactions per second
  • 90,000 transactions per minute
  • 5,400,000 transactions per hour
  • 129,600,000 transactions per day (almost 130 million transactions per day = lots)

Are you looking for volume of transactions or speed? Horizontal scalability handles most simultaneous load questions. Interested in your goal. Thanks in advance!

#14

The most important thing is probably the volume of transactions, i.e. that the chain is able to sustain a heavy load, but ideally you also want a fast settlement time (if that is what you are referring to by speed). I have also thought of horizontal scaling as a solution, by running a set of side-chains that share one token bridge, but before going down that route it seemed worthwhile to see what is possible with a single chain and take it from there.

1 Like
#15

Thanks for your response. Yes, “speed” is typically measured from the end-user perspective as total transaction time. If you could describe the use case, that would be very helpful. How will you be using this in the real world? What sorts of transactions: simple token transfers between addresses, or something more complex?
For example, the gas requirements of the use case will significantly impact the perceived transactions per second. You have suggested increasing the block gas limit to at least 133M; complex transactions use more gas than simple transactions, which directly affects the total number of transactions per block. Looking forward to understanding your application. Thanks!
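
As a rough illustration of that relationship (the gas figures per transaction are approximate; the ERC20 bounds follow the estimates earlier in the thread):

```python
# Theoretical TPS ceiling = block gas limit / gas per transaction / block time.
GAS_LIMIT  = 133_000_000   # proposed block gas limit
BLOCK_TIME = 5             # seconds per block

def tps_ceiling(gas_per_tx: int) -> float:
    return GAS_LIMIT / gas_per_tx / BLOCK_TIME

for label, gas in [("native transfer", 21_000),
                   ("ERC20 transfer (warm storage slots)", 37_000),
                   ("ERC20 transfer (empty storage slots)", 52_000)]:
    print(f"{label:<38} ~{tps_ceiling(gas):,.0f} TPS")
```

So even plain value transfers top out around 1,267 TPS at 133M gas and 5 s blocks, and unbatched ERC20 transfers land well below the 1,500 TPS target, which is why the gas profile of the workload matters so much.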

#16

Hi @lollike,

At Protofire we have developed many automation tools for deploying POA Network infrastructure (POA networks, xDai networks, cross-chain bridges, etc.) and have deployed other POA-based sidechains for our clients. We can help quickly deploy xDai-type chains with different sets of configuration. I would be glad to jump on a call anytime to explore how we can help.

BTW, Protofire is an xDai validator along with POA Network, MakerDAO and Giveth.

Best.

2 Likes
#17

Have you done the experiment yet? What was the setup and what were the results? :smiling_face_with_three_hearts:

#18

We are setting up some internal test infrastructure as we speak, so we can easily spin chains with different parameters up and down and test this in practice.
It takes a little while to get it all running though :sweat_smile:

1 Like
#19

We are mostly interested in native currency transfers - so the plan is that the network will mostly be used for simple ETH-style transactions, but denominated in xDai, since the bridge will be ERC20-to-Native. The paper by @banteg outlines some good ways of optimizing costs by batching these transfers, which I am looking into implementing in conjunction with the higher gas limit.
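
A rough sketch of why batching matters for hitting the target with native transfers (the per-recipient gas figures inside a batch are assumptions, not measurements):

```python
# A standalone native transfer pays the full 21k intrinsic transaction cost.
# Inside a batch-sender contract, each recipient only costs a value-bearing CALL
# plus its share of calldata; the exact number depends on the contract, so a few
# assumed values are shown.
GAS_LIMIT  = 133_000_000
BLOCK_TIME = 5

def tps(gas_per_transfer: int) -> float:
    return GAS_LIMIT / gas_per_transfer / BLOCK_TIME

print(f"standalone transfers (21,000 gas each): ~{tps(21_000):,.0f} TPS")
for per_recipient in (10_000, 12_000, 15_000):
    print(f"batched, assuming {per_recipient:,} gas per recipient: ~{tps(per_recipient):,.0f} TPS")
```

At 5 s blocks and 133M gas, standalone transfers cap out around 1,267 TPS, so amortizing the per-transaction cost through batching (or raising the limit further) looks necessary to reach 1500+.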

2 Likes
#20

Hi @lollike
Do you intend to share your experiences from the test?
We are running ARTIS, a chain similar to POA, and I think it would be of general interest to all of us to know whether something like GoChain’s gas limit works sustainably with Parity, and what the limits are (e.g. number of nodes, server latencies). Best.

1 Like