POSDAO activation

@matilote Please check your xDai node; it seems to have 0 peers and is thus skipping blocks.
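
If it helps, one quick way to check the peer count is via the node's JSON-RPC interface. This is just a sketch assuming RPC is enabled on the default 127.0.0.1:8545; the result is a hex-encoded count:

# query the peer count over JSON-RPC (returns a hex-encoded number, e.g. "0x19")
curl -s -X POST -H 'Content-Type: application/json' \
  --data '{"jsonrpc":"2.0","method":"net_peerCount","params":[],"id":1}' \
  http://127.0.0.1:8545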

A friendly ping to update your nodes for the upcoming POSDAO activation on 2020-04-01T07:00:00Z.

@chebykin (Galt Project) please restart your node since it’s offline and isn’t producing blocks: https://blockscout.com/poa/xdai/address/0x0000999dc55126CA626c20377F0045946db69b6E/validations

This can be done with the following command:

sudo systemctl stop poa-parity && sleep 10 && sudo systemctl start poa-parity && sleep 10 && sudo systemctl restart poa-netstats
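
To confirm the node came back up after the restart, a quick check (assuming the poa-parity unit logs to the systemd journal, as systemd services normally do) could be:

# service state, then the last 50 journal lines for the unit
sudo systemctl status poa-parity
sudo journalctl -u poa-parity -n 50 --no-pager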

1 Like

@varasev this is done, we had some issues with the firewall setup.

2 Likes

@varasev done, there was an issue with the server itself, not the Parity client. Thanks for the quick reaction.

2 Likes

@varasev - lab10 has been updated

1 Like

Three days left before the hard fork.

Portis @tomteman
Gnosis @denisgranha
MakerDAO @lollike @savdao

Please update.

Good afternoon @varasev, the Gnosis node has just been updated: https://dai-netstat.poa.network/

2 Likes

@varasev MakerDAO node updated

2 Likes

Hi @tomteman, we have less than 24 hours before POSDAO activation. All nodes except yours have already been updated. Please follow the instructions above to switch your spec and client to the new version.

1 Like

@varasev Portis is live, sorry for the delay 🙂

4 Likes

Hello

@saschagoebel (Anyblock Analytics)
@chebykin (Galt Project)

We’re observing slow performance from your nodes when committing/revealing random numbers. This sometimes makes the next validators in the AuRa round (Syncnode and Nethermind, respectively) skip blocks, because the previous block time is 8-9 seconds instead of 5.

The reason is not yet known, but the logs on Nethermind’s node and on ours show that the bottleneck is definitely in the code responsible for making the commit/reveal transactions.

This is only observed on your nodes; the rest of the validators are fine.

To help us find the bottleneck in Parity’s code, please switch your node’s logging to detailed mode.

For that, add the following to the [misc] section of node.toml:

[misc]
logging = "engine=trace,miner=trace"

and restart the node with:

sudo systemctl stop poa-parity && sleep 10 && sudo systemctl start poa-parity && sleep 10 && sudo systemctl restart poa-netstats

This will turn on the detailed logs, which we will analyze a bit later.
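
To verify the new verbosity took effect, you could watch the journal for engine/miner output. Just a sketch, since the exact log lines depend on the build:

# follow the unit's journal and filter for the traced subsystems
sudo journalctl -u poa-parity -f | grep -iE 'engine|miner'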

@varasev, done…

Done. 350 GB of free disk space, which should be enough for some logs 😉

@saschagoebel @chebykin thanks! Now please share the logs with me.

1 Like

@saschagoebel @chebykin

Thank you for the detailed logs! Thanks to them, I managed to find the places in the code that need deeper analysis. For that, I made a special build for you with extended logging.

The changes in the Parity code can be seen here: https://github.com/poanetwork/open-ethereum/compare/posdao-backport...poanetwork:posdao-backport-extra-logged

Now please replace the current parity binary with the new one and launch it (a consolidated one-shot version of these steps follows the list):

  1. SSH in to your node.
  2. Stop the Parity service:
sudo systemctl stop poa-parity
  3. Switch to the home directory:
cd /home/validator
  4. Switch to the validator user:
sudo -i -u validator
  5. Remove the current parity binary:
rm parity
  6. Download the new parity binary from our repo:
curl -SfL 'https://github.com/poanetwork/open-ethereum/releases/download/v2.7.2-posdao-stable/parity-logged' -o parity
  7. Check the binary's integrity:
echo '03887c1bb1ac3c95df8c22d023c64bf9d4e0a2e18d0401361e710a4d030b8137  parity' | sha256sum -c

The output should be:

parity: OK

  8. Set the execute permission on the binary:
chmod +x parity
  9. Log back in to the sudo-enabled user:
exit
  10. Restart the services:
sudo systemctl start poa-parity && sleep 10 && sudo systemctl restart poa-netstats
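
For convenience, here is the same procedure as a single copy-pasteable sequence; a sketch that assumes the validator user and the /home/validator layout described in the steps above:

# stop the service
sudo systemctl stop poa-parity
# replace and verify the binary as the validator user (steps 3-8 in one shot)
sudo -u validator bash -c "cd /home/validator \
  && rm parity \
  && curl -SfL 'https://github.com/poanetwork/open-ethereum/releases/download/v2.7.2-posdao-stable/parity-logged' -o parity \
  && echo '03887c1bb1ac3c95df8c22d023c64bf9d4e0a2e18d0401361e710a4d030b8137  parity' | sha256sum -c \
  && chmod +x parity"
# bring everything back up
sudo systemctl start poa-parity && sleep 10 && sudo systemctl restart poa-netstats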

The new binary will collect the necessary logs for further analysis.

You can also build the binary manually from this tree: https://github.com/poanetwork/open-ethereum/tree/posdao-backport-extra-logged
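
If you prefer building it yourself, a minimal sketch (assuming a recent stable Rust toolchain and the usual Parity/OpenEthereum build dependencies are installed):

# fetch the branch and build in release mode
git clone -b posdao-backport-extra-logged https://github.com/poanetwork/open-ethereum.git
cd open-ethereum
cargo build --release
# the resulting binary ends up at target/release/parity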

Thanks in advance!

2 Likes

Done. We’re leaving the trace logs on, right?

2 Likes

Yes, please send me the new logs tomorrow.
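
One possible way to export the relevant window from the journal for sharing (assuming journald retention covers it; the time range and file name are just examples):

# dump yesterday-onwards logs for the unit and compress them
sudo journalctl -u poa-parity --since yesterday --no-pager | gzip > parity-trace.log.gz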

1 Like

Done. Activation was around 14:30 yesterday.

1 Like

Received your logs, thank you!

1 Like