Intro

This is the fourth part of the series of publications about work on the Swap project. It covers setting up Continuous Integration (CI), integrating tests into the CI pipeline, and automating the deployment of an Ethereum private chain.

Continuous Integration (CI)

The project is hosted in a public GitHub repository. The GitHub team has done a good job of evolving the platform into a complete workspace, well beyond its origins as a cloud code repository. Several years ago they introduced GitHub Actions as their CI tool. When CI became a vital need for the project, GitHub Actions was the obvious choice as the platform's native solution.

The configuration script is available below:

                name: Node.js CI

                on:
                    push:
                        branches: [ "main" ] # [sic!] this repo has a 'main' instead of a 'master' branch
                    pull_request:
                        branches: [ "main" ]

                jobs:
                    build:

                        runs-on: ubuntu-latest

                        defaults:
                            run:
                                working-directory: 'server'

                        strategy:
                            matrix:
                                node-version: [16.x]
                                # node-version: [12.x, 14.x, 16.x]
                                # See supported Node.js release schedule at https://nodejs.org/en/about/releases/

                        env:
                            DB_DATABASE: test_db_swap
                            DB_USER: root
                            DB_PASSWORD: root

                        steps:
                            - name: Repository checkout
                              uses: actions/checkout@v3

                            - name: Check directory structure
                              run: ls

                            - name: Use Node.js ${{ matrix.node-version }}
                              uses: actions/setup-node@v3
                              with:
                                  node-version: ${{ matrix.node-version }}
                                  cache: 'npm'
                                  cache-dependency-path: server/package-lock.json
                              # each command runs from the default folder; changing the folder with 'cd' only lasts for the duration of that step
                              # - run: cd server && ls

                            - name: Print MySQL version
                              run: mysql --version

                            - name: Start MySQL and create the test database
                              run: |
                                  sudo /etc/init.d/mysql start
                                  mysql -e 'CREATE DATABASE ${{ env.DB_DATABASE }};' -u${{ env.DB_USER }} -p${{ env.DB_PASSWORD }}

                            - name: Create client user
                              run: |
                                  mysql -e "CREATE USER 'CI_client'@'%' IDENTIFIED WITH mysql_native_password BY '${{ env.DB_PASSWORD }}';" -u${{ env.DB_USER }} -p${{ env.DB_PASSWORD }}
                                  mysql -e "GRANT ALL PRIVILEGES ON *.* TO 'CI_client'@'%';" -u${{ env.DB_USER }} -p${{ env.DB_PASSWORD }}
                                  mysql -e "FLUSH PRIVILEGES;" -u${{ env.DB_USER }} -p${{ env.DB_PASSWORD }}
                              # caching_sha2_password seems to cause the "Client does not support authentication protocol requested by server; consider upgrading MySQL client" issue

                            - name: Run import command
                              run: |
                                  ls
                                  mysql -h 127.0.0.1 -u ${{ env.DB_USER }} --password=${{ env.DB_PASSWORD }} test_db_swap < test_db_swap_schema_and_data.sql

                            - name: Check directory structure
                              run: ls

                            - name: Npm check
                              run: npm --version

                            - name: Run npm ci (CI-focused alternative to npm install)
                              run: npm ci

                            - name: Run npm build
                              run: npm run build --if-present

                            - name: Run jest debug tests
                              run: npm run test:debug

                            - name: Run all jest test suites
                              run: npm run test

                            - name: Zip build reports
                              id: zipBuildReports
                              if: always()
                              run: zip -r build-reports.zip test-report.html

                            - name: Upload build reports
                              if: always() && steps.zipBuildReports.outcome == 'success'
                              uses: actions/upload-artifact@v2
                              with:
                                  name: build-reports
                                  path: server/build-reports.zip
                                  retention-days: 3


This workflow installs the essential software, sets up the test database, runs all tests, and prepares test reports.

I am not fully satisfied with this solution. During further work I ran into some bizarre issues, not yet resolved, which prevent the CI from always giving correct feedback on codebase integrity. I will touch on these issues in the next sections. After thinking through and discussing possible mitigation steps for this and other CI tools, I concluded that the best solution is to move all of this CI automation into a bash script, separating the valuable automation logic from any CI's specific implementation. A known drawback is that this prevents us from using some important CI features such as caching.
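As a first step in that direction, the shared logic can live in a small wrapper script that any CI (or a developer machine) invokes the same way. A minimal sketch, assuming a hypothetical `ci.sh` in the server folder and the npm scripts already used in the workflow above:

```shell
#!/usr/bin/env bash
# ci.sh -- hypothetical wrapper that keeps the automation logic CI-agnostic.
# The step names mirror the workflow above; the concrete commands are assumptions.
set -euo pipefail

run_step() {
    # Print a readable banner for the CI log, then execute the command.
    echo "==> $1"
    shift
    "$@"
}

# The workflow would then shrink to a single 'run: ./ci.sh' step, e.g.:
#   run_step "Install dependencies" npm ci
#   run_step "Build"                npm run build --if-present
#   run_step "Test"                 npm run test
```

The trade-off mentioned above remains: a wrapper like this cannot use the runner's dependency caching, so `npm ci` pays the full install cost on every run.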

Test automation

By now the server API tests, smart contract tests, and mobile client tests have already been written; the task is to make them run on each commit to the repository, to confirm that a commit does not break anything.

The CI steps responsible for setting up the test environment and executing the tests on each commit are shared above.

Similar to the government-rus case, the tests pass locally, but on the server one test keeps failing, which marks the CI run as unsuccessful. The root cause is unclear. As a workaround, the build reports created at the end of the build serve as a second source of confidence.

Running Android UI tests requires KVM support, which cloud CI providers rarely offer. In the case of GitHub Actions, macos-latest and self-hosted runners are the only possible options.

Running in the cloud on macos-latest almost always ends with a failed build caused by 1-3 failing tests, even though the same tests pass when run locally.

Setting up the whole flow on another machine with the help of a self-hosted runner ran into extra configuration issues. As demand increases, this option should be explored further. Some extra thoughts on this and similar topics: https://stackoverflow.com/questions/59241249/how-to-run-github-actions-workflows-locally

Backend startup and extracting keys automation

npm, the package manager behind Node.js and the rest of the JS-based stack, supports custom scripts. You write such a script once and later invoke it from the CI.

These are my scripts for launching the server under different environments:

                "scripts": {
                    "start": "NODE_ENV=development node app.js",
                    "start:prod": "NODE_ENV=production node app.js",
                    "start:test": "NODE_ENV=test node app.js",
                    "test": "NODE_ENV=test jest --forceExit --detectOpenHandles",
                    // ...the rest of the scripts
                }
            


The npm community provides node-config for creating and maintaining multiple environment configurations. Web ports, database configs, URLs, credentials - all of it goes there.
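With node-config, each environment gets a JSON file in a config/ folder (default.json, test.json, production.json), the file matching NODE_ENV overrides the defaults, and code reads values via require('config').get(...). A sketch of a default.json with illustrative keys only, not the project's actual settings:

```json
{
    "server": { "port": 3000 },
    "db": {
        "host": "localhost",
        "name": "test_db_swap",
        "user": "CI_client"
    }
}
```

A config/test.json then only needs to list the values that differ for the test environment.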

One extra script was implemented for the blockchain as well. Each account on the Ethereum chain has a corresponding private key. When an account is created, we get its key in encoded form, but for contract deployment we need it decoded. Keythereum looks like the best option for this, so a script that uses keythereum was implemented and added as an npm script.

Private chain deployment automation

In previous publications I shared the reasons for migrating to a personal private Ethereum chain. In short, it not only speeds up development but has also become an essential way to utilise an Ethereum chain as a part of this project.

Currently, at this stage of development, the private chain is hosted on a virtual machine deployed on my personal computer. Although the VM has a hibernate mode, it still needs to be restarted from time to time, and the chain deployment has to be done again. That's why I decided to automate it.

On chain start we have to tell each node the other nodes' addresses and etherbase accounts, and start mining. My initial script, which partly automates this, is shared below:

                #!/bin/bash

                export pwd="" # sudo password, used for the jq installation below

                export NODE1_PORT=2001
                export NODE1_PORT_UDP_TCP=30304
                export NODE1_PORT_RPC=8552

                export NODE2_PORT=2002
                export NODE2_PORT_UDP_TCP=30305
                export NODE2_PORT_RPC=8553

                export NODE3_PORT=2003
                export NODE3_PORT_UDP_TCP=30306
                export NODE3_PORT_RPC=8554

                export NODE_IP=127.0.0.1

                echo "Run existing geth nodes. Please make sure they have been created & configured first!"

                if ! command -v geth &> /dev/null
                then 
                    echo "geth command could not be found"
                    exit
                else 
                    echo "geth has been found. continue shell script"
                fi

                # nodes should be run over http to allow curl interaction for add-peer automation

                geth --allow-insecure-unlock --http --http.addr "0.0.0.0" --port $NODE1_PORT_UDP_TCP --http.corsdomain '*' --authrpc.port $NODE1_PORT_RPC --http.port $NODE1_PORT --http.api admin,personal,eth,net,web3  --datadir ./node1/data --miner.gasprice 1 --verbosity 3 &

                geth --http --port $NODE2_PORT_UDP_TCP --http.corsdomain '*' --authrpc.port $NODE2_PORT_RPC --http.port $NODE2_PORT --http.api admin,personal,eth,net,web3  --datadir ./node2/data --miner.gasprice 1 --verbosity 3 &

                geth --http --port $NODE3_PORT_UDP_TCP --http.corsdomain '*' --authrpc.port $NODE3_PORT_RPC --http.port $NODE3_PORT --http.api admin,personal,eth,net,web3  --datadir ./node3/data --miner.gasprice 1 --verbosity 3 &

                echo "Install jq"

                # feed the sudo password (exported as 'pwd' above) via stdin
                echo "${pwd}" | sudo -S apt-get install -y jq

                echo "Get enode info and add peers to each node"

                node1_enode_result=$(curl -s -X POST http://$NODE_IP:$NODE1_PORT -H "Content-Type: application/json" --data '{"jsonrpc":"2.0", "id": 1,  "method":"admin_nodeInfo"}' | jq -r '.result.enode')

                IFS="@" read -r node1_enode_id node1_end_point <<< "$node1_enode_result"

                echo "$node1_enode_id"

                node2_enode_result=$(curl -s -X POST http://$NODE_IP:$NODE2_PORT -H "Content-Type: application/json" --data '{"jsonrpc":"2.0", "id": 1,  "method":"admin_nodeInfo"}' | jq -r '.result.enode')

                IFS="@" read -r node2_enode_id node2_end_point <<< "$node2_enode_result"

                echo "$node2_enode_id"

                node3_enode_result=$(curl -s -X POST http://$NODE_IP:$NODE3_PORT -H "Content-Type: application/json" --data '{"jsonrpc":"2.0", "id": 1,  "method":"admin_nodeInfo"}' | jq -r '.result.enode')

                IFS="@" read -r node3_enode_id node3_end_point <<< "$node3_enode_result"

                echo "$node3_enode_id"

                node1_endpoint="${node1_enode_id}@${NODE_IP}:${NODE1_PORT_UDP_TCP}" # peers connect over the devp2p port, not the HTTP port

                echo "${node1_endpoint}"

                node2_endpoint="${node2_enode_id}@${NODE_IP}:${NODE2_PORT_UDP_TCP}"

                echo "${node2_endpoint}"

                node3_endpoint="${node3_enode_id}@${NODE_IP}:${NODE3_PORT_UDP_TCP}"

                echo "${node3_endpoint}"
            


This script gives us the URLs of the nodes; the code that tells the nodes these URLs is:

                curl -X POST http://$NODE_IP:$NODE2_PORT -H "Content-Type: application/json" --data "{\"jsonrpc\":\"2.0\", \"method\":\"admin_addPeer\", \"id\":1, \"params\":[\"$node1_endpoint\"]}"
            


However, this code has been disabled because of an issue with getting a response from the server, and because of the lack of actual peers despite a positive server response when the command was executed via the console.
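Two likely culprits are worth ruling out here: the nodes are launched in the background, so the first curl can fire before the HTTP endpoint is listening, and the double-quoted JSON payload can lose its inner quotes in the shell. A hedged sketch of both mitigations (the helper names are mine, not from the project):

```shell
#!/bin/bash
set -euo pipefail

rpc_payload() {
    # Build a JSON-RPC request body; $1 = method, $2 = params as a JSON array.
    printf '{"jsonrpc":"2.0","id":1,"method":"%s","params":%s}' "$1" "$2"
}

wait_for_rpc() {
    # Poll $1 until the RPC endpoint answers, giving up after ~10 seconds.
    for _ in $(seq 1 20); do
        if curl -s -o /dev/null "$1" -H "Content-Type: application/json" \
                --data "$(rpc_payload net_version '[]')"; then
            return 0
        fi
        sleep 0.5
    done
    return 1
}

# Usage sketch, reusing the variables from the script above:
#   wait_for_rpc "http://$NODE_IP:$NODE2_PORT"
#   curl -s "http://$NODE_IP:$NODE2_PORT" -H "Content-Type: application/json" \
#       --data "$(rpc_payload admin_addPeer "[\"$node1_endpoint\"]")"
#   curl -s "http://$NODE_IP:$NODE2_PORT" -H "Content-Type: application/json" \
#       --data "$(rpc_payload admin_peers '[]')"
```

Checking `admin_peers` afterwards is a more reliable confirmation than the boolean `admin_addPeer` response, which can be positive even when no connection is ever established.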

Recently the Geth team published documentation on how to deploy a private chain. For node configuration they use a bootnode and unlock accounts in a secure way on start. This week I tried their end-to-end example; it is designed for the Clique consensus algorithm. Unfortunately, despite the benefits of Clique for a private network, it has issues, and some severe ones are still open. At this moment Clique is not an option to go further with.

Applying the same example to the Ethash consensus algorithm gives me another issue: I am not able to pass passwords from a separate file. Consequently, the bootnode is not useful here, and --allow-insecure-unlock is still necessary, despite being a hack that must be removed before deployment into production.

Out of the scope

Static analysis tools, mobile client publishing automation, further CI configuration to support email notifications, etc. are important parts of the automation process, but there is not much demand for them yet.

Current plans include migrating from the VM to Docker, as a superior tool compared with traditional virtualization technologies, and trying Ansible to orchestrate the Ethereum nodes' configuration.

Summary

This is the 4th and final part of the publications about ongoing work on the Swap project.

The 1st part covers work on the smart contracts integration.
The 2nd part covers work on the project test coverage.
The 3rd part covers work on the mobile and the server clients.