In this article, we continue building on our previous topic, the Solidity compiler installation:
🌍 Previous Topic: Solidity Compiler Installation (NPM)
The previous article was focused on an installation via npm, and in this article, we’ll go through the installation and use of the Solidity compiler via Docker.
🌍 Related Tutorials:
- Install Solidity Compiler via npm
- Install Solidity Compiler via Docker on Ubuntu
- Install Solidity Compiler via Source Code Compilation
- Install Solidity Compiler via Static Binary and Linux Packages
Our goal in this article is to get more familiar with the possibilities of this approach, as well as to get introduced to the technology that “runs the show”. This knowledge and experience will enable us to recognize the reasons behind choosing any of the approaches in the future, depending on the real-world needs of our projects.
What is Docker?
Before we go into details about the Docker installation of solc, let's first get introduced to what Docker is.
💡 Docker is an open platform for developing, shipping, and running applications… Docker provides the ability to package and run an application in a loosely isolated environment called a container… Containers are lightweight and contain everything needed to run the application, so you do not need to rely on what is currently installed on the host.
Source: https://docs.docker.com/get-started/overview/
There are some parts of the description I've deliberately left out (indicated by the ellipsis …) because they're not essential to our understanding of the technology.
Now, let's dissect the Docker description: the keywords of interest are platform, isolated environment, and container. Let's quickly dive into each of those next.
Platform
A platform is a software framework that supports a specific function or a goal.
The goal Docker supports is enabling a piece of software (application, service, etc.) to correctly run, regardless of the target environment.
For us, this means running the Solidity compiler, i.e. feeding it the input source code and producing the output bytecode in the form of .abi and .bin files.
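For comparison, with the npm-installed solc from the previous article, that compilation step looks roughly like the sketch below (Contract.sol is a placeholder name, not a file from this tutorial):

$ solc --abi --bin -o build Contract.sol   # emits one .abi and one .bin file per contract into build/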
Isolated Environment
By mentioning an isolated environment, we recall the concept of virtualization we learned about earlier: Docker enables our software to run as intended by providing it with the resources it needs, in the form of software libraries, network access, remote services, and other dependencies.
Container
Docker ensures the resources are provided without additional intervention by arranging them in a package called a container. Containers begin their lifecycle as images that we most commonly download and run.
We can also create a Docker image, but that’s another story.
Running an image creates a live instance of it, a container. Before an image can be used, it has to be prepared, meaning that someone must install and configure all the resources the software needs to run.
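To make the download-and-run lifecycle concrete, here is a minimal sketch using Docker's own hello-world test image (we'll install Docker itself in a moment):

$ docker pull hello-world    # download the image from the registry
$ docker run hello-world     # instantiate a container from the image
$ docker ps -a               # list containers, including finished ones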
Preparation of a Docker image falls in the domain of DevOps, i.e. Development and Operations:
💡 “DevOps engineers manage the operations of software development, implementing engineering tools and knowledge of the software development process to streamline software updates and creation.”
Source: https://www.indeed.com/hire/c/info/devops-engineer
Also, read our article:
🌍 Recommended Article: Top 20 Skills Every DevOps Engineer Ought to Have
Using Solidity Compiler via Docker
Now that we've introduced Docker in general, let's continue with the installation of the Solidity compiler via Docker.
First, we have to check whether Docker is already present on our system, which we can do by querying the Docker version:
$ docker version
bash: /usr/bin/docker: No such file or directory
As our check shows, we have to install Docker on our system before we can use it. The installation process via the Ubuntu repository consists of several steps (https://docs.docker.com/engine/install/ubuntu/):
Step 1: Update the apt package index
$ sudo apt update
...
Reading package lists... Done
Building dependency tree
Reading state information... Done
All packages are up to date.
Step 2: Install packages
Installation of additional packages; we need these packages to let the installation process access the repository over a secure HTTPS connection (note the backslash symbol \ used to split the command across multiple lines):

$ sudo apt install \
    ca-certificates \
    curl \
    gnupg \
    lsb-release
...
The following additional packages will be installed:
  gnupg-l10n gnupg-utils gpg-wks-server
Suggested packages:
  parcimonie xloadimage
The following NEW packages will be installed:
  ca-certificates curl gnupg gnupg-l10n gnupg-utils gpg-wks-server lsb-release
...
Do you want to continue? [Y/n] y
...
Step 3: Add Docker GPG key
Adding Docker's official GPG key:
$ sudo mkdir -p /etc/apt/keyrings
$ curl -fsSL https://download.docker.com/linux/ubuntu/gpg \
  | sudo gpg --dearmor -o /etc/apt/keyrings/docker.gpg
ℹ️ Info: “GPG, or GNU Privacy Guard, is a public key cryptography implementation. This allows for the secure transmission of information between parties and can be used to verify that the origin of a message is genuine.”
Source: https://www.digitalocean.com/community/tutorials/how-to-use-gpg-to-encrypt-and-sign-messages
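If you'd like to sanity-check the key we just installed, one option (an extra step, not part of the official instructions) is to inspect the dearmored file:

$ file /etc/apt/keyrings/docker.gpg              # should identify an OpenPGP/GPG public key
$ gpg --show-keys /etc/apt/keyrings/docker.gpg   # lists the key's fingerprint and identity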
Step 4: Set up repository
Setting up the repository by writing to the docker.list file: the echo command expands the $( ) command substitutions inside the quoted text, replacing them with their command outputs, and pipes the result via stdin to the system utility tee (run with root privileges through sudo), which in turn overwrites the docker.list file; the trailing redirection to /dev/null suppresses tee's copy of the written line:
$ echo \
  "deb [arch=$(dpkg --print-architecture) \
  signed-by=/etc/apt/keyrings/docker.gpg] \
  https://download.docker.com/linux/ubuntu \
  $(lsb_release -cs) stable" | sudo tee \
  /etc/apt/sources.list.d/docker.list > /dev/null
ℹ️ Info: Repositories added by mistake can be removed from Ubuntu 20.04 by selectively deleting them in the /etc/apt/sources.list.d/ directory.
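We can print the file back to verify what was written; on a 64-bit Ubuntu 20.04 (focal) machine, the resulting line would look something like this (the architecture and release codename depend on your system):

$ cat /etc/apt/sources.list.d/docker.list
deb [arch=amd64 signed-by=/etc/apt/keyrings/docker.gpg] https://download.docker.com/linux/ubuntu focal stable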
Step 5: Update apt package index
Updating the apt package index (once again):
$ sudo apt update
...
Reading package lists... Done
Building dependency tree
Reading state information... Done
All packages are up to date.
Step 6: Install Docker
Installing Docker (the latest stable version) and its components:
$ sudo apt-get install docker-ce docker-ce-cli containerd.io docker-compose-plugin
Reading package lists... Done
Building dependency tree
Reading state information... Done
The following additional packages will be installed:
  docker-ce-rootless-extras docker-scan-plugin pigz slirp4netns
Suggested packages:
  aufs-tools cgroupfs-mount | cgroup-lite
The following NEW packages will be installed:
  containerd.io docker-ce docker-ce-cli docker-ce-rootless-extras
  docker-compose-plugin docker-scan-plugin pigz slirp4netns
0 upgraded, 8 newly installed, 0 to remove and 0 not upgraded.
Need to get 108 MB of archives.
After this operation, 449 MB of additional disk space will be used.
Do you want to continue? [Y/n] y
...
Let’s check the Docker version once again:
$ docker version
Client: Docker Engine - Community
 Version:           20.10.17
 API version:       1.41
 Go version:        go1.17.11
 Git commit:        100c701
 Built:             Mon Jun 6 23:02:57 2022
 OS/Arch:           linux/amd64
 Context:           default
 Experimental:      true

Server: Docker Engine - Community
 Engine:
  Version:          20.10.17
  API version:      1.41 (minimum version 1.12)
  Go version:       go1.17.11
  Git commit:       a89b842
  Built:            Mon Jun 6 23:01:03 2022
  OS/Arch:          linux/amd64
  Experimental:     false
 containerd:
  Version:          1.6.7
  GitCommit:        0197261a30bf81f1ee8e6a4dd2dea0ef95d67ccb
 runc:
  Version:          1.1.3
  GitCommit:        v1.1.3-0-g6724737
 docker-init:
  Version:          0.19.0
  GitCommit:        de40ad0
Now we're sure that our Docker installation went through, and the Docker Engine version we have is 20.10.17 (at the time of writing this article). The next step is getting the Docker image with the Solidity compiler.
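One practical note before we continue: depending on your setup, docker commands may require root privileges (the sudo prefix). Docker's documented post-installation steps offer a way around that by adding your user to the docker group (you'll need to log out and back in for it to take effect):

$ sudo usermod -aG docker $USER   # add the current user to the docker group
$ docker run hello-world          # after re-login, a sudo-less smoke test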
Docker images are identified by their publishing organization, image name, and tag, i.e. a label that makes each release unique. In general, we can download a Docker image by referencing it with its organization/image:tag identifier.
We will download a Docker image of the Solidity compiler by specifying its identifier as ethereum/solc:stable for a stable version, or ethereum/solc:nightly for the bleeding-edge, potentially unstable version. We can also request a distinct version of the Solidity compiler by setting the tag to a specific version, e.g. ethereum/solc:0.5.4.
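If we'd like to fetch an image ahead of time without immediately running it, docker pull does just that, and docker images shows what's stored locally:

$ docker pull ethereum/solc:stable   # fetch the stable compiler image
$ docker pull ethereum/solc:0.5.4    # or pin an exact compiler version
$ docker images ethereum/solc        # list locally available solc images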
We will do three things with one Docker command: download the image, instantiate (run) a container from it, and print the container usage (flag --help):
$ docker run ethereum/solc:stable --help
Sure enough, we'd like to compile our own Solidity files, so we'll make three preparations:
First: Create a local directory to hold our Solidity source code (I'll use 1_Storage.sol from the Remix contracts folder by creating an empty file and pasting the content into it):

$ mkdir ~/solidity_src/ && cd ~/solidity_src/
$ touch 1_Storage.sol
Second: Write your own contract for testing purposes, or just open 1_Storage.sol with your favorite text editor and paste in the contents of the 1_Storage.sol example from Remix.
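If you don't have Remix at hand, a minimal storage contract in the same spirit will do for testing purposes; here's a sketch (not the verbatim Remix file), written straight from the shell:

$ cat > ~/solidity_src/1_Storage.sol <<'EOF'
// SPDX-License-Identifier: GPL-3.0
pragma solidity >=0.7.0 <0.9.0;

contract Storage {
    uint256 number;

    // Store a value in the contract's state
    function store(uint256 num) public {
        number = num;
    }

    // Read the stored value back
    function retrieve() public view returns (uint256) {
        return number;
    }
}
EOF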
Third: Run a Docker container (we already have the image, so the download step will be skipped); the flag -v mounts our local ~/solidity_src directory to the container's path /sources, ethereum/solc:stable selects the Docker image to run the container from, the flag -o sets the output location for the compiled files, --abi and --bin activate the generation of both .abi and .bin files, and the path /sources/1_Storage.sol selects the source file for compilation:
$ docker run -v ~/solidity_src:/sources ethereum/solc:stable -o /sources/output --abi --bin /sources/1_Storage.sol
Compiler run successful. Artifact(s) can be found in directory "/sources/output".
When checking our solidity_src directory, we'll discover a new directory, output, created by the Solidity compiler, containing both the .abi and .bin files.
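A quick look inside confirms it; since solc names its artifacts after the contract (not the file), a contract named Storage would give us:

$ ls ~/solidity_src/output
Storage.abi  Storage.bin
$ cat ~/solidity_src/output/Storage.abi   # the contract's ABI as JSON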
Docker also enables us to use the standard JSON interface, which is the recommended approach when using the compiler with a toolchain. This interface doesn't require mounted directories as long as the JSON input is self-contained, in other words, all the code is embedded in the input itself and there are no references to external, imported files:
$ docker run ethereum/solc:stable --standard-json < input.json > output.json
Since we haven't done any examples using the JSON interface yet, we'll set this approach aside for a later article.
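Still, for the curious, here is a minimal, self-contained input.json sketch (the Solidity source is inlined as a string inside the JSON; note the -i flag, which tells Docker to forward our stdin into the container):

$ cat > input.json <<'EOF'
{
  "language": "Solidity",
  "sources": {
    "1_Storage.sol": {
      "content": "// SPDX-License-Identifier: GPL-3.0\npragma solidity >=0.7.0 <0.9.0;\ncontract Storage { uint256 number; function store(uint256 n) public { number = n; } }"
    }
  },
  "settings": {
    "outputSelection": { "*": { "*": ["abi", "evm.bytecode"] } }
  }
}
EOF
$ docker run -i ethereum/solc:stable --standard-json < input.json > output.json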
👉 If you want to find out more ways to install the Solidity compiler, check out our full guide on the Finxter blog.
Conclusion
This article introduced us to a Solidity-supporting technology called Docker.
Of course, our main focus is on an ecosystem consisting of Solidity, Ethereum, blockchain technology, etc., but I recognized an opportunity to make a detour and walk us through the process of setting up and using the Solidity compiler via the Docker platform. Therefore, although initially unplanned, we're also gaining some DevOps skills.
In the first and only chapter (yeah, I'm a bit surprised as well) we set the mining charges by getting to know what Docker is. Then we blew a big piece of rock away by discovering how to install Docker on Ubuntu Linux (and, by extension, some other operating systems). I believe this article will prove useful and provide multiple tips and tricks for setting up your Solidity development environment on Ubuntu Linux. Besides that, personally speaking, I've always found it useful to gain secondary knowledge while learning a specific topic, and I'm sure you'll have the same experience.
🌍 Recommended Tutorial: Solidity Crash Course (by Matija)
Learn Solidity Course
Solidity is the programming language of the future.
It gives you the rare and sought-after superpower to program against the “Internet Computer”, i.e., against decentralized Blockchains such as Ethereum, Binance Smart Chain, Ethereum Classic, Tron, and Avalanche – to mention just a few Blockchain infrastructures that support Solidity.
In particular, Solidity allows you to create smart contracts, i.e., pieces of code that automatically execute on specific conditions in a completely decentralized environment. For example, smart contracts empower you to create your own decentralized autonomous organizations (DAOs) that run on Blockchains without being subject to centralized control.
NFTs, DeFi, DAOs, and Blockchain-based games are all based on smart contracts.
This course is a simple, low-friction introduction to creating your first smart contract using the Remix IDE on the Ethereum testnet – without fluff, significant upfront costs to purchase ETH, or unnecessary complexity.