The size of the tutor binary built on my machine (16MB) is less than that of the official tutor binary (37MB)

OS:-
Linux edx-integration 4.15.0-76-generic #86-Ubuntu SMP Fri Jan 17 17:24:28 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux

Python 3.6.9
pip 20.0.2 from /py_venv_3.6/lib/python3.6/site-packages/pip (python 3.6)

Steps followed:-

git clone https://github.com/overhangio/tutor.git
cd tutor
pip install -r requirements/dev.txt
pip install setuptools==v44.0.0
make test
make test-format
make format
make test-lint
make bundle
du -sh ./dist/tutor

16M ./dist/tutor
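
In case it helps, the contents of the bundle can be listed with the archive viewer that ships with PyInstaller (pulled in by requirements/dev.txt). This is just a sketch of how one might look at what takes up the space:

# -l prints the archive contents and exits instead of opening the interactive prompt
pyi-archive_viewer -l ./dist/tutor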

Is there anything that I am missing while building locally?

No, the steps you are following are correct. Actually, there is no need to run test-format, format and test-lint once you have run test. The CI script used to produce the binaries is here: https://github.com/overhangio/tutor/blob/master/.travis.yml

On my machine (Ubuntu 18.04) the Tutor binary is also smaller than the one produced by Travis CI. It is strange that the binary built by Travis CI ends up being more than twice the size. My guess is that this comes from the fact that we are running CI on Ubuntu 14.04, and not Ubuntu 18.04 or 20.04. According to this comment, we are doing this to keep Tutor compatible with older Linux releases.
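
The compatibility constraint is mostly about glibc: a PyInstaller binary built on a newer distribution references newer GLIBC symbol versions and will refuse to start on hosts with an older glibc. As a sketch, assuming binutils is available, you can check which GLIBC versions a binary requires like this:

# Print the highest GLIBC symbol versions referenced by the binary's bootloader
objdump -T ./dist/tutor | grep -o 'GLIBC_[0-9.]*' | sort -Vu | tail -n 3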

We can attempt to replicate the CI environment by running make bundle in an Ubuntu 14.04 container. From the tutor folder, run:

docker run --rm -it -v $(pwd):/tutor ubuntu:14.04 bash
apt update
apt install -y python3-pip
cd /tutor
pip3 install -r requirements/dev.txt
make bundle

Unfortunately, only Python 3.4 is available in this Docker image, so the above commands do not work. I realise that Ubuntu 14.04 has not been supported since April last year. I’ll attempt to upgrade and see whether this reduces the binary size.
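
For the record, one possible way to get Python 3.6 into an older container is Ubuntu 16.04 plus the deadsnakes PPA. This is only a sketch (the exact package list, and whether build-essential is needed for some dependencies, are assumptions):

docker run --rm -it -v $(pwd):/tutor ubuntu:16.04 bash
# then, inside the container:
apt update
apt install -y software-properties-common make build-essential
add-apt-repository -y ppa:deadsnakes/ppa
apt update
apt install -y python3.6 python3.6-venv python3.6-dev
cd /tutor
python3.6 -m venv /venv
. /venv/bin/activate
pip install -r requirements/dev.txt
make bundle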

EDIT: upgrading to Ubuntu 16.04 does not fix the issue :frowning: From the Travis CI build:

$ ls -lh ./dist/tutor
-rwxr-xr-x 1 travis travis 38M Jul  1 07:25 ./dist/tutor

So the increased size is probably due to the fact that we are running Python 3.6 on Xenial, where the default Python is Python 3.5. I’ll keep the upgrade to Xenial, although it does not fix our issue.
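
One quick way to sanity-check that hypothesis is to look at which interpreter the build environment actually uses, since on Travis the python: "3.6" entry presumably selects a separately installed interpreter rather than the distro one. A sketch:

# Show which Python would be used for the build and where it lives;
# a non-system interpreter can lead PyInstaller to bundle different (and more) shared libraries
python3 --version
python3 -c 'import sys; print(sys.executable)'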