Compile Python from scratch, install it in a local development environment, and create a Python virtual environment on top that has no system dependencies.

Why? Because I couldn't find a complete tutorial or anything similar, and I needed a way to easily build the exact same environment on multiple hosts (different distros).
This is different from pyenv in the following ways:

- Installs all required libraries locally, including build tools when these are missing (if `pyenv` was doing that, this script wouldn't be needed)
- Not intended to be a Python environment manager; instead it is only the builder, while `virtualenv` is used to switch between environments
- It can build a package using `checkinstall`
- It's a lot more immature and untested
NOTE: Tested on:
- CentOS 6.6
- Debian 7.1
- Ubuntu 14.04.3
Drop me a message if you have successfully run this on different distros/platforms.
Build tools: gcc, make. All other tools will be installed if missing
NEVER RUN THIS AS ROOT
```
/path/to/pfs.sh -p </path/to/new/env> -v <X.Y.Z> [-r <pip-requirements-file>]
```

If a requirements file is given, after the environment is created (and sourced) it runs:

```
pip install -r <pip-requirements-file>
```
Use this environment as you would any Python venv:

```
source /path/to/new/env/bin/activate
```
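Before sourcing, it can be handy to sanity-check that the env prefix looks complete. A minimal sketch — the `check_env` helper is my own illustration, not part of pfs.sh, and it only assumes the standard venv layout (`bin/activate`, `bin/python`):

```shell
#!/bin/sh
# Hypothetical helper: verify that a built env prefix has the expected venv layout.
check_env() {
    prefix="$1"
    [ -f "$prefix/bin/activate" ] || { echo "missing $prefix/bin/activate"; return 1; }
    [ -x "$prefix/bin/python" ]   || { echo "missing $prefix/bin/python"; return 1; }
    echo "env at $prefix looks OK"
}

# Usage: check_env /path/to/new/env && source /path/to/new/env/bin/activate
```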
Well, the obvious:

```
rm -rf /path/to/env
```

However, the following will remove the Python virtual environment while keeping the sources and local build environment intact:

```
/path/to/pfs.sh -p </path/to/new/env> -c
```
Appending `-a` will remove the `lib`, `bin` and `src` folders, and thus the Python virtual environment and sources are gone. The venv can be rebuilt at any time using the "create" command. Libraries are not going to be rebuilt, since the `local` folder (the development/build environment) has been preserved.
Once you are happy with your build you can remove the sources to reduce disk space (from ~650M to ~250M):

```
rm -rf /path/to/new/env/src
```
We have a custom development/build environment and therefore, in order to build new libraries, we need to export the correct include and lib paths. The `virtualenv.py` `activate` script has been modified to do this, so all we need to do is:

```
export C_INCLUDE_PATH && pip install <module>
```
Installing pure Python modules should not be a problem. Depending on how the build environment was created, we might have to export more variables (see later: `TMPDIR`, `LD_LIBRARY_PATH`, etc.).
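As a concrete sketch of those extra exports — the paths below assume the venv layout this script creates, and the `lib64` directory is an assumption that may not exist on every build:

```shell
#!/bin/sh
# Sketch: point the compiler and loader at the local env before building C extensions.
# VIRTUAL_ENV is normally set by the (modified) activate script; the default here
# is a made-up fallback for illustration.
: "${VIRTUAL_ENV:=$HOME/pfs/env}"
export C_INCLUDE_PATH="$VIRTUAL_ENV/include"
export LD_LIBRARY_PATH="$VIRTUAL_ENV/lib:$VIRTUAL_ENV/lib64"
export TMPDIR="$HOME/tmp"
mkdir -p "$TMPDIR"
# pip install <module>   # now builds against the local headers and libs
```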
A few issues I had:

- Python linking against the system's `libpython.X.X.so`. Solution: as suggested on Stack Overflow [todo], modify setup.py (see the `sed -i` calls in the script).
- `pip --global-option` being ignored, or includes not found. Solution: set up your `LD_LIBRARY_PATH` and `C_INCLUDE_PATH` and export them. If the venv is active you can also use `$VIRTUAL_ENV`, for example:

  ```
  export PATH; export TMPDIR=~/tmp; export C_INCLUDE_PATH && \
  pip install cffi --global-option=build_ext --global-option=-L$VIRTUAL_ENV/lib64
  ```

- `/tmp` missing exec permissions (a security hardening measure on some systems). Solution: export a different `TMPDIR`, as in the example above.
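A quick way to check whether a directory is mounted `noexec` before pointing `TMPDIR` at it — this probe is my own snippet, not part of pfs.sh:

```shell
#!/bin/sh
# Try to execute a trivial script from a directory; if that fails, the
# filesystem is likely mounted noexec and TMPDIR should point elsewhere.
can_exec() {
    d="$1"
    probe="$d/.exec_probe_$$"
    printf '#!/bin/sh\nexit 0\n' > "$probe" && chmod +x "$probe"
    if "$probe" 2>/dev/null; then
        echo "$d allows exec"
        rm -f "$probe"; return 0
    else
        echo "$d is noexec - export TMPDIR elsewhere"
        rm -f "$probe"; return 1
    fi
}

# Usage: can_exec /tmp || export TMPDIR="$HOME/tmp"
```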
This project is under development, so things are not complete; not all Python dependencies are satisfied yet.
The high-level process is:

- Install build tools: `m4`, `shtool`, `autoconf`, `automake`, `libtool`
- Install Python dependencies. At the moment we are installing: `ncurses`, `readline`, `zlib`, `bzip2`, `lzma`, `gdbm`, `openssl` and `sqlite3`
- Install Python: the usual process. The only difference is that setup.py is modified to not look into `/usr/local/lib` and `/usr/local/include`; instead these are replaced with the local prefix before `make && make install`:

  ```
  sed -i "s|/usr/local/lib|$PREFIX/lib|" ./setup.py
  sed -i "s|/usr/local/include|$PREFIX/include|" ./setup.py
  ```

- Install a few extra libs on top. They are mainly for SNMP and XML parsing: `libsmi`, `libffi`, `libxml2`, `libxslt` and `libyaml`
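The sed substitution from the Python step can be tried on a dummy setup.py. The file contents below are made up for illustration, and `$PREFIX` stands in for your local install prefix:

```shell
#!/bin/sh
# Demonstrate the path rewrite applied to setup.py before `make && make install`.
PREFIX="$HOME/pfs/env"   # assumption: your local install prefix
demo=$(mktemp)
cat > "$demo" <<'EOF'
lib_dirs = ['/usr/local/lib']
inc_dirs = ['/usr/local/include']
EOF
sed -i "s|/usr/local/lib|$PREFIX/lib|"         "$demo"
sed -i "s|/usr/local/include|$PREFIX/include|" "$demo"
cat "$demo"   # both paths now point under $PREFIX
```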
A function is provided to take care of the heavy lifting: `installLib`. Use it the following way:

```
installLib "http://ftp.gnu.org/pub/gnu/ncurses/ncurses-6.0.tar.gz" \ # <-- Download URL
    "ncurses-6.0.tar.gz" \             # <-- File name to save it as in the `src` folder
    "ncurses-6.0" \                    # <-- Folder name after extraction (tar.gz and zip supported)
    "ncurses/curses.h" \               # <-- A single `include` file that is used to check if already installed
    "confmake" \                       # <-- Build type (confmake, autogen - see source)
    "--with-shared --without-normal"   # <-- Additional args to `./configure` (--prefix and --enable-shared are added)
```
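As a rough sketch of what a `confmake`-style `installLib` does — this is my simplified reconstruction from the description above, not the real implementation (see the script source for zip support, `autogen` handling, and error reporting). The include-file check is what makes reruns idempotent:

```shell
#!/bin/sh
# Simplified sketch of an installLib-style helper (illustrative, not the real code).
# PREFIX is the local env prefix; set it before calling.
installLibSketch() {
    url="$1"; file="$2"; dir="$3"; hdr="$4"; conf_args="$5"
    # Skip the whole build if the library's header is already in place.
    if [ -f "$PREFIX/include/$hdr" ]; then
        echo "$hdr already installed - skipping"
        return 0
    fi
    mkdir -p "$PREFIX/src" && cd "$PREFIX/src" || return 1
    [ -f "$file" ] || wget -q "$url" -O "$file" || return 1
    tar -xzf "$file" && cd "$dir" || return 1
    ./configure --prefix="$PREFIX" --enable-shared $conf_args && \
        make && make install
}
```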
If you don't really intend to keep the development environment, consider using `-p /dev/shm/pfs` or `-p /run/shm/pfs`, which are going to be discarded after the next reboot. Once you are happy with your build you can either copy it over to disk, or create a package out of it and copy that.
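Copying a tmpfs build over to disk can be as simple as a tarball round-trip. The paths and helper names here are illustrative, not part of pfs.sh:

```shell
#!/bin/sh
# Pack an env built under tmpfs and unpack it at a persistent location.
SRC_DIR="/dev/shm/pfs"   # assumption: where the env was built
DEST_DIR="$HOME"         # assumption: where it should live permanently

pack_env() {
    # $1 = env directory, $2 = output tarball
    tar -C "$(dirname "$1")" -czf "$2" "$(basename "$1")"
}

unpack_env() {
    # $1 = tarball, $2 = destination directory
    tar -C "$2" -xzf "$1"
}

# pack_env "$SRC_DIR" /tmp/pfs.tar.gz
# unpack_env /tmp/pfs.tar.gz "$DEST_DIR"
```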
NOTE: if you move the Python virtual environment (the `bin` and `lib` folders), it will require rebuilding. Therefore, your `local` folder should also be copied over to the new location.
If you do want to keep the development environment but you don't care about the sources and build, you can specify `-s /run/shm/pfs/src` as your source/build location, while the development environment can be elsewhere (`-p ~/pfs/dev3.4.3`).
Have fun!
THIS SOFTWARE IS PROVIDED "AS IS" AND ANY EXPRESSED OR IMPLIED WARRANTIES, INCLUDING, BUT NOT LIMITED TO, THE IMPLIED WARRANTIES OF MERCHANTABILITY AND FITNESS FOR A PARTICULAR PURPOSE ARE DISCLAIMED. IN NO EVENT SHALL THE REGENTS OR CONTRIBUTORS BE LIABLE FOR ANY DIRECT, INDIRECT, INCIDENTAL, SPECIAL, EXEMPLARY, OR CONSEQUENTIAL DAMAGES (INCLUDING, BUT NOT LIMITED TO, PROCUREMENT OF SUBSTITUTE GOODS OR SERVICES; LOSS OF USE, DATA, OR PROFITS; OR BUSINESS INTERRUPTION) HOWEVER CAUSED AND ON ANY THEORY OF LIABILITY, WHETHER IN CONTRACT, STRICT LIABILITY, OR TORT (INCLUDING NEGLIGENCE OR OTHERWISE) ARISING IN ANY WAY OUT OF THE USE OF THIS SOFTWARE, EVEN IF ADVISED OF THE POSSIBILITY OF SUCH DAMAGE.