Why does Python compile modules but not the script being run?

Files are compiled upon import. It isn't a security thing; it is simply that when you import a module, Python saves the compiled output so that subsequent imports are faster. See this post by Fredrik Lundh on Effbot.

>>> import main
# main.pyc is created (under __pycache__/ in Python 3)
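This is easy to check directly. A minimal sketch (the module name demo_mod is made up; it assumes bytecode writing is enabled, i.e. no -B flag):

```python
import os
import subprocess
import sys
import tempfile

# Create a throwaway module, run it as a script, then import it, and see
# which action leaves cached bytecode behind. (demo_mod is a made-up name.)
sys.dont_write_bytecode = False  # make sure caching is on for this demo
d = tempfile.mkdtemp()
with open(os.path.join(d, "demo_mod.py"), "w") as f:
    f.write("x = 1\n")

cache = os.path.join(d, "__pycache__")  # Python 3 stores .pyc files here

# Running the file as a script does not save its bytecode...
subprocess.run([sys.executable, os.path.join(d, "demo_mod.py")], check=True)
print(os.path.isdir(cache))  # False

# ...but importing it does.
sys.path.insert(0, d)
import demo_mod  # noqa: F401
print(os.path.isdir(cache))  # True
```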

When running a script directly, Python will not write or use a *.pyc file for it.
If you have some other reason to pre-compile your script, you can use the compileall module.

python -m compileall .

compileall Usage

python -m compileall --help
option --help not recognized
usage: python compileall.py [-l] [-f] [-q] [-d destdir] [-x regexp] [directory ...]
-l: don't recurse down
-f: force rebuild even if timestamps are up-to-date
-q: quiet operation
-d destdir: purported directory name for error messages
   if no directory arguments, -l sys.path is assumed
-x regexp: skip files matching the regular expression regexp
   the regexp is searched for in the full path of the file
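The same thing is available from Python code. A small sketch using compileall.compile_file to pre-compile a single script (hello.py and its contents are made up for illustration):

```python
import compileall
import os
import tempfile

# Pre-compile one script programmatically; compile_file returns a true
# value when compilation succeeded. quiet=1 suppresses the per-file listing.
d = tempfile.mkdtemp()
src = os.path.join(d, "hello.py")
with open(src, "w") as f:
    f.write('print("hi")\n')

ok = compileall.compile_file(src, quiet=1)
print(bool(ok))  # True

# The bytecode lands in __pycache__/ next to the source file.
print(os.path.isdir(os.path.join(d, "__pycache__")))  # True
```

For a whole tree, compileall.compile_dir(".") is the programmatic equivalent of `python -m compileall .`.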

If the answer is potential disk-permission problems in the directory containing main.py, why does Python compile modules anyway?

Modules and scripts are treated the same. Importing is what triggers the output to be saved.

If the reason is that the benefits would be minimal, consider the case where the script is run a large number of times (such as in a CGI application).

Using compileall does not solve this: Python still ignores the *.pyc file when the script is run directly; it is only used when the .pyc itself is invoked explicitly. This has negative side effects, well stated by Glenn Maynard in his answer.
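What "invoked explicitly" means in practice: a sketch that compiles a script with py_compile and then hands the .pyc file itself to the interpreter (hello.py is a made-up name):

```python
import os
import py_compile
import subprocess
import sys
import tempfile

# Compile a script to a chosen .pyc path, then run the bytecode file
# directly -- the only way the .pyc actually gets used for a script.
d = tempfile.mkdtemp()
src = os.path.join(d, "hello.py")
with open(src, "w") as f:
    f.write('print("hello")\n')

pyc = os.path.join(d, "hello.pyc")
py_compile.compile(src, cfile=pyc)

# `python hello.py` would ignore hello.pyc; `python hello.pyc` runs it.
out = subprocess.run([sys.executable, pyc], capture_output=True, text=True)
print(out.stdout.strip())  # hello
```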

The example given of a CGI application is really better addressed with a technique like FastCGI. If you want to eliminate the overhead of compiling your script, you probably also want to eliminate the overhead of starting up Python, not to mention database connection overhead.

A light bootstrap script can be used, or even python -c "import script", but both are of questionable style.
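A sketch of that bootstrap idea (script.py and its contents are illustrative): the real work lives in an importable module, so its bytecode is cached after the first run, and a -c one-liner kicks it off.

```python
import os
import subprocess
import sys
import tempfile

# The real logic lives in an importable module so its bytecode gets cached
# on first import; the -c one-liner acts as the bootstrap.
d = tempfile.mkdtemp()
with open(os.path.join(d, "script.py"), "w") as f:
    f.write('print("working")\n')

env = dict(os.environ, PYTHONPATH=d)
env.pop("PYTHONDONTWRITEBYTECODE", None)  # make sure caching is on

out = subprocess.run(
    [sys.executable, "-c", "import script"],
    capture_output=True, text=True, env=env,
)
print(out.stdout.strip())                             # working
print(os.path.isdir(os.path.join(d, "__pycache__")))  # True: bytecode cached
```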
