(This is a scaled-back version of a proposal I sent to this list a
couple of months ago.)
There are various configuration flags that can be used when building
Python. Currently we have a configuration aimed at the typical
use-case: as much optimization as is reasonable.
However, upstream Python supports a number of useful debug options which
use more RAM and CPU cycles, but make it easier to track down bugs.
Typically these are of use to people working on Python C extensions, for
example, for tracking down awkward reference-counting mistakes. I've
had at least three developers whose opinion I value very highly ask me
for these (for example John Palmieri is currently working on the PyGI
stack, and is running into difficult reference-counting issues).
Indeed, Debian and Ubuntu have had these alternate builds available for
a couple of years now. 
I've looked through Debian's patch, and come up with a somewhat
modified version that does mostly the same thing, though one that
(I think) fits our build process somewhat better.
The python.spec now configures, builds, and installs the python
sources twice: once with the regular optimized settings, and again with
debug settings. (In most cases the files are identical between the two
installs; the files that differ get separate paths.)
I've been testing with this on my machine and it works fine; I've also
been able to successfully use distutils to build extension modules.
So I've decided to try this in Rawhide for F-14; the latest build is
The relevant CVS commit is here:
and the specfile comment contains more detailed implementation notes.
The builds are set up so that they can share the same .py and .pyc files
- they have the same bytecode format.
However, they are incompatible at the machine-code level: the extra
debug-checking options change the layout of Python objects in memory, so
the configurations have different shared library ABIs. A compiled C
extension built for one will not work with the other.
The key to keeping the different module ABIs separate is that a module
that would be "foo.so" in the standard optimized build is instead
"foo_d.so" in the debug build, i.e. gaining a "_d" suffix to the
filename, and this is what the debug build's "import" routine will look
for. This convention is from the Debian patch, and ultimately comes
from the way the Windows build is set up in the upstream build process.
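The tagging convention above can be sketched as follows (my own
illustration of the naming scheme, not the actual import machinery;
the helper name is hypothetical):

```python
def tagged_so_name(module, debug):
    # Debug builds gain a "_d" suffix on compiled extension modules,
    # so the two ABIs can coexist in the same directory.
    suffix = "_d" if debug else ""
    return "%s%s.so" % (module, suffix)

print(tagged_so_name("foo", False))  # foo.so
print(tagged_so_name("foo", True))   # foo_d.so
```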
Similarly, the optimized libpython2.6.so.1.0 now has a
libpython2.6_d.so.1.0 cousin for the debug build: all of the extension
modules are linked against the appropriate libpython, and there's
a /usr/include/python2.6-debug directory, parallel with
the /usr/include/python2.6 directory. There's a new "sys.pydebug"
boolean to distinguish the two configurations, and the distutils module
uses this to supply the appropriate header paths and linker flags when
building C extension modules.
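A script can use that flag in the same spirit (a minimal sketch:
"sys.pydebug" is the Fedora-specific boolean this post introduces; the
fallback check and the include-path construction are my own
illustration, not distutils' actual code):

```python
import sys

# "sys.pydebug" only exists on these Fedora builds; elsewhere, fall
# back to an upstream indicator of a debug build
# (sys.gettotalrefcount is only compiled in --with-pydebug).
is_debug = getattr(sys, "pydebug", hasattr(sys, "gettotalrefcount"))

# Hypothetical use: derive the matching include directory name.
ver = "%d.%d" % sys.version_info[:2]
include_dir = "/usr/include/python%s%s" % (ver, "-debug" if is_debug else "")
print(include_dir)
```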
Finally, the debug build's python binary is /usr/bin/python2.6-debug,
hardlinked as /usr/bin/python-debug (as opposed to /usr/bin/python2.6).
It's easy to spot the debug build: the interactive mode tells you the
total reference count of all live Python objects after each command:
[david@surprise devel]$ python-debug
Python 2.6.5 (r265:79063, May 19 2010, 18:20:14)
[GCC 4.4.3 20100422 (Red Hat 4.4.3-18)] on linux2
Type "help", "copyright", "credits" or "license" for more information.
>>> print "hello world"
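Code can also detect which configuration it is running under:
sys.gettotalrefcount() is only defined when CPython is configured
--with-pydebug, so its mere presence distinguishes the two builds (a
minimal sketch; the refcount-delta printout is illustrative):

```python
import sys

# sys.gettotalrefcount() is compiled in only for --with-pydebug builds;
# an optimized interpreter does not define the attribute at all.
has_debug_counters = hasattr(sys, "gettotalrefcount")

if has_debug_counters:
    # On a debug build, the global refcount moves as objects are created.
    before = sys.gettotalrefcount()
    keep = [object() for _ in range(100)]  # keep them alive
    print("refcount delta: %d" % (sys.gettotalrefcount() - before))
else:
    print("optimized build: no total-refcount bookkeeping")
```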
So the debug build shares _most_ of the files with the regular build
(.py/.pyc/.pyo files, directories, support data, documentation); the
only differences are the ELF files (binaries/shared libraries) and the
infrastructure relating to configuration (Include files, Makefile,
python-config => python-debug-config, etc).
I've tested building the "coverage" module against both runtimes, and it
works; it installs shared .py/.pyc files and a pair of compiled .so
files, one per configuration.
I tried a few different ways of packaging the debug configuration. I
considered:
(a) adding it to the python-devel subpackage, or
(b) adding it to the python-debuginfo subpackage (Debian adds it to
their python-dbg packages, which are kind of the equivalent of our
-debuginfo packages), or
(c) building out a "debug" subpackage for each of the subpackages within
the python specfile, doubling the number of subpackages.
The approach I favor (option (d), I guess) is to have a single
"python-debug" subpackage, holding everything to do with the debug
configuration: equivalent to all of the subpackages from the regular
configuration, and requiring them all (since they leverage the
shared .py files, for instance). My reasoning here is that this feature
is aimed at advanced Python developers, and if you want some of it you
probably want all of it - so just one subpackage, for simplicity - but
you don't need it for regular builds or debugging, so it seems better
to keep it separate from the -devel and -debuginfo subpackages.
This is a scaled-back version of my earlier proposal (in which I
proposed entirely parallel stacks, and varying the unicode settings)
This is far simpler. In particular, the optimized build should be
unaffected: all of the paths and the ELF metadata for the standard build
should be unchanged compared to how they were before adding the debug
configuration.
I would like to build out some of our compiled extension modules so that
we can add -debug subpackages, in an analogous way to the core python
package, but I think it should purely be a voluntary thing: I don't want
to burden people packaging Python modules with additional work. Having
said that, if you do find yourself debugging a nasty reference counting
issue inside an extension module, you'll need a debug build of every C
extension module that your reproducer script uses, so the more the
better. For reference, Ubuntu do this for all of the Python code in a
typical GNOME desktop. We should figure out sane RPM conventions
for packaging these (sorry: yes, I want to change the Python packaging
guidelines again; hopefully this will be less invasive than the Python 3
change).
I'm tracking all of this work here:
I hope for it to be a Fedora 14 feature. It's debatable whether it
should be a feature: this is an area where we're somewhat behind other
distributions, so not so good from a marketing perspective - but a good
thing to get fixed.
I plan to work next on doing the same for our python3 src.rpm. I need
to try to get this upstream in some form as well.
Hope this seems sane - thoughts? (thanks for reading this far; I know
this email is too long)
http://patch-tracker.debian.org/patch/series/view/python2.6/2.6.5-2/debug... and http://patch-tracker.debian.org/patch/series/view/python2.6/2.6.5-2/pydeb...
This is just a heads up -- python-sphinx-1.0b1 has arrived. hircus or
I will be updating rawhide to the new version soon -- which may cause your
documentation to stop building. There are some incompatible changes in
this update, notably to the markup used for C functions and modules.
There is
a compatibility plugin that you can enable to take care of some of the
issues or you can port to the new "domains" syntax.
We probably will not be pushing this back to older versions of Fedora but
we'll have to see what the response is. If we need to push out a compat
package for EPEL5 (python-sphinx1-1.0) then we'll have the possibility of
doing the same for older Fedora.
Pynie is yet another implementation of Python, this time on top of the
Parrot virtual machine.
I've created a src.rpm and opened a package review request for Pynie
Anyone feel like reviewing it?
Hope this is helpful
Last week I updated the entire TurboGears2/Pylons stack in rawhide, F13,
F12, and EL-5. Everything should currently be in updates-testing, so
any feedback or karma you can provide would be appreciated.
I know there is a TG2 hackfest this Sunday, where we are trying to chip
away at the small blocker list for the TG2.1 release. I'm also still
trying to figure out a couple of bugs in Moksha & fedoracommunity due to
the latest stack, but I'm hoping to have those ironed out before or
shortly after F13.
I also updated the TurboGears resources on the wiki.
pypy is an alternate implementation of Python, implemented in Python
itself, together with a set of tools for compiling that implementation
down to C code. The resulting implementation can, in theory, be faster
than the regular "CPython" implementation we know as /usr/bin/python.
I've managed to package up pypy for Fedora; I've opened up a review
request for it here:
Anyone feel like reviewing it?
You can download F-13 packages from the scratch build here:
Upstream pypy is working on implementing the C extension module API, so
in theory we could try recompiling our python extension modules against
pypy at some future point. I don't yet know if that's going to be
feasible in the F14 timescale, or if we're looking further into the
future.
Hope this is useful