Wednesday, September 10, 2008

Packages in Python extension modules

I've been working on some Python bindings for Nebula3 over the last few days, using Boost Python. At this point I don't intend to bind everything in Nebula3; I just need access to a few of the classes via Python (more on that in some future post). Boost Python is pretty nice, once you figure out the basics. The Boost build system (bjam), on the other hand, is a royal pain to figure out if you want to use it to build your own projects, so I won't be using it for anything other than building the Boost libraries.

Anyway, in this post I'm going to explain how to create a package-like Python C/C++ extension module. I wanted all the Nebula3 Python bindings in one C++ extension module, with each Nebula3 namespace in its own sub-module. Python packages are usually defined using a hierarchy of directories, so I want the equivalent of:

/mypackage
    __init__.py
    /Util
        __init__.py
        string.py
    /IO
        __init__.py
        uri.py

First of all, you need to indicate to Python that your module is actually a package; you do so by setting the __path__ attribute on the module to the name of the module.

// mypackage.cpp

#include <boost/python.hpp>

// per-namespace export functions, defined in export_util.cpp and export_io.cpp
void export_util();
void export_io();

BOOST_PYTHON_MODULE(mypackage)
{
    namespace bp = boost::python;

    // specify that this module is actually a package
    bp::object package = bp::scope();
    package.attr("__path__") = "mypackage";

    export_util();
    export_io();
}

Now we can create the Util sub-module.

// export_util.cpp

#include <boost/python.hpp>
// ...plus the Nebula3 header that declares Util::String

void export_util()
{
    namespace bp = boost::python;
    // map the Util namespace to a sub-module
    // make "from mypackage.Util import <whatever>" work
    bp::object utilModule(bp::handle<>(bp::borrowed(PyImport_AddModule("mypackage.Util"))));
    // make "from mypackage import Util" work
    bp::scope().attr("Util") = utilModule;
    // set the current scope to the new sub-module
    bp::scope util_scope = utilModule;
    // export stuff in the Util namespace
    bp::class_<Util::String>("String");
    // etc.
}

The PyImport_AddModule() call adds the sub-module to Python's sys.modules dictionary, and assigning it to the Util attribute of the current scope stores the sub-module as an attribute of the package. And here's the IO sub-module, which follows the same pattern.

// export_io.cpp

#include <boost/python.hpp>
// ...plus the Nebula3 headers that declare IO::URI and Util::String

void export_io()
{
    namespace bp = boost::python;

    // map the IO namespace to a sub-module
    // make "from mypackage.IO import <whatever>" work
    bp::object ioModule(bp::handle<>(bp::borrowed(PyImport_AddModule("mypackage.IO"))));
    // make "from mypackage import IO" work
    bp::scope().attr("IO") = ioModule;
    // set the current scope to the new sub-module
    bp::scope io_scope = ioModule;
    // export stuff in the IO namespace
    class_<IO::URI>("URI",
        "A URI object can split a Uniform Resource Identifier string into "
        "its components or build a string from URI components.")
        .def(bp::init<const Util::String&>());
}
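
Since export_util() and export_io() repeat the same three lines of sub-module boilerplate, you may want to factor that out once you have more than a couple of namespaces. Here's a minimal sketch of such a helper; create_submodule() is just a name I made up for this example, it's not part of Boost Python:

// create_submodule.h (sketch)

#include <boost/python.hpp>

// Create (or look up) the sub-module "fullName", which also registers it in
// sys.modules, then attach it to the current scope under "name" so that
// "from mypackage import <name>" works.
inline boost::python::object create_submodule(const char* fullName, const char* name)
{
    namespace bp = boost::python;
    bp::object subModule(bp::handle<>(bp::borrowed(PyImport_AddModule(fullName))));
    bp::scope().attr(name) = subModule;
    return subModule;
}

With that in place, export_util() shrinks to bp::scope util_scope = create_submodule("mypackage.Util", "Util"); followed by the class_ declarations.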

Once you've built the extension module, you can import it into Python and check that it works just like a regular package.

>>> import mypackage
>>> from mypackage.Util import String
>>> from mypackage import Util
>>> from mypackage.IO import URI
>>> from mypackage import IO

This is something that doesn't seem to be well documented anywhere at this point, so hopefully this short writeup will save a few people some time. Note that while the code here uses Boost Python, it should be relatively simple to adapt it to the plain Python C API.
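
For reference, here's a rough, untested sketch of the same trick using the plain Python 2.x C API (this is not taken from the actual Nebula3 bindings):

// mypackage_capi.cpp (sketch only, Python 2.x C API)

#include <Python.h>

static PyMethodDef NoMethods[] = { { NULL, NULL, 0, NULL } };

PyMODINIT_FUNC initmypackage(void)
{
    // create the top-level module and mark it as a package
    PyObject* package = Py_InitModule("mypackage", NoMethods);
    if (NULL == package)
        return;
    PyModule_AddStringConstant(package, "__path__", "mypackage");

    // create the sub-module, which also puts it into sys.modules
    PyObject* utilModule = PyImport_AddModule("mypackage.Util"); // borrowed reference
    if (NULL == utilModule)
        return;

    // make "from mypackage import Util" work
    Py_INCREF(utilModule); // PyModule_AddObject steals a reference
    PyModule_AddObject(package, "Util", utilModule);

    // add the Util classes to utilModule here, then repeat for IO
}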

5 comments:

Anonymous said...

Thanks for posting this. It's exactly what I was looking for.

Brad Froehle said...

If you do not know the main package name at compile time, you can still use this method with a slight modification.

Anonymous said...

Nice post. I've also found boost python very nice. But I'm cross-compiling code to run on an embedded system. Can you illustrate how to get my setup.py to install the libraries on the embedded system?

Anonymous said...

Thanks, works really well!

Unknown said...

This was exactly what I was looking for and was indeed not documented anywhere else. Thanks!