Archive for April, 2009

A buildout for Plone 2.0.5

At ONE/Northwest we’re always looking for ways to improve and streamline our system administration tasks. Recently, we’ve been working on converting all our old Zope instances to be buildout-based (to make it easier to recreate the environment for local testing of changes or in case the instance needs to move to another server). Here are some tips based on things we’ve learned in the process of putting together our buildout for Plone 2.0.5 …

Use the right Python

Plone 2.0.5 is based on Zope 2.7, which requires Python 2.3 rather than the Python 2.4 used by modern versions of Zope. (We tested with Python 2.4 and it seemed to work okay; however, Zope 2.7's RestrictedPython has not been audited against Python 2.4, so there's no guarantee that users with the rights to edit through-the-web scripts won't be able to do something nasty.)

I installed Python 2.3 using MacPorts, then made sure to bootstrap and run my buildout using Python 2.3. Buildout initially complained about the 'subprocess' module being missing, but I was able to work around this by copying that module from my Python 2.4 libs (/opt/local/lib/python2.4/ in my case) into the Python 2.3 libs.

Update: Recent versions of plone.recipe.zope2install use some Python generators which aren’t compatible with Python 2.3, so I had to pin this egg to version 3.2.

Fry up some products

In a classic Zope installation you keep all your products in one Products directory. In a buildout they are typically spread across several different product directories. With Zope 2.7, we were seeing an issue where it only found products located in a Products directory at the root of the buildout, even if we listed additional directories. To work around this, I used collective.recipe.omelette to symlink the various buildout-generated product directories into the one Products directory that Zope does find. It's always nice to find a new use for a tool I designed for a completely different problem!

We do our development on OS X, which uses a case-insensitive filesystem, so I started using /svnproducts as a replacement for the /products directory often found in buildouts, since /products would otherwise conflict with the auto-generated /Products.
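A quick way to see why the rename was needed: on a case-insensitive filesystem, names that differ only in case map to the same directory. A small illustrative check (my own helper, not part of any buildout recipe):

```python
def case_collisions(names):
    """Return pairs of names that differ only in case and would
    therefore collide on a case-insensitive filesystem, such as the
    default HFS+ on OS X."""
    seen = {}
    clashes = []
    for name in names:
        key = name.lower()
        if key in seen and seen[key] != name:
            clashes.append((seen[key], name))
        seen[key] = name
    return clashes
```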

Your configuration is no good here

Unfortunately the zope.conf that the plone.recipe.zope2instance recipe generates contains a couple of configuration directives (verbose-security and default-zpublisher-encoding) that cause Zope 2.7 to barf, since they were not added until later versions of Zope. To work around this, we used sed (via the plone.recipe.command recipe) to comment out the offending bits.
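For illustration, here's the same edit expressed in Python, a sketch of what our sed commands do: prefix the unsupported directives with `#` so Zope 2.7 ignores them.

```python
import re

def comment_out_directives(conf_text,
                           names=("verbose-security",
                                  "default-zpublisher-encoding")):
    """Comment out zope.conf directives that Zope 2.7 doesn't know
    about. Mirrors the sed-based fix described above."""
    for name in names:
        # Anchor at line start so already-commented lines are untouched.
        conf_text = re.sub(r"(?m)^(\s*)(%s)" % re.escape(name),
                           r"\1#\2", conf_text)
    return conf_text
```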

The buildout

I ripped out the bits specific to our own systems and ended up with the following, which incorporates the above learnings. If I didn’t mess up while abridging it, it even works!

[buildout]
parts =
versions = versions

[versions]
plone.recipe.zope2install = 3.2

[plone]
recipe = plone.recipe.distros
urls =
nested-packages = Plone-2.0.5.tar.gz
version-suffix-packages = Plone-2.0.5.tar.gz

[zope2]
recipe = plone.recipe.zope2install
url =
fake-zope-eggs = false

# Archetypes and kupu are not strictly required, but here's how to get them if you need them.
[productdistros]
recipe = plone.recipe.distros
urls =
nested-packages =
version-suffix-packages =

[omelette]
recipe = collective.recipe.omelette
eggs =
packages =
    ${buildout:directory}/svnproducts .
    ${buildout:directory}/parts/productdistros .
    ${buildout:directory}/parts/plone .
location = ${buildout:directory}/Products

[instance]
recipe = plone.recipe.zope2instance
zope2-location = ${zope2:location}
user = admin:admin
http-address = 8080
debug-mode = on
verbose-security = on
products =

[fixer]
recipe = plone.recipe.command
command =
    sed -i '' 's/verbose-security/#verbose-security/' ${buildout:directory}/parts/instance/etc/zope.conf
    sed -i '' 's/default-zpublisher-encoding/#default-zpublisher-encoding/' ${buildout:directory}/parts/instance/etc/zope.conf
update-command = ${fixer:command}

Many thanks to my colleague Jon Baldivieso who did some of the initial work on this buildout.

Update 5/1/2009: Added fake-zope-eggs = false to avoid trying to build fake eggs from a directory that doesn’t exist in Zope 2.7.

Update 8/21/2009: Pinned plone.recipe.zope2install to version 3.2, as newer versions use Python generators that aren’t compatible with Python 2.3.

Image captions and the PortalTransforms cache

Plone 3 contains a cool feature whereby images inserted in kupu can automatically be captioned with the description stored on the image itself. However, if you’ve tried to use this you may have noticed that after you edit the description, and then try to view a page that includes that image with a caption, the caption sometimes does not update for up to an hour.

The reason has to do with the way the captions are added: via the html-to-captioned transform, which is applied to anything with the text/x-html-safe output mimetype, thanks to a special transform policy configured for that mimetype. It turns out that PortalTransforms caches the result of a transform for an hour, storing it in a volatile (that is, non-persistent) attribute of the value being transformed. In the normal case, where you edit and replace that value entirely, cache invalidation takes care of itself: if I save new body text for a page, the cached transform result is wiped out and the new text appears the next time the page is viewed. When editing the caption of an image, however, I'm saving a completely different value, on a different object, than the one where the transform result is cached. So even if I reload the page, I'll see the old caption.
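To make the caching behavior concrete, here's a simplified sketch of the pattern in play. The class is illustrative; only the `_v_transform_cache` attribute name and the one-hour lifetime come from the actual package.

```python
import time

class VolatileTransformCache:
    """Illustrative sketch of the PortalTransforms caching pattern:
    the transform result is stored on the transformed value itself, in
    a volatile attribute (names starting with _v_ are never persisted
    by the ZODB, so the cache vanishes when the value is replaced)."""
    CACHE_ATTR = '_v_transform_cache'
    LIFETIME = 3600  # seconds; transform results are kept up to an hour

    def get(self, value):
        cached = getattr(value, self.CACHE_ATTR, None)
        if cached is None:
            return None
        timestamp, result = cached
        if time.time() - timestamp > self.LIFETIME:
            return None  # expired; caller must re-run the transform
        return result

    def set(self, value, result):
        setattr(value, self.CACHE_ATTR, (time.time(), result))
```

Because the cache rides along on the value object itself, saving a brand-new value discards it for free; editing a different object (the image) leaves it untouched, which is exactly the stale-caption problem.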

Workaround 1: Change cache lifetime setting

Option 1: Go to /portal_transforms/manage_cacheForm in the ZMI and lower the cache lifetime. But remember that this will have a negative effect on performance (particularly for pages that are viewed frequently but not served out of a reverse proxy), as the transform will have to be applied more often.

Workaround 2: Manually invalidate the cache when the image is edited

Option 2: When the image caption is edited, invalidate the transform cache on the field referencing the image. I took a first stab at doing this for a project currently underway. For this project I have a custom subclass of ATImage and images are always contained within a (folderish) article, so I modified the caption mutator as follows:

    # Requires: from Acquisition import aq_inner, aq_parent
    # (IArticle is this project's own article marker interface.)
    def setCaption(self, value, **kw):
        self.getField('caption').set(self, value, **kw)

        # The result of transforms is cached for up to one hour.
        # Our caption is inserted into article text via a transform,
        # so if we're located within an article, invalidate the
        # (volatile) cache of the article's text.
        parent = aq_parent(aq_inner(self))
        if IArticle.providedBy(parent):
            value = parent.getField('text').getBaseUnit(parent)
            if hasattr(value, '_v_transform_cache'):
                delattr(value, '_v_transform_cache')

(Note: I first investigated using the Cache class from Products.PortalTransforms.cache so that I could simply call Cache(value).purgeCache(). However, in the current PortalTransforms release the purgeCache method calls a method that was never imported, so I just recreated what it does inline. I checked in a fix for this PortalTransforms bug, which should make it into the next release of that package.)

For a more generic implementation of this cache invalidation, you would want to do the invalidation in a handler for the IObjectEdited event on ATImages, and use whatever the linkintegrity code does to determine which items hold references to the edited image (and are therefore candidates for cache invalidation). The invalidation is only needed if the referencing field has already been rendered and is still in memory in the ZODB cache, so it would be good to make sure it doesn't cause extra objects to be woken up.
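A hedged sketch of what such a handler might boil down to, with the reference lookup injected as a callable since I haven't dug into the linkintegrity internals (everything here, including the function names, is an assumption rather than a real API):

```python
def purge_transform_caches(edited_image, find_cached_values,
                           cache_attr='_v_transform_cache'):
    """When an image is edited, purge the volatile transform cache on
    each value (e.g. the BaseUnit of a referencing object's text field)
    that may embed the image's caption. `find_cached_values` stands in
    for whatever reference lookup the linkintegrity code performs."""
    purged = 0
    for value in find_cached_values(edited_image):
        if hasattr(value, cache_attr):
            delattr(value, cache_attr)
            purged += 1
    return purged
```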

I don’t have the need, time, or interest to work on this more generic solution myself, but it would be cool if someone tackled it. 🙂 Perhaps some of you also have ideas for other options for how to deal with this…