Posts Tagged ‘python’

Migrated to github pages

May 29, 2013 Comments off

I’ve migrated this blog to Pelican. It’s hosted on GitHub Pages (sources in a GitHub repo).



Windows 8 doesn’t let you change the network type?

November 18, 2012 3 comments

In case you’ve been feeling adventurous like me and installed Windows 8, you probably also installed Hyper-V because, well, it’s free and it has much better integration with Windows. Yes, you can make your VM start when Windows starts with VMware too, but Hyper-V remembers if you had it running before shutdown. And shutdown actually works, unlike VMware, which does a hard poweroff if you did not reinstall vmware-tools after a kernel upgrade (at least with Ubuntu). Could I have gotten something wrong? I don’t think so, because it works properly on Hyper-V.

Well anyway, the Network and Sharing Center in Windows 8 is quite weird: I could not change the network type (to private) for one virtual Hyper-V adapter even though I had enabled all user control with gpedit.msc. To fix it I had to use the Network List Manager API’s SetCategory, it seems. There is a PowerShell script that appears to do that but, damn it, I don’t want to edit the script every time I hit the problem. And that PowerShell code looks awful compared to Python …

Here’s a Python script that does the right thing: it asks the user which network should be made private.

import win32com.client

NETWORK_CATEGORIES = {
    1: "PRIVATE",
    0: "PUBLIC",
    2: "DOMAIN",
}

# CLSID of the Network List Manager
m = win32com.client.Dispatch("{DCB00C01-570F-4A9B-8D69-199FDBA5723B}")
more = 1
pos = 1
connections = m.GetNetworkConnections()

while more:
    connection, more = connections.Next(pos)
    if connection:
        network = connection.GetNetwork()
        category = network.GetCategory()
        print '%s. "%s" is %s' % (pos, network.GetName(), NETWORK_CATEGORIES[category])
        if not category and raw_input("Make private [N]? ") in ['y', 'Y']:
            network.SetCategory(1)  # NLM_NETWORK_CATEGORY_PRIVATE
    pos += 1

Now isn’t this pretty? (except the shitty iterator-wannabe API in GetNetworkConnections …)


Tweaks for making django admin faster

January 19, 2012 6 comments

Here are a number of tricks I’ve employed in the past to make the django admin faster.

Editable foreign keys in the changelist

If you have foreign keys in list_editable, django will make one database query for each item in the changelist; that’s quite a lot for a changelist of 100 items. The trick is to cache the choices for that formfield:

class MyAdmin(admin.ModelAdmin):
    list_editable = 'myfield',

    def formfield_for_dbfield(self, db_field, **kwargs):
        request = kwargs['request']
        formfield = super(MyAdmin, self).formfield_for_dbfield(db_field, **kwargs)
        if db_field.name == 'myfield':
            myfield_choices_cache = getattr(request, 'myfield_choices_cache', None)
            if myfield_choices_cache is not None:
                formfield.choices = myfield_choices_cache
            else:
                # evaluate the queryset once and cache the choices on the request
                request.myfield_choices_cache = formfield.choices = list(formfield.choices)
        return formfield

Foreign keys or many to many fields in admin inlines

If you have FK or M2M fields on an InlineModelAdmin, you’ll get a database hit for every object in the formset. You can avoid this by having something like:

class MyAdmin(admin.TabularInline):
    fields = 'myfield',

    def formfield_for_dbfield(self, db_field, **kwargs):
        formfield = super(MyAdmin, self).formfield_for_dbfield(db_field, **kwargs)
        if db_field.name == 'myfield':
            # dirty trick so the queryset is evaluated once and cached in .choices
            formfield.choices = list(formfield.choices)
        return formfield

Enable template caching

It’s amazing how easy it is to forget to add this in your settings:

TEMPLATE_LOADERS = (
    ('django.template.loaders.cached.Loader', (
        'django.template.loaders.filesystem.Loader',
        'django.template.loaders.app_directories.Loader',
    )),
)

Use select_related for the edit forms too

In case you have some readonly fields on the edit form that need related data to display, list_select_related doesn’t help. Eg:

class MyAdmin(admin.ModelAdmin):
    readonly_fields = 'myfield',
    def queryset(self, request):
        return super(MyAdmin, self).queryset(request).select_related('myfield')

Use annotations for function entries in list_display instead of making additional queries

Check the aggregation API to see if you can use this. Here’s the typical example:

from django.db.models import Count

class AuthorAdmin(admin.ModelAdmin):
    list_display = 'books_count',

    def books_count(self, obj):
        return obj.books_count

    def queryset(self, request):
        return super(AuthorAdmin, self).queryset(request).annotate(
            books_count=Count('books'))

The models would look like this:

class Author(models.Model):
    name = models.CharField(max_length=100)

class Book(models.Model):
    name = models.CharField(max_length=100)
    author = models.ForeignKey(Author, related_name="books")

Cache the filters from the admin changelist

This has the obvious tradeoff that you’ll have stale data in the list of filters, but if they don’t change that often and those distinct queries are killing your database then this will help a lot. Just add a custom change_list.html template in your project (eg: templates/<myapp>/change_list.html):

{% extends "admin/change_list.html" %}
{% load admin_list i18n cache %}

{% block filters %}
    {% cache 300 admin_filters request.GET.items request.path request.user.username %}
        {% if cl.has_filters %}
          <div id="changelist-filter">
            <h2>{% trans 'Filter' %}</h2>
            {% for spec in cl.filter_specs %}{% admin_list_filter cl spec %}{% endfor %}
          </div>
        {% endif %}
    {% endcache %}
{% endblock %}

Bonus trick

import sys

frame = sys._getframe(1)
while frame:
    if frame.f_code.co_name == 'render_change_form':
        if 'request' in frame.f_locals:
            request = frame.f_locals['request']
            break
    frame = frame.f_back
else:
    raise RuntimeError("Could not find request object.")

# do stuff with request

This could be used in some specific cases (eg: you need the request in a widget’s render method), as a last resort of course ;)
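To see the trick in action outside django, here’s a self-contained toy: the function names just mimic the admin’s render_change_form, and the request dict is fake.

```python
import sys

def find_request():
    # walk up the call stack looking for render_change_form's frame
    frame = sys._getframe(1)
    while frame:
        if frame.f_code.co_name == 'render_change_form':
            if 'request' in frame.f_locals:
                return frame.f_locals['request']
        frame = frame.f_back
    raise RuntimeError("Could not find request object.")

def widget_render():
    # stands in for a widget's render() method
    return find_request()

def render_change_form():
    # stands in for the admin view; `request` is a fake local variable
    request = {'user': 'admin'}
    return widget_render()
```

Calling render_change_form() returns the fake request; calling widget_render() directly raises RuntimeError because no frame named render_change_form is on the stack.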

What did you do to make the django admin faster?

Drop-in celery AbortableTask replacement

October 24, 2011 1 comment

If you need to report progress updates from your tasks (i.e. you call update_state in the task) you cannot use the bundled AbortableTask from celery.contrib.abortable, because it relies on status updates too. That means you’ll get race conditions if you do that.

You can use revokes for aborting tasks, but they don’t give you enough control and it’s not guaranteed that your tasks will stop gracefully (or stop at all). A revoke can raise SoftTimeLimitExceeded if enabled (via the TERM signal), however it might be tricky to perform cleanup: if you’re inside a call to a C extension, the exception gets delayed until the call returns. See the signal module docs for what happens when you raise an exception from a signal handler (that’s what celery does).

Given this, an alternative is to store the aborted task ids in a redis set. If you use the redis broker you can use this drop-in replacement:

from contextlib import contextmanager
import celery
from celery.task.base import Task
from celery.result import AsyncResult

from django.conf import settings

assert settings.BROKER_TRANSPORT == 'redis', "AbortableTask can only work with a 'redis' BROKER_TRANSPORT"
REDIS_KEY = getattr(settings, 'ABORTABLE_REDIS_KEY', 'task-aborts')

@contextmanager
def client_from_pool():
    connection = celery.current_app.pool.acquire()
    try:
        yield connection.default_channel.client
    finally:
        connection.release()

class AbortableAsyncResult(AsyncResult):

    def is_aborted(self):
        with client_from_pool() as client:
            return client.sismember(REDIS_KEY, self.task_id)

    def abort(self):
        with client_from_pool() as client:
            client.sadd(REDIS_KEY, self.task_id)

class AbortableTask(Task):

    @classmethod
    def AsyncResult(cls, task_id):
        return AbortableAsyncResult(task_id, backend=cls.backend)

    def is_aborted(self, **kwargs):
        task_id = kwargs.get('task_id', self.request.id)
        with client_from_pool() as client:
            return client.sismember(REDIS_KEY, task_id)

    def cleanup(self, **kwargs):
        task_id = kwargs.get('task_id', self.request.id)
        with client_from_pool() as client:
            client.srem(REDIS_KEY, task_id)

    def after_return(self, status, retval, task_id, args, kwargs, einfo):
        self.cleanup(task_id=task_id)

This will use the broker’s connection pool if enabled (you should enable it, just set BROKER_POOL_LIMIT).
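For reference, these are the settings the snippet assumes (celery 2.x era names; the redis host and the pool size here are just illustrative values):

```python
# settings.py
BROKER_TRANSPORT = 'redis'           # required by the assert in the snippet
BROKER_HOST = 'localhost'            # hypothetical; point this at your redis server
BROKER_POOL_LIMIT = 10               # enables and sizes the broker connection pool
ABORTABLE_REDIS_KEY = 'task-aborts'  # optional, this is the default used above
```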


Django pro tip: if you only use the admin

October 11, 2011 5 comments

If you have a project that only exposes the admin you should just use the 500/404 templates from the admin.

Put this in your project’s root urls.py:

from django.utils.functional import curry
from django.views.defaults import server_error, page_not_found

handler500 = curry(server_error, template_name='admin/500.html')
handler404 = curry(page_not_found, template_name='admin/404.html')

I wonder why django doesn’t mention those templates in the docs …

If you have other drop-in apps that need authentication (like rosetta or sentry), bear in mind that the admin doesn’t have a reusable login view, so you must hook one up yourself. You can just reuse the django admin’s login template. Put this in the urlpatterns (don’t forget to match it to LOGIN_URL in the settings):

    url(r'^accounts/login/$', 'django.contrib.auth.views.login', {'template_name': 'admin/login.html'}),

You might note that this is not very DRY, but the LOGIN_URL might actually differ from the one in the urlpatterns (eg: if you mount the django wsgi handler on a non-root path).
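The matching setting would look like this (the path here is just the common default; adjust it if your handler is mounted under a prefix):

```python
# settings.py
LOGIN_URL = '/accounts/login/'
```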


Tmux scripting

September 25, 2011 5 comments

I usually need to run more than one command for a project, and I got tired of searching through those putty windows for the session I want. So I thought of using a terminal multiplexer like tmux.

I’m using celery with two queues and I need to run this:

  • celeryd -Q queueA
  • celeryd -Q queueB
  • celerycam -E

I need celerycam because it will get those stats in djcelery up to date.

It’s also a good idea to tail the postgresql log – when you break your models or database in some cases Django isn’t very helpful so this helps a lot:

  • tail -f /var/log/postgresql/postgresql-8.4-main.log

I use a wide screen so I want a layout like this:

    |                                    |                   |
    |              runserver             |                   |
    |                                    |     celerycam     |
    +------------------------------------+                   |
    |                                    |                   |
    |               celeryd              +-------------------+
    |                                    |                   |
    +------------------------------------+                   |
    |                                    |   postgresql log  |
    |               celeryd              |                   |
    |                                    |                   |

I also want to start everything in a new tmux session from a single command so I can close everything easily: those celeryd’s don’t reload automatically :(

You’d usually run something like:

tmux new-session "tmux splitw 'command1'; tmux splitw 'command2'; tmux splitw 'command3'; command4"

but that gets rather long; you need to quote and escape, calculate the panel sizes manually (I want equal heights), and for the layout above you also need to select the right panes before splitting.

The commands vary across projects (some have more and some have less) – so how about we make a script:

import subprocess

left_commands = [
    "python manage.py runserver",
    "python manage.py celeryd -Q queueA -c 2 -E -n worker1",
    "python manage.py celeryd -Q queueB -c 2 -E -n worker2",
]
right_commands = [
    "python manage.py celerycam",
    "tail -f /var/log/postgresql/postgresql-8.4-main.log",
]
session = ''

if right_commands:
    session += 'tmux selectp -t 0; tmux splitw -hd -p 35 \"%s\"; ' % right_commands[-1]
for index, command in enumerate(right_commands[:-1]):
    session += 'tmux selectp -t 1; tmux splitw -d -p %i \"%s\"; ' % (
        100 / (len(right_commands) - index), command)

for index, command in enumerate(left_commands[1:]):
    session += 'tmux selectp -t 0; tmux splitw -d -p %i \"%s\"; ' % (
        100 / (len(left_commands) - index), command)
if left_commands:
    session += left_commands[0]

args = ['tmux', 'new-session', session]
print 'Running', args
subprocess.call(args)
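To sanity-check the string this builds, here’s a toy run of the same session-building logic with one fake command on each side:

```python
left_commands = ["echo left"]
right_commands = ["echo right"]
session = ''

if right_commands:
    session += 'tmux selectp -t 0; tmux splitw -hd -p 35 "%s"; ' % right_commands[-1]
for index, command in enumerate(right_commands[:-1]):   # empty with one command
    session += 'tmux selectp -t 1; tmux splitw -d -p %i "%s"; ' % (
        100 / (len(right_commands) - index), command)
for index, command in enumerate(left_commands[1:]):     # empty with one command
    session += 'tmux selectp -t 0; tmux splitw -d -p %i "%s"; ' % (
        100 / (len(left_commands) - index), command)
if left_commands:
    session += left_commands[0]
```

With a single command per side no extra splits are generated, so the session string is just one horizontal split plus the first left command.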

Measure your code

July 3, 2011 9 comments

I found a cool tool today to measure code: metrics. It measures SLOC, comments and cyclomatic complexity. It’s easy to install: pip install pygments metrics

Run this in your project’s root:

metrics -v `find . -type f \( -iname "*.css" -or -iname "*.py" -or -iname "*.js" -or -iname "*.html" -or -iname "*.txt" \) \! -path "*/migrations/*.py" -print`

I have a django project so I added \! -path "*/migrations/*.py" to skip any files that are in a migrations dir (I’d skip the automatically generated south migrations).

You probably bundle other libraries or apps in your source tree (eg: jquery or that nice django app the author didn’t bother to make a script for) so you want to measure only some specific paths. Eg, to collect stats only for files in src/foobar, lib/tools and src/otherbar:

metrics -v `find src/foobar lib/tools src/otherbar -type f \( -iname "*.css" -or -iname "*.py" -or -iname "*.js" -or -iname "*.html" -or -iname "*.txt" \) \! -path "*/migrations/*.py" -print`

If you work on multiple projects you can make a script or alias for this:

metrics -v `find \`cat METRICS\` -type f \( -iname "*.css" -or -iname "*.py" -or -iname "*.js" -or -iname "*.html" -or -iname "*.txt" \) \! -path "*/migrations/*.py" -print`

And in each project just save a METRICS file with the list of paths.
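The shell-fu above is dense, so here’s roughly what the alias expands to, built in Python (the paths and extensions are just the examples from this post):

```python
paths = ["src/foobar", "lib/tools", "src/otherbar"]  # contents of a METRICS file
extensions = ["css", "py", "js", "html", "txt"]

# join the -iname filters with -or, then wrap them in escaped parens for find
name_filters = " -or ".join('-iname "*.%s"' % ext for ext in extensions)
cmd = 'find %s -type f \\( %s \\) \\! -path "*/migrations/*.py" -print' % (
    " ".join(paths), name_filters)
```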

I get something like this for one random project:

Metrics Summary:
Files                       Language        SLOC Comment McCabe
----- ------------------------------ ----------- ------- ------
  129                         Python        4831     289    261
    2                      Text only           0       0      0
   49              HTML+Django/Jinja        1381      19    166
    7                     JavaScript        2204     231    352
   21                            CSS        1839     111      0
----- ------------------------------ ----------- ------- ------
  208                          Total       10255     650    779

Do the McCabe (aka cyclomatic complexity) ratios (McCabe/(SLOC-Comment)) look odd? (0.17 for javascript and 0.05 for python)
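The arithmetic, for the curious (numbers taken from the summary table above; the ratios in the text are truncated to two decimals):

```python
rows = {
    # language: (SLOC, Comment, McCabe) from the metrics summary above
    "Python": (4831, 289, 261),
    "JavaScript": (2204, 231, 352),
}
ratios = {}
for lang, (sloc, comment, mccabe) in rows.items():
    ratios[lang] = mccabe / float(sloc - comment)
```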

Do you know other measurement tools adequate for Django projects? Do tell me and, if possible, how to run them on specific files like above (I’m lazy :)
