Why I care about how fast CPython is
Python is not really a fast language. People who use Python (me included) more or less accept this (or else it would be foolish to write things in the language), but still we'd often like our code to be faster. For a while the usual answer to this has been that you should look into PyPy in order to speed up Python. This is okay as far as it goes, and PyPy certainly can speed up some things, but there are reasons to still care about CPython's speed and wish for it to be faster.
The simple way of putting the big reason is to say that CPython is the universal solvent of Python. To take one example, Apache's mod_wsgi uses CPython; if you want to use it to deploy a WSGI application in a relatively simple, hands-off way, you're stuck with however fast CPython is. Another way that CPython is a universal solvent is that CPython is more or less everywhere; most Unix systems have a /usr/bin/python, for example, and it's going to be some version of CPython. Finally, CPython is what most people develop with, test against, write deployment documentation for, and so on; this is both an issue of whether a package will work at all and an issue of whether what it's doing defeats much of PyPy's speedups.
Thus, speeding up CPython speeds up 'all' Python in a way that improvements to PyPy seem unlikely to. Maybe in the future PyPy will be so pervasive (and so much of a drop-in replacement for CPython) that this changes, but that doesn't seem likely to happen any time soon (especially since PyPy doesn't yet support Python 3, and that's where the Python world is moving).
(Some people will say that speeding up things like Django web apps is unimportant, because web apps mostly don't use CPU anyway but instead wait for IO and so on. I disagree with this view on Python performance in general, but specifically for Django web apps it can be useful for your app to use less CPU in order to free up more of it for other things, and that's what 'going fast' translates to.)