2019-04-26
Brief notes on making Prometheus instant queries with curl
Every so often, I wind up wanting to directly extract some information from Prometheus without going through a Grafana dashboard or even the Prometheus web UI. At this point I have stubbed my toes on the same issues enough times (ie, more than once) that I'm going to write down some how-to information for my future reference.
In general, the Prometheus HTTP API itself is tersely documented in the obvious spot in the documentation; see also Extracting raw samples from Prometheus. Since it returns JSON, you'll probably want some JSON processing program for command line usage; I'm fond of jq, which at this point should be in everyone's toolbox.
Making a query requires at least a PromQL expression. In simple cases where all you want is all metric points for a bare metric, you can give this directly on the command line for curl, eg:
curl 'localhost:9090/api/v1/query?query=up'
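The answer comes back as a JSON instant query result, where each entry in .data.result has a 'metric' label set and a 'value' pair of timestamp and value; this is where jq earns its keep. As a minimal sketch, something like this should print one 'instance value' line per time series:
curl -s 'localhost:9090/api/v1/query?query=up' | jq -r '.data.result[] | [.metric.instance, .value[1]] | @tsv'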
In more complex cases you'll have a PromQL query that requires encoding to be properly interpreted, and perhaps extra parameters on the query. Fortunately Prometheus accepts POST requests just like GET requests, and curl will do the work for us:
curl --data-urlencode 'query=up{job="alertmanager"}' --data-urlencode 'time=1556330052' localhost:9090/api/v1/query
Generally I want to use the -s flag with curl, to make it not spit out status information when I'm feeding its output to jq. I may or may not want to use -f to avoid server error pages. In scripts I want some sort of timeout, such as '-m 10'.
Some sources will provide curl command lines that also use -k. You don't want to use that flag unless you know what it means and you know that you need it and it's safe to use it.
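Putting the useful flags together, a script-ready version of the earlier query might look something like this (a sketch; adjust the flags and the timeout to taste):
curl -sf -m 10 --data-urlencode 'query=up{job="alertmanager"}' localhost:9090/api/v1/query | jq .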
(Possibly there is a clever way to get curl to URL-encode your query parameters in GET requests, but if so I didn't see it in a casual skim of the very large curl manpage. I don't think it matters for querying Prometheus, since Prometheus accepts either GET or POST requests here.)
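(It appears that curl's -G / --get flag can do this: it turns --data-urlencode parameters into a URL-encoded query string on a GET request instead of a POST body. Treat this as a hedged aside, but something like
curl -sG --data-urlencode 'query=up{job="alertmanager"}' localhost:9090/api/v1/query
should behave the same as the POST version.)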
PS: Extracting raw samples from Prometheus says that 'the remote read endpoint at api/v1/read [...] is more difficult to use [...]'. That would be a polite understatement. It turns out that Prometheus remote read requests and replies are snappy-compressed protobufs, following schemas that you can find here. You can apparently work with protobufs from the command line with curl, per this gist example, but I don't know how you'd handle the snappy compression side of things.
Sidebar: Sending data to Pushgateway with curl
Since I do this too, I might as well add my usual curl command for it here:
<generate metrics as text> | curl -X PUT -f -s --data-binary @- localhost:9091/metrics/job/...
I almost always use PUT here instead of the default of POST because of when and what metrics disappear on updates in Pushgateway.
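As a concrete sketch, pushing a single ad-hoc metric in the plain text exposition format looks something like the following; the metric name and job name here are made up for illustration:
# hypothetical metric and job names
echo 'myscript_last_run_timestamp 1556330052' | curl -X PUT -f -s --data-binary @- localhost:9091/metrics/job/myscript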
Various aspects of Python made debugging my tarfile problem unusual
I was recently thinking about what I like when I use Python, and in the process I wound up reflecting on how working out that the tarfile module is too generous about what is a tar file was made different and easier by various aspects of Python. I'm not going to say that I couldn't have worked out a similar problem in, say, Go, but if I had, I think it would have been a rather different experience.
One aspect of CPython specifically is that a lot of the standard library is written in Python and so intrinsically has its source code available even on a standard Python install (because the source code is what CPython will run). You don't have to try to install debugging symbols or fetch a source package; I could just go find tarfile.py and read it immediately. This reduced friction is part of what made me actually go digging in the first place, because it wasn't that much work to take a quick peek to see if I could figure out what was going on (then things snowballed from there).
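(If you want to find where your Python's copy of tarfile.py actually lives, the easy and generic way is to ask Python itself:
python3 -c 'import tarfile; print(tarfile.__file__)'
This works for any pure-Python standard library module.)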
Once I was poking at the tarfile module, another useful Python peculiarity became important. Python lets you use (or abuse) the import path to provide your own versions of modules from the standard library, preempting the stock version. I could copy my program to a scratch directory, copy the tarfile.py from the Python distribution to the same directory, and start adding print statements and so on to understand the flow of execution through the module's code. I didn't have to change the 'import tarfile' in my own program to another name or another path, the way I would have had to in some other languages.
(This was useful for more than using a hacked tarfile.py for diagnosing things. It also meant that when I thought I had a workaround in my own code, I could rename my tarfile.py and have my program instantly revert to using the stock Python tarfile module, so I could verify that my fix wasn't being influenced by my tarfile.py hacks.)
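As a rough sketch of the mechanics (the stdlib path here is illustrative; where the stock tarfile.py lives varies from install to install), using the same __file__ check as above to see which copy wins:
# copy the stock module next to the program in a scratch directory
cp /usr/lib/python3.7/tarfile.py /tmp/scratch/
cd /tmp/scratch
# the local copy now preempts the stock one
python3 -c 'import tarfile; print(tarfile.__file__)'
# rename it and the stock module is back in use
mv tarfile.py tarfile.py-hacked
python3 -c 'import tarfile; print(tarfile.__file__)'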
Everyone cites Python's interactive interpreter and the ease of examining objects in it as great advantages, and I'm not going to argue; certainly I've used it for lots of exploration. Once I had things narrowed down to what I thought was the cause, the interactive interpreter was the fastest place to get to running code and so the best environment to quickly try out my guesses. In other languages I might have to fire up an editor to write a program or at least some tests, or craft a carefully built input file for my program.
(Technically it also sort of made for a pretty minimal reproduction case in my eventual bug report, because I implicitly assumed I didn't need to write up anything more than what would be needed to duplicate it inside an interactive interpreter.)
The cycle of editing tarfile.py and re-running my program to test and explore the module's behavior was probably not any faster in Python than it might have been in a non-interpreted language, but it felt different. The code I was editing was what was actually running a few moments later, not something that was going to be transformed through a build process. And for some reason, Python code often feels more mutable to me than code in other languages (perhaps because I perceive it as having less bureaucracy, due to dynamic typing and the ability to easily print out random things and so on).
Overall, I think the whole experience felt more lightweight and casual in Python than it would have in many other languages I'm familiar with. I was basically bashing things together and seeing how far I could get with relatively little effort, and the answer turned out to be all the way to a standard library bug.