I had one of those trivial chores today — adding a new queue to our Sidekiq worker configuration.

It seems like one of those unthinking, mechanical tasks. Just find the list that defines the queues, add a new item, save, commit, send for code review.

I had already committed the code change, but not yet sent it for review, when I noticed a code comment.

Among other things, I was changing some Helm kustomization files that pertain to our Sidekiq setup. The file looked something like this (not a real code sample, just to give you the idea):

sidekiq:
  deployment:
    keda:
      enable: true
  deployments:
    default:
      config:
        queues:
          # This list must be kept in sync with the autoscaling config below
          - fast
          - slow
          - medium
      spec:
        maxReplicaCount: 3
        triggers:
          - type: prometheus
            metadata:
              serverAddress: http://<some-prometheus-cluster>:9090
              threshold: "150"
              query: "sidekiq_queue_size{queue=~"(fast|slow|medium)"}

The comment says: "This list must be kept in sync with the autoscaling config below."

The idea is that if you change which queues are handled by a pool of Sidekiq workers, you want to edit the autoscaling configuration at the same time. Otherwise, you are going to see weird system behavior when your new queue builds up a long backlog and autoscaling doesn’t notice…
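
To make the "keep in sync" part concrete, here is the relevant part of the same made-up snippet after adding a hypothetical new queue (call it reports). The queue name has to appear in both places:

sidekiq:
  deployments:
    default:
      config:
        queues:
          # This list must be kept in sync with the autoscaling config below
          - fast
          - slow
          - medium
          - reports   # new queue added to the worker pool here...
      spec:
        maxReplicaCount: 3
        triggers:
          - type: prometheus
            metadata:
              serverAddress: http://<some-prometheus-cluster>:9090
              threshold: "150"
              # ...and added to the autoscaling query regex here
              query: 'sidekiq_queue_size{queue=~"(fast|slow|medium|reports)"}'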

Nothing forces you to keep the two parts of the configuration in sync, except your own reading skills.

And I almost missed it. Which would have been embarrassing.

But here’s the funny thing: I wrote that comment.

It was sometime last year, when I last worked on this area of our system.

I’m sure I must have thought to myself: It would be so easy to edit the queue configuration without updating the KEDA configuration. I’ll just leave a comment to point out this footgun.

Nine months later — the comment worked perfectly, and saved me from myself.

You can’t rely on anyone reading the comments (not even yourself, apparently). But still.

