Things I learnt in November 2022
- `str.endswith()` can take a tuple of possible endings instead of a single string
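For example, checking a filename against several archive extensions at once:

```python
>>> "archive.tar.gz".endswith((".tar.gz", ".tgz"))
True
>>> "notes.txt".endswith((".tar.gz", ".tgz"))
False
```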
About JACK and Debian
- There are 3 JACK implementations: jackd1, jackd2, pipewire-jack.
- jackd1 has mostly been superseded by jackd2 and, as far as I understand, can be ignored
- pipewire-jack integrates well with pipewire and the rest of the Linux audio world
- jackd2 is the native JACK server. When started, it takes over the sound card directly and will steal it from pipewire: non-JACK audio applications will likely cease to see the sound card until JACK is stopped and wireplumber is restarted. Pipewire should be able to keep working as a JACK client, but I haven't gone down that route yet
- pipewire-jack mostly works. At some point I experienced glitches in complex JACK apps like giada or ardour that went away after switching to jackd2; I have not investigated the glitches further
- So: try things with pw-jack. If you see odd glitches, try without pw-jack to use the native jackd2. Keep in mind, if you do so, that you will lose standard pipewire until you stop jackd2 and restart wireplumber.
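For instance, running a JACK application through pipewire's JACK support means prefixing it with the pw-jack wrapper (giada is an arbitrary example here; any JACK client works the same way):

```
$ pw-jack giada
```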
Python: typing.overload
`typing.overload` makes it easier to type functions whose behaviour depends on their input types. Functions decorated with `@overload` are ignored at runtime and only used by the type checker:
```python
from typing import overload

@overload
def process(response: None) -> None:
    ...
@overload
def process(response: int) -> tuple[int, str]:
    ...
@overload
def process(response: bytes) -> str:
    ...
def process(response):
    # <actual implementation>
    ...
```
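With those stubs, a type checker narrows the return type from the argument type, while the runtime only ever sees the final definition. A quick illustration (the comments show what a checker like mypy or pyright would infer):

```python
a = process(None)     # type checker infers None
b = process(42)       # type checker infers tuple[int, str]
c = process(b"data")  # type checker infers str
```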
Python's multiprocessing and deadlocks
Python's multiprocessing is prone to deadlocks in a number of situations. In my case, the running program was a standard single-process, non-threaded script, but it used complex native libraries which may have been what triggered the deadlocks.
The suggested workaround is using `set_start_method("spawn")`, but when we tried it we hit serious performance penalties.
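As an illustration, a minimal sketch of where the start method switch goes; the pool size and worker function are invented for the example:

```python
import multiprocessing

def worker(n: int) -> int:
    return n * n  # stand-in for the real workload

if __name__ == "__main__":
    # "spawn" starts clean interpreters instead of fork()ing the parent,
    # which avoids inheriting locks held by native libraries, at the cost
    # of re-importing (and re-initializing) everything in each worker
    multiprocessing.set_start_method("spawn")
    with multiprocessing.Pool(processes=4) as pool:
        print(pool.map(worker, range(16)))
```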
Lesson learnt: multiprocessing is good for prototypes, and may end up being too hacky for production.
In my case, I was already generating small Python scripts corresponding to worker tasks, which were useful for reproducing and debugging Magics issues, so I switched to running those as the actual workers. In the future, this may come in handy for dispatching work to HPC nodes, too.
Here's a parallel execution scheduler based on asyncio that I wrote to run them, which may come in handy for other projects, too.
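Without reproducing the original scheduler, here is a rough sketch of the idea: run standalone worker scripts as subprocesses, with a semaphore capping concurrency (the limit of 4 is arbitrary):

```python
import asyncio
import sys

async def run_script(path: str, semaphore: asyncio.Semaphore) -> int:
    # the semaphore limits how many workers run at once
    async with semaphore:
        proc = await asyncio.create_subprocess_exec(sys.executable, path)
        return await proc.wait()

async def main(scripts: list[str], max_workers: int = 4) -> None:
    semaphore = asyncio.Semaphore(max_workers)
    results = await asyncio.gather(
        *(run_script(s, semaphore) for s in scripts)
    )
    for script, code in zip(scripts, results):
        print(f"{script}: exit code {code}")

if __name__ == "__main__":
    asyncio.run(main(sys.argv[1:]))
```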
Debian:

- You can Build-Depend on `debhelper-compat (= version)` and get rid of `debhelper` as a build-dependency, and of `debian/compat` (details)
- You can Build-Depend on `dh-sequence-foo` and get rid of the corresponding `dh-foo` build-dependency, and of the need to add `--with foo` in `debian/rules` (details)
- You can (and should) get rid of `dh-buildinfo`, which is now handled automatically
- In salsa.debian.org there is a default CI pipeline for Debian packages that works beautifully without needing to add any `.gitlab-ci.yml` to a repository
- Add `Testsuite: autopkgtest-pkg-python` to `debian/control`, and you get a free autopkgtest that verifies that your packaged Python module can be imported. The default CI pipeline in salsa will automatically run the tests. (specification, details) A `debian/control` sketch combining these points is shown after this list.
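As a combined illustration, a minimal sketch of a `debian/control` source stanza; the package name and compat level are made up, and `dh-sequence-python3` stands in for whichever `dh-sequence-foo` you actually need:

```
# hypothetical source stanza combining the points above
Source: example
Build-Depends: debhelper-compat (= 13), dh-sequence-python3
Testsuite: autopkgtest-pkg-python
```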
Python:

- From Python 3.8, you can use `=` in format strings to make it easier to debug variables and expressions (details):

```python
>>> name = "test"
>>> print(f"{name=}")
name='test'
>>> print(f"{3*8=}")
3*8=24
```
Leaflet:

- `[abc].tile.openstreetmap.org` links need to be replaced with `tile.openstreetmap.org` (details)