» On NixOS, GPU programming and other assorted topics

June 23, 2018 | cpp development english linux | Adrian Kummerländer

So I recently acquired a reasonably priced second-hand CAD workstation featuring a Xeon CPU, plenty of RAM and - as the heart of the matter - a nice Nvidia K2200 GPU with 4 GiB of memory and 640 cores. The plan was that this would enable me to realize my long-held plans of diving into GPU programming - specifically using compute shaders to implement mathematical simulation type stuff. True to my previously described inclination to procrastinate on interesting projects by delving into other interesting topics, my first step towards realizing this plan was of course acquainting myself with a new Linux distribution: NixOS.

So after weeks of configuring I am now in the position of working inside a fully reproducible environment, declaratively described by a set of version-controlled text files1. The main benefit of this is that my project-specific development environments are now easily portable and consistent between all my machines: spending the morning working on something using the workstation and continuing said work on the laptop between lectures in the afternoon is as easy as syncing the Nix environments. This in turn is easily achieved by including the corresponding shell.nix files in the project's repository.

As an example this is the environment I use to generate this very website, declaratively described in the Nix language:

with import <nixpkgs> {};

stdenv.mkDerivation rec {
  name = "blog-env";
  env = buildEnv { name = name; paths = buildInputs; };

  buildInputs = let
    generate  = pkgs.callPackage ./pkgs/generate.nix {};
    preview   = pkgs.callPackage ./pkgs/preview.nix {};
    katex     = pkgs.callPackage ./pkgs/KaTeX.nix {};
  in [
    generate
    preview
    katex
  ];
}

Using this shell.nix file the blog can be generated using my mostly custom XSLT-based setup2 by issuing a simple nix-shell --command "generate" in the repository root. All dependencies - be it pandoc for markup transformation, a custom KaTeX wrapper for server-side math expression typesetting or my very own InputXSLT - will be fetched and compiled as necessary by Nix.

{ stdenv, fetchFromGitHub, cmake, boost, xalanc, xercesc, discount }:

stdenv.mkDerivation rec {
  name = "InputXSLT";

  src = fetchFromGitHub {
    owner = "KnairdA";
    repo = "InputXSLT";
    rev = "master";
    sha256 = "1j9fld3sh1jyscnsx6ab9jn5x6q67rjh9p3bgsh5na1qrs40dql0";
  };

  buildInputs = [ cmake boost xalanc xercesc discount ];

  meta = with stdenv.lib; {
    description = "InputXSLT";
    homepage    = https://github.com/KnairdA/InputXSLT/;
    license     = licenses.asl20;
  };
}

This will work on any system where the Nix package manager is installed, without any further manual intervention by the user. Where in the past I had to manually ensure that all dependencies were available - which included compiling and installing my custom site generator stack - I can now simply clone the repository and generate the website with a single command3.

It cannot be overstated how powerful the system management paradigm implemented by Nix and NixOS is. On NixOS I am finally free to try out anything I desire without fear of polluting my system and creating an unmaintainable mess, as everything can be isolated and garbage collected when I no longer need it. Sure, it takes some additional effort to maintain Nix environments and to write a custom derivation here and there for software that is not yet available4 in nixpkgs, but when your program works or your project compiles you can be sure that it does so because the system is configured correctly and all dependencies are accounted for - nothing works by accident5.

Note that the nix-shell based example presented above covers only a small subset of what NixOS offers. Besides shell environments, the whole system configuration - consisting of systemd services, the networking setup, my user GUI environment and so on - is also expressed in the Nix language, i.e. the whole system from top to bottom is declaratively described in a consistent fashion.

NixOS is the first distribution I have been truly excited about since my initial stint of distro-hopping when I first got into Linux a decade ago. Its declarative package manager and configuration model is true innovation and one of those rare things where you already know that you will never go back to the old way of doing things after barely catching a glimpse of it. Sure, other distros can be nice and I greatly enjoyed my nights of compiling Gentoo as well as years spent tinkering with my ArchLinux systems, but NixOS offers something truly distinct and incredibly useful. At first I thought about using the Nix and Scheme based GuixSD distribution instead, but I got used to the Nix language quickly and do not think that the switch to Guile Scheme as the configuration language adds enough to offset having to deal with GNU's free software fundamentalism6.

Of course I was not satisfied with merely porting my workflows onto a new, superior distribution but also had to switch from i3 to XMonad in the same breath. By streamlining my tiling window setup on top of this Haskell-based window manager my setup has reached a new level of minimalism. Layouts are now restricted to either fullscreen, tabbed or simple side-by-side tiling and everything is controlled using Rofi instances and keybindings. My constant need to check battery level, fan speed and system performance was fixed by removing all bars and showing only minimally styled windows. And due to the reproducibility7 of NixOS the interested reader can check out the full system for themselves if they so desire! :-) See the home-manager based user environment or specifically the XMonad config for further details.

After getting settled in this new working environment I finally was out of distractions and moved on to my original wish of familiarizing myself with delegating non-graphical work to the GPU. The first presentable result of this undertaking is my minimalistic fieldplay clone computicle.

computicle impression

What computicle does is simulate many particles moving according to a vector field described by a function f : \mathbb{R}^2 \to \mathbb{R}^2 that is interpreted as an ordinary differential equation to be solved using the classical Runge-Kutta method. As this problem translates into many similar calculations performed per particle, without any communication between particles, it is an ideal candidate for massive parallelization using GLSL compute shaders on the GPU.

#version 430

layout (local_size_x = 1) in;
layout (std430, binding=1) buffer bufferA{ float data[]; };

vec2 f(vec2 v) {
    // example vector field; the actual expression is omitted in this extract
    return vec2(
        cos(v.y),
        sin(v.x)
    );
}

vec2 classicalRungeKutta(float h, vec2 v) {
    const vec2 k1 = f(v);
    const vec2 k2 = f(v + h/2. * k1);
    const vec2 k3 = f(v + h/2. * k2);
    const vec2 k4 = f(v + h    * k3);

    return v + h * (1./6.*k1 + 1./3.*k2 + 1./3.*k3 + 1./6.*k4);
}


void main() {
    const uint i = 3*gl_GlobalInvocationID.x;
    const vec2 v = vec2(data[i+0], data[i+1]);
    const vec2 w = classicalRungeKutta(0.01, v);

    data[i+0]  = w.x;  // particle x position
    data[i+1]  = w.y;  // particle y position
    data[i+2] += 0.01; // particle age
}

As illustrated by this simplified extract of computicle's compute shader, writing code for the GPU can look and feel quite similar to targeting the CPU in the C language. It is fitting that my main gripes during development were not with the GPU code itself but rather with the surrounding C++ code required to pass the data back and forth and to talk to the OpenGL state machine in a sensible manner.

The first issue was how to include GLSL shader source into my C++ application. While the way OpenGL accepts shaders as raw strings and compiles them for the GPU on the fly is not without benefits (e.g. switching shaders generated at runtime is trivial) it can quickly turn ugly and doesn’t feel well integrated into the overall language. Reading shader source from text files at runtime was not the way I wanted to go as this would feel even more clunky and unstable. What I settled on until the committee comes through with something like std::embed is to include the shader source as multi-line string literals stored in static constants placed in separate headers. This works for now and offers at least syntax highlighting in terms of editor support.

What would be really nice is if the shaders could be generated from a domain-specific language and statically verified at compile time. Such a solution could also offer unified tools for handling uniform variables and data buffer bindings. While something like that doesn’t seem to be available for C++8 I stumbled upon the very interesting LambdaCube 3D and varjo projects. The former promises to become a Haskell-like purely functional language for GPU programming and the latter is an interesting GLSL generating framework for LISP.

I also could not find any nice and reasonably lightweight library for interfacing with the OpenGL API in a modern fashion. I ended up creating my own scope-guard type wrappers around the OpenGL functionality required by computicle; while the result looks nice, it is probably of limited portability to other applications.
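The underlying idiom is simple enough to sketch generically: a guard object binds some state on construction and restores it when it goes out of scope. The classes below are a simplified illustration of the pattern and not computicle's actual wrappers, with the OpenGL calls only indicated in comments:

```cpp
#include <functional>
#include <utility>

// Minimal scope guard: runs the given callback when it is destroyed.
// The idea is to pair e.g. glUseProgram(id) with glUseProgram(0)
// so that OpenGL state cannot leak out of a C++ scope.
class ScopeGuard {
public:
    explicit ScopeGuard(std::function<void()> on_exit)
        : on_exit_(std::move(on_exit)) {}
    ~ScopeGuard() {
        if (on_exit_) { on_exit_(); }
    }
    ScopeGuard(const ScopeGuard&) = delete;
    ScopeGuard& operator=(const ScopeGuard&) = delete;
    ScopeGuard(ScopeGuard&& rhs) noexcept
        : on_exit_(std::exchange(rhs.on_exit_, nullptr)) {}
private:
    std::function<void()> on_exit_;
};

// hypothetical shader wrapper following this pattern
class Shader {
public:
    ScopeGuard use() {
        // glUseProgram(id_);  // bind
        bound = true;
        return ScopeGuard([this]() {
            // glUseProgram(0);  // unbind on scope exit
            bound = false;
        });
    }
    bool bound = false;
};
```

Nested guards then naturally mirror the nesting of OpenGL binding state, which is what makes the update loop below readable in the first place.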

// simplified extract of computicle's update loop
window.render([&]() {
    if ( timer::millisecondsSince(last_frame) >= 1000/max_ups ) {
        {
            auto guard = compute_shader->use();

            compute_shader->setUniform("world", world_width, world_height);
            // ...dispatch the compute shader...
        }

        last_frame = timer::now();

        {
            auto texGuard = texture_framebuffers[0]->use();
            auto sdrGuard = scene_shader->use();

            scene_shader->setUniform("MVP", MVP);
            // ...draw the particles into the texture framebuffer...
        }

        {
            auto guard = display_shader->use();

            display_shader->setUniform("screen_textures",      textures);
            display_shader->setUniform("screen_textures_size", textures.size());
            // ...draw the textures to the screen...
        }
    }
});

One idea that I am currently toying with regarding my future GPU-based projects is to abandon C++ as the host language and instead use a more flexible9 language such as Scheme or Haskell for generating the shader code and communicating with the GPU. This could work out well as the performance of the CPU code doesn't matter as much when the bulk of the work is performed by shaders. At least this is the impression I got from my field visualization experiment - the CPU load was minimal, independent of how many kiloparticles were simulated.

computicle impression

  1. See nixos_system and nixos_home

  2. See the summary node or Expanding XSLT using Xalan and C++

  3. And this works on all my systems, including my Surface 4 tablet where I installed Nix on top of Debian running in WSL

  4. Which is not a big problem in practice as the repository already provides a vast set of software and builders for many common build systems and adapters for language specific package managers. For example my Vim configuration including plugin management is also handled by Nix. The clunky custom texlive installation I maintained on my ArchLinux system was replaced by nice, self-contained shell environments that only provide the LaTeX packages that are actually needed for the document at hand.

  5. At least if you are careful about what is installed imperatively using nix-env or if you use the --pure flag in nix-shell

  6. Which I admire greatly - but I also want to use the full power of my GPU and run proprietary software when necessary

  7. And the system really is fully reproducible: I now tested this two times, once when moving the experimental setup onto a new SSD and once when installing the workstation config on my laptop. Each time I was up and running with the full configuration as I left it in under half an hour. Where before NixOS a full system failure would have incurred days of restoring backups, reconstructing my specific configuration and reinstalling software I can now be confident that I can be up and running on a replacement machine simply by cloning a couple of repositories and restoring a home directory backup.

  8. At least when one wants to work with compute shaders - I am sure there are solutions in this direction for handling graphic shaders for gaming and CAD type stuff.

  9. Flexible as in better support for domain-specific languages

» Notes on BlackBerry OS 10 development in 2017

October 3, 2017 | cpp development english | Adrian Kummerländer

I recently broke my seven-year streak of only using smartphones in the tradition of Nokia’s fabled N900. The reason for this change was the growing age of my Jolla phone combined with my continued desire for a working physical keyboard1. In today’s market overflowing with faceless, exchangeable, uninspired and uniform touch-only devices this left me no choice but to opt for the last true BlackBerry: The Passport.

Thoughts on OS 10

The BlackBerry Passport is the last device that was both manufactured by RIM and runs their QNX based OS 10 operating system. OS 10 offers everything you can expect from a state of the art smartphone operating system. Furthermore its communication stack is beyond comparison and renders any third-party mail, calendar, task and contact applications redundant - as one would expect of a device built by the erstwhile mobile handset pioneer. The system integrates perfectly with the hardware and features true multitasking that is exposed to the user in a Maemo-like fashion. One very nice feature is the keyboard’s touch capability that allows for text-navigation and scroll movements without touching and thus obscuring the actual screen. Although the native app catalog obviously doesn’t feature as many applications as are available on the big platforms I don’t miss anything2.

QNX is a unixoid real-time operating system that is very interesting in its own right. A nice side benefit of this is the ability to run a native terminal emulator and execute cross-compiled UNIX tools in the way I got used to during my time with Maemo and Sailfish OS. Sadly BlackBerry OS 10 is deprecated and won’t receive future support as the whole platform has been discontinued3.

Luckily I found the system to be very stable and feature complete. Thus I don’t see any immediate problems requiring continued support - as long as I can compile and install my own code.

OS 10 application development is based mostly on C++ and Qt/QML which suits my preferences quite nicely. Additionally the default stack includes a Python 3.2 interpreter as well as the ability to install and run arbitrary APKs.

Momentics IDE

BlackBerry offers a native SDK which bundles an Eclipse-based IDE called Momentics in addition to a full set of CLI utilities spanning the whole development process. Installing and running it on my preferred distribution was straightforward. The only issue I encountered was the need to manually pass the path to an older XULRunner instance:

./qde -Dorg.eclipse.swt.browser.XULRunnerPath=~/tmp/xul/xulrunner/

~/tmp/xul/xulrunner contains the extracted contents of a matching XULRunner 10.0 archive.

While enabling the development mode in the device's settings menu is just a toggle flick away and only requires a device password to be defined, actually installing a program archive generated by Momentics requires one to register with RIM in order to be able to sign the archive. Fortunately this process is completely free and devoid of any further organizational hurdles.

When I first tried to connect to my Passport via the IDE I encountered an SSL authentication error. All my attempts at solving this by temporarily disabling authentication via JVM flags failed. This issue also extended to browser-based device access using e.g. the BB10 App Manager, which failed due to an invalid server response. In fact the only thing that worked was the SSH connection initiated via the SDK's CLI utilities:

./blackberry-connect $device_ip -password $device_password -sshPublicKey ~/.

After some investigation I was able to trace this issue to some custom system update trickery I performed during the first days of setting up the device4. Luckily this was easily solved by resetting the device to the latest OS 10 version via the official autoloader.

Momentics IDE

After solving this issue Momentics turned out to be a workable IDE that seems to be enough for most purposes. Should this turn out to be a premature assessment, it will be worthwhile to check out older versions of Qt Creator which included BlackBerry 10 support. And if everything visual fails there are still the SDK's CLI utilities that allow for fully text-driven packaging, signing and deployment of applications:

λ ~/t/m/b/h/l/x/u/bin ● ls | grep blackberry

Device-local development

As the alert reader might have noticed everything described in the last section hinges on the continued willingness of BlackBerry to generate and distribute development certificates without which one cannot deploy applications to physical devices. This is probably one of the primary tradeoffs one makes when leaving the world of open mobile systems behind.

Luckily there is a way to cross compile, install and run arbitrary CLI programs without even enabling the development mode: mordak/playbook-dev-tools.

This set of scripts, developed by the same person who wrote the premier native terminal emulator for OS 10, offers a fully automated system that compiles and installs GCC 4.6.3 amongst further development and system utilities.


It remains to be investigated if this entry point can be expanded to device-local development of full Cascades-based UI applications.

  1. i.e. something better than the TOHKBDv2

  2. Especially since it offers an Android runtime analogously to what I am used to on Sailfish OS

  3. I am not convinced by RIM’s externally manufactured Android smartphone. Neither the build quality nor the software are anywhere near to what they offered with the Passport. I fear that they will further degrade the keyboard quality in future devices and expect them to fully disappear from the market within the next couple of years.

  4. Namely impatiently forcing an OS upgrade using Sachesi

» Tinkering with meta tools

January 20, 2017 | english linux opinion | Adrian Kummerländer

If the contents of this website are of interest to you chances are that you also know how easy it is to spend large parts of one’s leisure time tinkering with computer systems and improving workflows while losing sight of what one actually wants to accomplish. I am going to use this article to document parts of my current personal computing setup in an attempt to summarize and refocus my efforts in this direction in addition to articulating some of my thoughts in this context.

Working with computers is my job to the extent that computers are the machines that my software developing self instructs to - hopefully - solve actual problems in the real world™. This means that configuring and optimizing my personal computing environments is not only a leisure time activity but can be considered a productive activity within certain boundaries1. If I think of time-consuming optimization activities that definitely yield payoff in my day-to-day work, a few things come to mind: learning and mastering a single powerful text editor2, switching to an open operating system and using open source software wherever possible, using a tiling window manager, version controlling everything, maintaining a private internet archive, frequently experimenting with new programming languages and paradigms as well as studying mathematics. I am going to use the following sections to further explore some of these activities.

Text editing

Plain text is still the most used format for expressing and storing program instructions for all common and nearly all uncommon languages out there. Although the simplicity and thus flexibility of plain text continues to cause distracting discussions around topics such as tabs-vs-spaces3, there sadly is no usable alternative in the vein of e.g. generalized tree encodings available at this point in time. S-expressions could probably be used as a foundation for such an alternative, but alas we don't live in the alternative timeline where the adoption of a LISP as the internet's scripting language - instead of the quickly hacked together thing that is JavaScript - caused a widespread rise of LISP-like languages, culminating in all common languages being LISP dialects and thus being expressible in sexprs.

To return to the topic at hand: plain text it is and will continue to be for the foreseeable future. If one looks for powerful, widely available and extensible text editors the choice will in most cases come down to the still reigning giants Vim and Emacs4.
When I stood before this decision nearly half a lifetime ago5, my choice fell on Vim. If one only considers the default editing interface this definitely was the right choice for me, as I would not trade Vim's modal text manipulation language for overly long default Emacs key chords. But I have to admit that I regularly take wistful looks at some of the applications available inside Emacs, such as Org-mode. Had I already discovered the joys of LISP when I was trying to decide between the two editors I probably would have chosen Emacs. Luckily one can always revisit such decisions - sooner or later I will probably settle on some kind of Evil based setup just to be able to use the text based operating system that is Emacs.

a Vim session in its natural environment

Although the 122 lines including comments that currently make up most of my Vim configuration are not much by most standards, I have invested quite some time in setting the editor up to my taste - not least by writing my very own color scheme to match the rest of my primary desktop environment.

Over time I have grown so accustomed to Vim’s keybindings that I now use them wherever possible including but not restricted to: my web browser, PDF reader, various IDEs, image viewer and window manager of choice.

Operating system and desktop environment

ArchLinux continues to be the Linux distribution of choice for most of my computing devices. It offers most of the flexibility of Gentoo if one needs it while preserving fast installation and frequent updates of most of the software I require. The only feature I currently miss and that will probably lead me to switch distributions sooner or later is declarative package management as offered by NixOS or GuixSD. A fully declarative configuration of my Linux installation in a plain-text and thus easily version-controllable file would be the logical conclusion of my current efforts in managing system configurations: /etc is tracked using etckeeper, various dotfiles are tracked using plain git and GNU stow for symlink management.

Example of a basic i3 session

My choice of desktop environment has converged on a custom setup built around the i3 tiling window manager over the last couple of years. I have adopted the tiling concept to such a degree that I struggle to think of a single advantage a common - purely floating - window manager might hold over tiling window managers. This holds especially true if your workflow is similar to mine, i.e. you have lots and lots of terminals and a few full screen applications running at most times as well as a fondness for using the keyboard instead of shoving around a mouse.

More complex unstaged example of an i3 session

What I like about i3 in particular is that it doesn’t enforce a set of layouts to toggle between like some other tiling WMs but allows you full control over the tree structure representing its layout. Another nice feature is that it lends itself to Vim-style keybindings as well as offering good support for multi monitor setups.

Compared to a desktop environment such as KDE a custom setup built around a window manager obviously requires much more configuration. In exchange for that I gained full control over my workflow in addition to basically instantaneous interaction with the system and its applications.

I find it very useful to have a certain set of information available at all times. Examples of such information are dictionary definitions, language and framework documentation, login credentials and notes. For this purpose I wrote a set of scripts that enable me to query a local dict daemon6, note folders and passman entries using the dmenu replacement Rofi. In addition to that, i3's scratchpad feature is very useful for keeping Zeal and Vim instances running in the background while preserving accessibility in every context by displaying them in a floating window overlaying the current workspace.

Archiving, web browsing and note taking

An important purpose of the tool we call Computer is accessing information on the Internet. This task is achieved primarily using a web browser - in fact there is an argument to be made that web browsers are some of the most complex applications the average user comes into contact with.
I sometimes frown at the wasteful complexity many of today's websites exhibit even if their actual contents still consist of mostly plain text and some pictures. It is no longer the exception but the rule that a single load of a common website pulls more data over the wire than would be required to encode a whole book, while often containing far less content. These are the moments where I wish that Gopher had gained wider usage or that something in the vein of ipfs will grow to encompass most of the sources I commonly use7.

Current Pentadactyl, TreeStyleTabs and ScrapBook X augmented Firefox setup

As one can see in the screenshot above, the current browser setup I use to deal with this rather unsatisfying state of things is a quite customized version of Firefox. This is achieved using Pentadactyl, TreeStyleTabs and ScrapBook X.

Pentadactyl transforms Firefox into a fully keyboard driven browser with Vim-like keybindings and looks as well as the possibility of configuring the browser using a single dotfile .pentadactylrc.

TreeStyleTabs allows me to better manage my workflow of keeping all tabs related to my current projects and interests opened as well as grouped in a tree structure. While Vim-like keybindings can be configured in other browsers, usable TreeStyleTabs like tab management is currently only available on Firefox.

ScrapBook X is my current answer to a question I have spent quite some time thinking about: how to best archive and organize interesting documents on the web. ScrapBook allows reasonably good offline archival of websites, organizing archived pages in a folder structure and exporting an HTML version of itself for hosting a private Internet Archive. It doesn't work perfectly on all websites and uses plain file mirroring instead of the more suitable WARC format, but it is the best solution I was able to find short of investing most of my time into attempting to develop something more to my taste.
For now it is good enough and fulfills my primary use cases: having access to most of my sources independent of network availability8, preventing the unpleasant situation of wanting to consult a vaguely remembered source only to find that it is no longer available, as well as full text search over all interesting pages I have visited.

Archiving the web ties into the last section of this article: note taking. While I write all my lecture notes and exercise sheet solutions using pen input on one of Microsoft's Surface9 devices, I like to capture project and research notes as well as general thoughts using a keyboard on my normal computer.
When taking notes as plain text I preferably want to do so using Vim, which rules out most of the already relatively limited selection of open source desktop wiki software. After quite some time using VimWiki I currently use markdown files stored in a flat directory structure. Desktop integration is solved using a background Vim instance running in an i3 scratchpad as well as a Rofi based note selection dialog that communicates with Vim using remote commands. Advanced markdown features, syntax highlighting and conversion are based on pandoc integration plugins.
In addition to that I also now and then play around with Emacs' Org-mode, which can probably fulfill most of my requirements but requires considerable upfront configuration effort to work consistently with Evil for Vim-style usage.


While the setup outlined above is workable I definitely do not consider it a permanent solution. Even more so I feel that there is much unexplored potential in augmenting not just note taking but formal and exploratory thinking in general using computers. What I mean by that are systems as they were envisioned in the form of the Memex or as described by Engelbart in Augmenting Human Intellect10.

Such a system would not only encompass note taking and knowledge management but form an integral part of the user-facing operating system. Everything you see on a screen should be explorable in the vein of Lisp Machines or Smalltalk instances, annotatable in a fully interlinkable and version-controlled knowledge base that transparently integrates information from the Internet in a uniform fashion, as well as shareable with other beings where required.
Formal verification software could look over the figurative shoulder of the user and point out possible fallacies and formal errors. Even a single component of such a system like for example a math environment that pulls up appropriate definitions, theorems and examples depending on the current context or a literate development environment that automatically pulls up documentation, stackexchange threads and debugging output would be of great help.
Usage tracking similar to Gnome Zeitgeist in conjunction with full text archival of all visited websites and fully reproducible tracking of the desktop state would also complement such a system.

In conclusion: My current setup doesn’t even scratch the surface of what should be possible. Tinkering with my computing systems and workflow-optimization will continue to be an unbounded but at least somewhat productive leisure time sink.

  1. …that I have probably crossed frequently

  2. Section 16, Power Editing, The Pragmatic Programmer.

  3. Tabs for indentation, spaces for alignment it is for me - this is probably the reason for my dislike of whitespace-dependent languages that tend to interfere with this personal preference of mine.

  4. Desktop-only text editors - especially those Electron based ones written in HTML and JavaScript of all things - are not an alternative for my use cases. Although at least VSCode is developing nicely into a usable cross platform IDE.

  5. Luckily not that long in my case :-)

  6. Serving a collection of freely available dictionaries such as Webster’s 1913 and Wordnet

  7. Note to self: Make the raw markdown sources (?) of this blog available on ipfs

  8. e.g. when riding the black forest train from Karlsruhe to my home village in the Hegau.

  9. The only non-Linux device I regularly use. Pen input on tablet devices has reached a point of general viability (concerning power usage and form factor) in recent years. I am actually very satisfied with my Surface 4 and Windows 10 as both a note taking and tablet device. For tinkering and security reasons I still hope to one day be able to use an open operating system and open note taking software on such a device but for now it is a workable solution for my use case of studying (mostly) paperless.

  10. Augmenting Human Intellect: A Conceptual Framework, D. C. Engelbart, October 1962