
In the summer of 2020, I bid farewell to academia and transitioned into the pharmaceutical industry, where I have been ever since. Prior to that, from 2015 onward, I spent around six years as a PhD student and postdoc at the university. During this time, a significant part of my work revolved around developing software tools to analyse large-scale biomedical datasets. Some of these tools evolved into larger projects that were eventually published as R packages in the Bioconductor repository (MouseFM and Qtlizer) or made available as web-based services (Genehopper and Qtlizer).

However, academic software tends to have a short shelf life, as there is rarely a successor to take over its maintenance. Prof. Jeanette Erdmann († 2023), former head of the Institute of Cardiogenetics at the University of Lübeck, where I spent my PhD, generously covered the hosting costs for as long as the journals required the software to remain available. That period, however, passed a while ago, and since then I have wrestled with the decision of what to do next.

On the one hand, the simplest option would have been to shut everything down. On the other hand, these tools represent years of effort, dedication, and passion, an important part of my time in academia that I don't want to simply erase. More importantly, they are still being used. Though the underlying database is aging, the core functionalities remain intact, and I occasionally receive messages from users who rely on these tools, especially when the servers temporarily go offline due to delayed payments.

Ultimately, I decided not to let go but instead to package the server-side software in a way that makes it easier to maintain and run in the long term with minimal intervention. My solution? Put the software into Docker containers.

Docker is a containerization technology that I've been using for a while now. It still brings a smile to my face every time I see how effortlessly it eliminates dependency issues. It allows applications to be packaged with all their dependencies, libraries, and configurations in an isolated environment. Unlike traditional server setups, where software installations can become cumbersome and system dependencies may break over time, Docker ensures that the software runs exactly the same way wherever it is deployed: on a local machine, a cloud instance, or a different server altogether.
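To make this concrete, here is roughly what such a container recipe can look like. This is a minimal sketch assuming a hypothetical R-based API served with plumber; the base image, package list, file names, and port are placeholders for illustration, not the actual Genehopper or Qtlizer setup.

```dockerfile
# Minimal sketch of a Dockerfile for a hypothetical R/plumber web service.
# Base image, package list, file names and port are illustrative assumptions.
FROM rocker/r-ver:4.3.2

# Install the web framework the service needs
RUN R -e "install.packages('plumber', repos = 'https://cloud.r-project.org')"

# Copy the application code into the image
WORKDIR /app
COPY . /app

# Expose the API port and start the service
EXPOSE 8000
CMD ["Rscript", "-e", "plumber::pr_run(plumber::pr('api.R'), host = '0.0.0.0', port = 8000)"]
```

With a recipe like this, the whole environment is rebuilt from a single file, so nothing on the host machine can drift or break underneath the application.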

Using Docker, I was able to move my services to a new virtual private server (VPS) with minimal effort and, in the process, cut hosting costs by two-thirds. My tools now run with significantly lower maintenance overhead and should remain available to the research community for the foreseeable future. I will, however, miss the occasional contact with the few familiar faces I still know at the institute.
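For the curious, the deployment on the new VPS boils down to something like the following compose file. Again, this is a sketch with placeholder service and image names rather than my exact configuration; the part that matters is the restart policy, which brings the containers back up after a reboot or crash without any manual intervention.

```yaml
# Sketch of a docker-compose.yml on the VPS; names are placeholders.
services:
  qtlizer:
    image: qtlizer-web:latest    # hypothetical image built from the Dockerfile above
    ports:
      - "8000:8000"              # host:container
    restart: unless-stopped      # recover from reboots and crashes automatically
```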
