Red Hat Research Quarterly

Research project updates—August 2023

Each quarter, RHRQ highlights new and ongoing research collaborations from around the world in one or more of our key areas of interest: AI and machine learning, hybrid cloud/research infrastructure, edge computing, and trust.

This quarter we highlight collaborative projects with university partners at Boston University and the University of Massachusetts Lowell. Contact us for more information on any project described here, or explore research projects at

In the Red Hat Research group, we don’t just write papers and technology roadmaps—we actually get to build systems and software to try out crazy and not-so-crazy new ideas. We work directly with students, professors, and practicing engineers from inside and outside Red Hat on proof-of-concept demos and prototypes that let us evolve ideas, try them out in real-world research environments, and get the honest feedback needed to decide whether a project can successfully transition to the open source project ecosystem. Our worldwide graduate and undergraduate student interns make essential contributions to open source research projects throughout the year, but summer is an especially busy time for US projects, when both students and researchers can roll up their sleeves and dive into one project for a few months. Here is a sampling of work in progress in a few highlighted research areas. We share many of the final summer project presentations in live and recorded Research Interest Group sessions, which can be found on the events page of our website and on the project and people pages. Come on in—the water’s fine!
—Heidi Picher Dempsey, US Research Director for Red Hat 

AI and machine learning

  • Open source education (OPE) tools
    Ke Li and Griffin Heyrich of Boston University are working with the Open Education (OPE) team at BU and Red Hat to improve tools for building and using dynamic textbooks based on Jupyter Notebooks and the Red Hat OpenShift Data Science (RHODS) machine learning platform. They are improving how the team builds testing workflows and textbook images, and they are working to make it easier to start and run environments for a live class of about 300 students. Danni Shi detailed the Open Education project in “Open source education: from philosophy to reality” (RHRQ May 2023).
  • ML pipelines and file-system-based vulnerability detection
    Zhongshun Zhang of Boston University expanded work done as part of the AI for Cloud Ops research project by using the Praxi pipeline in the Mass Open Cloud to detect filesystem changes in an OpenShift cluster, tokenize package installation logs, and train an ML model to predict what packages are installed on a given system. He used RHODS and Kubeflow pipelines to evaluate multiple inference models. This tool will eventually have applications in vulnerability detection and validated software supply chains.
  • Fine-tuning LLMs for documentation retrieval and question answering
    Christina Xu’s work bridges AI, cloud, and edge research areas to determine whether ML models trained on large amounts of detailed technical networking documentation can provide useful answers to questions from engineers attempting to create and optimize edge networks. Christina, a BU graduate, is working to understand the capabilities and limitations of LLMs in this context, investigating techniques to reduce model size without decreasing accuracy and testing the results of her work with knowledgeable engineers. She hopes this work will eventually produce an open source tool to share with network designers and maintainers.
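The package-prediction idea behind the Praxi pipeline described above can be sketched in a few lines. This is only an illustration of the approach, not the project’s code: the log snippets and package names below are invented, and a bag-of-words classifier stands in for the models evaluated with RHODS and Kubeflow pipelines.

```python
# Toy sketch: tokenize package installation logs and train a classifier
# that predicts which package produced a given log fragment.
# All training data here is invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Synthetic "installation log" snippets labeled by the package installed.
logs = [
    "unpacking nginx-1.24 writing /etc/nginx/nginx.conf /usr/sbin/nginx",
    "unpacking postgresql-15 writing /var/lib/pgsql /usr/bin/psql",
    "unpacking nginx-1.24 creating /usr/share/nginx/html",
    "unpacking postgresql-15 creating /usr/lib/postgresql",
]
labels = ["nginx", "postgresql", "nginx", "postgresql"]

# Treat paths and words as tokens, then fit a simple probabilistic model.
model = make_pipeline(
    CountVectorizer(token_pattern=r"[\w./-]+"),
    MultinomialNB(),
)
model.fit(logs, labels)

# Predict the package responsible for an unseen filesystem change.
print(model.predict(["writing /etc/nginx/nginx.conf"])[0])
```

The real pipeline works on filesystem change sets captured in an OpenShift cluster rather than raw text, but the shape of the problem—tokenize changes, then classify—is the same.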


Hybrid cloud

  • Podman machine improvements
    Jake Correnti from UMass Lowell works with the RHEL Platforms group to improve the Podman machine subcommands that manage Podman’s virtual machine, so that users can run Linux containers on Windows and macOS as well as on Linux. (Podman is a daemonless container engine for developing, managing, and running OCI containers and container images.) Jake, who is completing his second internship, also refactored the Podman machine code for improved usability and supportability, resulting in a 40% decrease in function length.
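For readers unfamiliar with the subcommands mentioned above, a typical Podman machine session on macOS or Windows looks like this (assuming Podman is installed; these are the standard podman-machine commands):

```shell
podman machine init                   # create the Linux VM that will run containers
podman machine start                  # boot the VM
podman machine list                   # show machines and their state
podman run --rm quay.io/podman/hello  # containers now run inside the VM
podman machine stop                   # shut the VM down when finished
```

Because the VM is managed entirely through these subcommands, improvements to them directly affect every non-Linux Podman user.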

Edge computing

  • Linux kernel
    Han Dong and Eric Munson from Boston University are exploring different means of tuning the kernel for challenging edge environments. Han experiments with kernel tuning to maximize energy efficiency without compromising performance SLAs. He uses Bayesian optimization, an ML technique, to dynamically adjust SLA and energy goals while supporting a real-world in-memory key-value store workload. Eric extends previous work on Unikernel Linux by investigating alternatives for kernel-side event loop handling that minimize latency.
  • Kernel privilege elevation on ARM
    Another BU PhD student, Arlo Albelli, also works on kernel improvements, starting with porting the kElevate system call to the ARM architecture, which is especially relevant for energy-efficient services. This work aims to let a user process dynamically request and relinquish hardware privileges over time, with the expectation that this mechanism will be valuable for energy and performance monitoring.
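Han’s tuning objective above—minimize energy without violating a latency SLA—can be illustrated with a toy search. A simple exhaustive search over candidate CPU frequencies stands in for the Bayesian optimizer used in the real work, and the energy and latency models below are invented for the example:

```python
# Toy version of the energy-under-SLA tuning problem: pick the CPU
# frequency that minimizes energy per request while keeping latency
# within the SLA. Models and numbers are invented for illustration.
FREQS_GHZ = [1.0, 1.5, 2.0, 2.5, 3.0]
SLA_MS = 2.0

def latency_ms(f):
    # Invented model: latency falls as frequency rises.
    return 4.0 / f

def energy_j(f):
    # Invented model: dynamic power grows roughly with f^2, so energy
    # per request rises with frequency despite the shorter runtime.
    return 0.5 * f ** 2 * latency_ms(f)

# Keep only SLA-compliant settings, then take the most efficient one.
feasible = [f for f in FREQS_GHZ if latency_ms(f) <= SLA_MS]
best = min(feasible, key=energy_j)
print(best)
```

A Bayesian optimizer replaces the exhaustive loop with a surrogate model that proposes promising settings after few measurements, which matters when each evaluation means running a real key-value store workload.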
