Rebecca Faust Selected as a 2021 Computing Innovation Fellow
- August 4, 2021 -
Ph.D. candidate Rebecca Faust has been selected as a 2021 Computing Innovation (CI) Fellow. The award funds a two-year postdoctoral position with Dr. Chris North at Virginia Tech, beginning in January 2022. Rebecca was one of 69 recent or soon-to-be Ph.D. graduates across the country selected in 2021. The CI Fellows program provides a career-enhancing bridge experience for recent and soon-to-be Ph.D. graduates in computing, helping to maintain the computing research pipeline amid the hiring disruptions caused by COVID-19.
In her Ph.D. research, Rebecca has explored how to use automatic differentiation to create explanatory visualizations of non-linear projections. Dr. North has pioneered semantic interaction methods for manipulating deep learning models. During her postdoctoral work, Rebecca and Dr. North will combine their expertise to create interactive semantic explanations for deep learning visualizations. Their project will focus on enabling human-AI interactions that inject human expertise into deep learning systems, and on creating explanatory visualizations of how those interactions affect the systems, in the context of text analysis tasks.
For more information on the CI Fellows, see here.
Brandon Neth - International Conference on Supercomputing 2021
- July 9, 2021 -
This summer, Ph.D. student Brandon Neth kicked off the International Conference on Supercomputing (ICS 2021) with a talk on their paper “RAJALC: Inter-loop Optimizations in RAJA.” Written in collaboration with Professor Michelle Strout (Brandon's advisor) and Drs. Tom Scogland and Bronis de Supinski of Lawrence Livermore National Laboratory, the paper presents RAJALC, an extension to the performance portability library RAJA. RAJA is a powerful tool for writing portable kernels for high-performance computing applications, but it leaves performance on the table by considering each kernel in isolation. RAJALC remedies this by introducing portable, easy-to-use optimizations that apply across multiple kernels, such as loop fusion, and uses runtime symbolic evaluation to verify that each requested optimization is safe to apply. With RAJALC, developers can achieve nearly the same performance improvements as hand-written cross-kernel optimizations while writing up to 90% less code.
Staci Smith, David Lowenthal win Best Paper Award at HPDC'21
- June 25, 2021 -
Recent Ph.D. graduate Staci Smith and her advisor, Professor David Lowenthal, have won the Best Paper Award at the High-Performance Distributed Computing conference (HPDC'21). HPDC is one of the premier conferences in high-performance computing. The paper, authored by Smith and Lowenthal, is entitled "Jigsaw: A High-Utilization, Interference-Free Job Scheduler for Fat-Tree Clusters".
The paper, based on Smith's Ph.D. dissertation research, describes the design and implementation of Jigsaw, a new job-isolating scheduler for three-level fat-trees. Jigsaw proactively enforces network isolation for every job, completely avoiding the application performance degradation that inter-job network interference would otherwise cause. Unlike existing job-isolating schedulers, which suffer from lowered system utilization, Jigsaw typically achieves system utilization of 95-96%. When jobs see even modest performance improvements from running interference-free, Jigsaw typically delivers lower job turnaround times and higher throughput than traditional job scheduling.
The award caps Smith's decorated Ph.D. career. Her first paper, "Mitigating Inter-Job Interference Using Adaptive Flow-Aware Routing", was nominated for Best Student Paper at Supercomputing 2018, and she is the recipient of a 2019 ACM/IEEE-CS George Michael Memorial HPC Fellowship. Smith has taken a position as a software engineer at Google.
The paper is available here.