
Large eddy simulations of turbulent flows on graphics processing units: Application to film-cooling flows

Posted on: 2012-02-14
Degree: Ph.D.
Type: Dissertation
University: University of Illinois at Urbana-Champaign
Candidate: Shinn, Aaron F.
Full Text: PDF
GTID: 1462390011967300
Subject: Engineering
Abstract/Summary:
Computational Fluid Dynamics (CFD) simulations can be very computationally expensive, especially Large Eddy Simulations (LES) and Direct Numerical Simulations (DNS) of turbulent flows. In LES the large, energy-containing eddies are resolved by the computational mesh, while the smaller (sub-grid) scales are modeled. In DNS, all scales of turbulence are resolved, including the smallest dissipative (Kolmogorov) scales. Clusters of CPUs have been the standard platform for such simulations, but an emerging alternative is the use of Graphics Processing Units (GPUs), which deliver impressive computing performance compared to CPUs. Recently there has been great interest in the scientific computing community in using GPUs for general-purpose computation (such as the numerical solution of PDEs) rather than graphics rendering.

To explore the use of GPUs for CFD, an incompressible Navier-Stokes solver was developed for the GPU. The solver can simulate unsteady laminar flows or perform LES or DNS of turbulent flows. The Navier-Stokes equations are solved via a fractional-step method and spatially discretized using the finite volume method on a Cartesian mesh. An immersed boundary method based on a ghost-cell treatment was developed to handle flow past complex geometries. The implementation of these numerical methods had to suit the architecture of the GPU, which is designed for massive multithreading; the details of this implementation are described, along with strategies for performance optimization. The GPU-based solver was validated on fundamental benchmark problems, and a performance assessment indicated that the solver was over an order of magnitude faster than a CPU implementation.

The GPU-based Navier-Stokes solver was used to study film-cooling flows via Large Eddy Simulation. In modern gas turbine engines, film cooling is used to protect turbine blades from hot combustion gases.
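The abstract names a fractional-step method on a Cartesian finite-volume mesh but gives no code. At the core of any fractional-step scheme is a projection step: solve a Poisson problem for a scalar potential and subtract its gradient so the velocity field becomes divergence-free. The sketch below illustrates that idea on a periodic 2D domain using a spectral Poisson solve for brevity; it is my own simplification in NumPy, not the dissertation's finite-volume GPU solver.

```python
import numpy as np

def project_divergence_free(u, v):
    """Projection step of a fractional-step method (spectral sketch).

    Removes the divergent part of a 2D velocity field (u, v) on a
    periodic [0, 2*pi)^2 domain: in Fourier space, subtract the
    component of the velocity parallel to the wavevector k.
    """
    n = u.shape[0]
    k = np.fft.fftfreq(n, d=1.0 / n)            # integer wavenumbers
    kx, ky = np.meshgrid(k, k, indexing="ij")
    k2 = kx**2 + ky**2
    k2[0, 0] = 1.0                              # avoid 0/0; mean mode has no divergence
    uh, vh = np.fft.fft2(u), np.fft.fft2(v)
    s = (kx * uh + ky * vh) / k2                # potential of the divergent part
    s[0, 0] = 0.0
    uh -= kx * s                                # subtract grad of the potential
    vh -= ky * s
    return np.fft.ifft2(uh).real, np.fft.ifft2(vh).real

def spectral_divergence(u, v):
    """Divergence du/dx + dv/dy, evaluated spectrally for consistency."""
    n = u.shape[0]
    k = np.fft.fftfreq(n, d=1.0 / n)
    kx, ky = np.meshgrid(k, k, indexing="ij")
    return np.fft.ifft2(1j * kx * np.fft.fft2(u)
                        + 1j * ky * np.fft.fft2(v)).real

# Smooth, band-limited test field (deliberately not divergence-free).
n = 64
x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
X, Y = np.meshgrid(x, x, indexing="ij")
u = np.sin(X) * np.cos(Y)
v = np.cos(2.0 * X) + np.sin(Y)
up, vp = project_divergence_free(u, v)          # divergence-free to roundoff
```

In a production solver like the one described, the Poisson solve would instead be a discrete finite-volume system (and is typically the dominant cost, which is one reason GPU acceleration pays off), but the projection logic is the same.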
Therefore, understanding the physics of this problem, as well as techniques to improve cooling, is important. Fundamentally, a film-cooling configuration is an inclined cooling jet in a hot cross-flow. A known problem in film cooling is jet lift-off, in which the coolant jet moves away from the surface to be cooled due to mutual induction of the counter-rotating vortex pair embedded in the jet, resulting in decreased cooling at the surface. To counteract this, a micro-ramp vortex generator was added downstream of the film-cooling jet, generating near-wall counter-rotating vortices of opposite sense to the vortex pair in the jet. The micro-ramp vortices were found to create a downwash effect toward the wall, which helped entrain coolant from the jet and transport it to the wall, resulting in better cooling. Results are reported for two film-cooling configurations that differ primarily in how the jet exit boundary conditions are prescribed: in the first, the jet is prescribed from a precursor simulation; in the second, the jet is modeled with a plenum/pipe configuration. The latter configuration was designed based on previous wind tunnel experiments at NASA Glenn Research Center, and the present results were meant to supplement those experiments.
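Cooling performance in studies of this kind is commonly quantified by the adiabatic film-cooling effectiveness. The abstract does not state which metric the dissertation reports, so the formula and names below are the standard convention, not details taken from this work:

```python
def adiabatic_effectiveness(t_hot, t_coolant, t_wall):
    """Adiabatic film-cooling effectiveness, eta = (T_hot - T_aw) / (T_hot - T_c).

    eta = 1: the adiabatic wall sits at the coolant temperature (ideal coverage);
    eta = 0: the wall sees the full hot-gas temperature, as happens downstream
    of jet lift-off. All temperatures in consistent units (e.g. K).
    """
    return (t_hot - t_wall) / (t_hot - t_coolant)
```

Under this metric, the downwash from the micro-ramp vortices, which transports coolant back toward the wall, would show up directly as higher effectiveness downstream of the jet.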
Keywords/Search Tags: Large eddy, Simulations, Film-cooling, Flows, LES, Jet, Graphics, Turbulent