
Integration of High-fidelity Monte Carlo and Deterministic Transport Codes into Workbench

In recent years, many advanced tools have been developed within the DOE NEAMS program. They provide powerful nuclear reactor simulation capabilities, but they often require large computational resources, can be difficult to install, and demand expert knowledge to operate. To flatten the learning curve of using these tools and meet users' expectations for practical design and analysis work, we will integrate two important reactor physics analysis tools, MCNP6 and PROTEUS, into a unified model and workflow interface called the NEAMS Workbench. The Workbench will serve as the only operating environment users need to interact with: users provide problem definitions to Workbench, which automatically drives the different tools to run simulations, report outputs, and visualize results. End users can thus focus on defining the problems to be solved rather than on learning how to run each tool.

This is a DOE-funded project led by Dr. Ji in collaboration with researchers at ORNL, ANL, and LANL.


Development of 2D and 3D Transient Electro-thermal Computational Models to Predict the Radiation Failures in SiC-based Schottky Diodes and Power Field-effect Transistors

High-voltage (HV) power devices based on silicon carbide (SiC) semiconductor material may offer revolutionary transformations for future NASA space missions, due to the roughly three-fold increase in bandgap of SiC-based devices over traditional silicon (Si)-based devices. The wide bandgap enables SiC devices to operate at higher voltages, temperatures, and switching frequencies with greater efficiency than existing Si devices. However, the unique space environment presents a great challenge to device performance and reliability. To safely deploy SiC power devices in space missions, one must first answer the question of how SiC devices survive the harsh radiation environment in space; fundamental research into the radiation susceptibility and failure mechanisms of SiC is necessary. The overall goal of the project is to advance the understanding of radiation failure mechanisms in SiC materials for power devices, and to provide guidelines for designing and fabricating SiC-based devices with higher resistance to radiation single-event effects (SEEs).
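The thermal half of such a transient electro-thermal model can be given a flavor with a one-dimensional sketch: a localized energy deposition, such as the heat pulse left by an ion strike, diffusing through a material slab. This is a minimal explicit finite-difference illustration with placeholder numbers, not the project's 2D/3D device models or SiC material parameters.

```python
import numpy as np

# Minimal 1D transient heat-conduction sketch (explicit FTCS scheme): a
# localized energy deposition (e.g., from an ion strike) diffuses through a
# material slab. All numbers are placeholders, not SiC device parameters.

def diffuse(T, alpha, dx, dt, steps):
    """Advance temperature profile T with insulated (zero-flux) boundaries."""
    r = alpha * dt / dx**2
    assert r <= 0.5, "FTCS stability requires alpha*dt/dx^2 <= 1/2"
    T = T.copy()
    for _ in range(steps):
        Tp = np.pad(T, 1, mode="edge")            # ghost cells: zero-flux walls
        T = T + r * (Tp[2:] - 2.0 * T + Tp[:-2])  # discrete Laplacian update
    return T

n = 101
T0 = np.full(n, 300.0)     # ambient temperature, K
T0[n // 2] += 500.0        # localized heat pulse at the strike location
T1 = diffuse(T0, alpha=1.0e-5, dx=1.0e-4, dt=2.0e-4, steps=200)
```

With insulated boundaries the scheme conserves total thermal energy while the peak temperature relaxes, which is the basic behavior a transient electro-thermal failure model must resolve near the strike site.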

This is a NASA-funded project led by Dr. Ji and Dr. Chow (EECS) in collaboration with researchers at GE Global Research and NASA GRC.


Improving Prompt Temperature Feedback By Stimulating Doppler Broadening in Heterogeneous Composite Nuclear Fuel Forms

Nuclear fuels with similar aggregate material composition, but with different millimeter- and micrometer-scale spatial configurations of the component materials, can have very different safety and performance characteristics. This research focuses on modeling, and attempting to engineer, heterogeneous combinations of nuclear fuels to improve negative prompt temperature feedback in response to reactivity insertion accidents.

Dr. Ji's research team has proposed improving negative prompt temperature feedback by developing a tailored thermal resistance in the nuclear fuel. In the event of a large reactivity insertion, the thermal resistance enables a faster negative Doppler feedback by temporarily trapping heat in material zones with strong absorption resonances. A multi-physics simulation framework capable of modeling large reactivity insertions was created and used to compare a heterogeneous fuel with the tailored thermal resistance against a homogeneous fuel without it. The results confirmed the fundamental premise of the enhanced prompt temperature feedback and provided insights into the neutron spectrum dynamics throughout the transient.
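The essence of such a self-terminating transient can be sketched with point kinetics plus a Doppler feedback term. This is a generic illustration under assumed parameters (one delayed-neutron group, a lumped adiabatic fuel, and an assumed Doppler coefficient alpha_D), not the team's multi-physics framework:

```python
import numpy as np

# Point-kinetics sketch of a reactivity-insertion transient with Doppler
# feedback: one delayed-neutron group, adiabatic fuel heat-up, and a negative
# fuel temperature coefficient. All parameter values are generic placeholders.

beta, lam, Lam = 0.0065, 0.08, 1.0e-5  # delayed fraction, precursor decay (1/s), generation time (s)
alpha_D = -2.0e-5                      # Doppler coefficient, dk/k per K (assumed)
rho_ins = 0.009                        # inserted reactivity, > beta (prompt supercritical)

def run(dt=1.0e-6, steps=60000):
    P = 1.0                            # relative power
    C = beta * P / (lam * Lam)         # precursors at initial equilibrium
    T = 300.0                          # fuel temperature, K (adiabatic heat-up)
    hist = np.empty(steps)
    for i in range(steps):
        rho = rho_ins + alpha_D * (T - 300.0)   # net reactivity with feedback
        dP = ((rho - beta) / Lam) * P + lam * C
        dC = (beta / Lam) * P - lam * C
        P += dt * dP
        C += dt * dC
        T += dt * P                    # lumped heat capacity of 1 (arbitrary units)
        hist[i] = P
    return hist, T

power, T_final = run()
```

With these placeholder values the inserted reactivity exceeds beta, so the power rises on the prompt timescale until fuel heat-up drives the net reactivity back below prompt critical and the excursion turns over on its own — the behavior the tailored thermal resistance is designed to accelerate.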


Development of High-fidelity Methods to Simulate Coupled Granular Flow and Fluid Flow based on Multi-scale, Multi-physics Models

High-temperature gas- or liquid-salt-cooled pebble-bed nuclear reactors (PBRs) are considered among the safest nuclear reactor designs to date. Their unique design features, however, present great computational challenges for safety analysis. In PBRs, tennis-ball-sized spherical fuel pebbles are loaded into and circulate through the reactor core while helium or fluoride-salt coolant flows around each pebble. Pebble-pebble, pebble-coolant, and pebble-reflector-wall interactions result in a complicated coupled pebble flow and coolant flow process. This process is further complicated by the reactor power and temperature distributions, which have a strong effect on pebble friction coefficients and coolant viscosity. To predict local power and temperature distributions accurately, especially under severe accident scenarios, high-fidelity simulation of the fully coupled pebble flow and coolant flow in PBRs is needed. The new methodology developed for this high-fidelity simulation can significantly improve current reactor safety prediction capabilities and provide sound design margins for PBRs.

Dr. Ji's research team has developed a high-fidelity code, PEBble Fluid Dynamics (PEBFD), which tightly couples the discrete element method (DEM) and computational fluid dynamics (CFD) to model the pebble flow and coolant flow simultaneously. A realistic simulation of pebble dynamics, including the initial fuel loading process, the dynamic fuel flowing (upward or downward) process, and the fuel discharge and reloading process, has been implemented for cylindrical and annular core designs of high-temperature pebble-bed reactors. Much effort has been devoted to verifying and validating PEBFD, including comparisons with experimental data from the open literature and reports. The code provides critical safety design parameters used in full-core neutronic/thermal-hydraulic analysis, such as the pebble flow rate distribution, void fraction (porosity) distribution, coolant flow speed distribution, and temperature distribution.
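The basic structure of a DEM step with a fluid force can be sketched in a few lines. This is a deliberately minimal illustration in the spirit of (but far simpler than) PEBFD: a vertical column of pebbles settles under gravity with linear spring-dashpot contacts while a prescribed upward coolant velocity exerts a linear drag. All parameter values are illustrative placeholders.

```python
import numpy as np

# Minimal DEM sketch with one-way fluid coupling: pebbles in a vertical
# column interact through linear spring-dashpot contacts, feel gravity, and
# are dragged toward a prescribed coolant velocity. Placeholder parameters.

g, m, R = 9.81, 1.0, 0.03    # gravity (m/s^2), pebble mass (kg), radius (m)
k_n, c_n = 5.0e4, 200.0      # contact stiffness (N/m) and damping (N s/m)
c_drag = 2.0                 # linear drag coefficient (assumed)
u_fluid = 0.05               # prescribed upward coolant velocity (m/s)

def step(z, v, dt):
    f = np.full_like(z, -m * g)          # gravity
    f += c_drag * (u_fluid - v)          # fluid-pebble drag
    for i in range(len(z) - 1):          # pebble-pebble contacts in the column
        overlap = 2 * R - (z[i + 1] - z[i])
        if overlap > 0:
            fc = k_n * overlap + c_n * (v[i] - v[i + 1])  # spring-dashpot
            fc = max(fc, 0.0)            # no tensile contact force
            f[i] -= fc                   # pushes the lower pebble down
            f[i + 1] += fc               # and the upper pebble up
    low = z < R                          # floor contact at z = R
    f[low] += k_n * (R - z[low]) - c_n * v[low]
    v = v + dt * f / m                   # semi-implicit Euler update
    z = z + dt * v
    return z, v

z = 0.1 + np.arange(5) * 0.08            # five pebbles dropped in a loose column
v = np.zeros(5)
for _ in range(20000):                   # 2 s of simulated settling
    z, v = step(z, v, dt=1.0e-4)
```

A full DEM-CFD code replaces the prescribed coolant velocity with a CFD solution that itself feels the pebbles (two-way coupling), and handles three-dimensional contacts with friction; the update loop above is only the skeleton of one side of that coupling.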


Development of an On-the-Fly Sampling Method to Decrease Memory Usage of the Temperature-Dependent Nuclear Data for Nuclear Reactor Analysis

In the analysis of nuclear reactor performance, coupled neutronic and thermal-hydraulic computations with temperature feedback are performed. Because of the strong temperature effect on the fission power evaluation, a large amount of nuclear data spanning a broad range of temperatures must be pre-stored before the coupled computations. The memory usage of these data can be on the order of gigabytes for a single isotope. This is unaffordable for reactor analysis computations and has become a bottleneck on future supercomputing architectures with limited memory on each computing node.

Dr. Ji's research team has developed a new method for sampling thermal neutron differential scattering data at an arbitrary temperature, denoted an on-the-fly sampling method, which departs from the conventional treatment of the huge amount of nuclear data needed for reactor analysis. In the conventional method, the nuclear data at the desired temperature are interpolated from data pre-stored at many different temperatures, which requires substantial memory. In the new method, no pre-storage is needed: the nuclear data are generated on the fly at any desired temperature during the neutronic computation. The new method decreases memory usage by 90% while providing accurate nuclear data at arbitrary temperatures. This work removes a major bottleneck in current nuclear reactor analysis codes and opens the possibility of running them on future low-memory supercomputing architectures.
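The on-the-fly idea can be illustrated with a classic algorithm of the same flavor: the free-gas target-velocity rejection sampler. Everything is computed from the local temperature at sampling time, so no data pre-broadened at many temperatures need to be stored. (This standard textbook algorithm is shown only as an illustration of the approach; it is not the team's thermal-scattering method.)

```python
import math, random

# Standard free-gas target-velocity sampler: draws the speed V and direction
# cosine mu of the target nuclide seen by a neutron, from a density
# proportional to v_rel * Maxwellian(V), using only the local temperature.

M_N = 1.674927e-27      # neutron mass, kg
K_B = 1.380649e-23      # Boltzmann constant, J/K

def sample_target(v_n, T, A, rng):
    """Sample target speed V (m/s) and cosine mu between target and neutron
    directions, for a neutron of speed v_n in a free gas of mass ratio A at
    temperature T (K)."""
    beta = math.sqrt(A * M_N / (2.0 * K_B * T))   # inverse thermal speed scale
    y = beta * v_n
    while True:
        # sample from the bounding density (v_n + V) * Maxwellian(V)
        if rng.random() * (y * math.sqrt(math.pi) + 2.0) < y * math.sqrt(math.pi):
            # component proportional to x^2 exp(-x^2), with x = beta * V
            x2 = -math.log(rng.random()) \
                 - math.log(rng.random()) * math.cos(0.5 * math.pi * rng.random()) ** 2
        else:
            # component proportional to x^3 exp(-x^2)
            x2 = -math.log(rng.random() * rng.random())
        V = math.sqrt(x2) / beta
        mu = 2.0 * rng.random() - 1.0
        v_rel = math.sqrt(v_n * v_n + V * V - 2.0 * v_n * V * mu)
        if rng.random() * (v_n + V) < v_rel:   # undo the bounding approximation
            return V, mu

rng = random.Random(42)
samples = [sample_target(2200.0, 600.0, 16.0, rng) for _ in range(5000)]
```

The memory trade-off is visible here: the conventional approach stores broadened cross-section tables at many temperatures and interpolates, while a sampler like this needs only the temperature value itself at the moment of the collision.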


Development of Analytical Models for Fast Simulation of Radiation Transport in Stochastic Media Systems

Several advanced nuclear energy system designs exhibit characteristics of stochastic media, e.g., very-high-temperature gas-cooled reactors, pebble-bed reactors, and advanced light water reactors with fully ceramic micro-encapsulated fuel. In these systems, billions of microsphere fuel particles serve as the basic fuel element and are randomly packed in the fuel region. Radiation transport simulation of such systems takes a very long time and requires very high memory usage if every randomly distributed fuel particle is modeled explicitly. Dr. Ji's research team has developed a novel analytical model that characterizes the random distribution of these fuel particles and can be used directly in radiation transport simulations. The model avoids the explicit modeling technique of conventional simulations and instead implicitly models the position of a fuel particle on the fly along the track of the radiation. Only one fuel particle's position needs to be stored at any time, with the position determined by a random sampling technique. The new model substantially decreases both the memory needed to model a stochastic-media system and the radiation transport simulation time: simulations using it have shown speedups by a factor of 100-300 when analyzing advanced reactor systems that utilize microsphere fuel particles.
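The general flavor of such implicit tracking can be conveyed with chord-length sampling (CLS), a well-known approximate technique for stochastic media. (This sketch illustrates the one-particle-at-a-time idea only; it is not the group's specific analytical model.) Instead of storing billions of sphere centers, the distance from a point in the matrix to the next fuel sphere is sampled from an exponential law whose mean depends only on the packing fraction f and sphere radius R, so only one sphere "exists" at a time during tracking:

```python
import math, random

# Chord-length-sampling sketch for a stochastic medium of spheres: alternate
# exponentially distributed matrix gaps with sampled sphere chords, placing
# each sphere implicitly on the fly instead of storing all centers.

def next_sphere_distance(f, R, rng):
    """Sample the matrix path length to the next sphere entry."""
    lam = 4.0 * R * (1.0 - f) / (3.0 * f)   # mean matrix chord between spheres
    return -lam * math.log(rng.random())

def sphere_chord(R, rng):
    """Sample the chord of a ray crossing a sphere of radius R, with the
    squared impact parameter uniform on [0, R^2] (uniform over the disk)."""
    b2 = rng.random() * R * R
    return 2.0 * math.sqrt(R * R - b2)

rng = random.Random(1)
f, R = 0.3, 0.05                # packing fraction, sphere radius (cm)
# track one long ray, tallying the fraction of path length spent inside fuel
total, in_fuel = 0.0, 0.0
while total < 1.0e4:
    gap = next_sphere_distance(f, R, rng)
    chord = sphere_chord(R, rng)
    total += gap + chord
    in_fuel += chord
frac = in_fuel / total
```

A quick consistency check: the mean sphere chord is 4R/3, so the expected fuel fraction of the path recovers the packing fraction f, which is what the tally above converges to.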


Modeling Electron Transport in Atmosphere under Electromagnetic Pulses (EMP)

This is collaborative research with Los Alamos National Laboratory (LANL). It focuses on improving an electron swarm model for low-temperature atmospheric plasmas in an electric field environment. Swarm electrons are low-energy electrons produced by photoelectron ionization and Townsend impact ionization in molecular gases, specifically atmospheric gases. The transport of swarm electrons in the atmosphere can affect air chemistry and thereby cause various atmospheric phenomena. Understanding the transport behavior of these low-energy electrons subject to electromagnetic pulses is important in atmospheric physics; in particular, it is key to unraveling the still-mysterious lightning initiation process.

To model the time evolution of the electron temperature under an electromagnetic pulse, a diffusion code based on the swarm model has been developed at LANL. The code uses an adaptive time step and solves a system of coupled differential equations for the electric field, electron temperature, electron number density, and drift velocity. Comparisons with microwave and DC breakdown measurements have revealed that, for high values of the ratio of electric field intensity to pressure (E/p), the swarm model underestimates the equilibrium temperature achieved in experiments. This underestimation has been traced to the energy and momentum transfer collision frequencies reported by Higgins, Longmire, and O'Dell in 1973, as well as to the invalid assumption of a Maxwell-Boltzmann distribution of electron energy. To improve the model, updated electron-air cross sections reported in the LXCat database, part of the Plasma Data Exchange Project, are employed. New momentum and energy transfer collision frequencies, defined over a broader energy range, are evaluated using the two-term Boltzmann equation solver BOLSIG+. Using these updated collision frequencies in the swarm code proves important: a clear improvement is observed when the results are compared with experimental data.
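The structure of such a solver — a small set of coupled ODEs advanced with an adaptive time step — can be sketched as follows. The two rate expressions here (a square-root collision frequency and an Arrhenius-like ionization rate) are invented placeholders for illustration only, not the fitted air collision frequencies or the LANL code's equations:

```python
import math

# Hypothetical two-equation electron "swarm" model (electron temperature Te
# and number density ne), integrated with adaptive step doubling. All rate
# forms and constants below are invented placeholders.

T_GAS = 0.025            # ambient gas temperature, eV
E_FIELD = 1.0e5          # applied field (arbitrary units)

def rhs(y):
    Te, ne = y
    heat = 1.0e-9 * E_FIELD**2            # Joule heating (placeholder)
    nu = 2.0e3 * math.sqrt(Te)            # energy-transfer collision freq (placeholder)
    k_ion = 1.0e4 * math.exp(-1.0 / Te)   # ionization rate (placeholder)
    return (heat - nu * (Te - T_GAS),     # dTe/dt: heating vs collisional cooling
            (k_ion - 5.0e2) * ne)         # dne/dt: ionization vs fixed attachment

def integrate(y0, t_end, tol=1.0e-6):
    t, y, dt = 0.0, list(y0), 1.0e-8
    while t_end - t > 1e-12:
        dt = min(dt, t_end - t)
        f0 = rhs(y)
        y1 = [yi + dt * fi for yi, fi in zip(y, f0)]          # one full Euler step
        yh = [yi + 0.5 * dt * fi for yi, fi in zip(y, f0)]    # two half steps...
        fh = rhs(yh)
        y2 = [yi + 0.5 * dt * fi for yi, fi in zip(yh, fh)]
        err = max(abs(a - b) for a, b in zip(y1, y2))         # local error estimate
        if err < tol * max(1.0, max(abs(v) for v in y2)):
            t, y = t + dt, y2            # accept the finer solution, grow the step
            dt *= 1.5
        else:
            dt *= 0.5                    # reject and retry with a smaller step
    return y

Te, ne = integrate((T_GAS, 1.0), t_end=0.01)
```

The adaptive control lets the integrator take tiny steps through the fast initial heating and long steps once the electron temperature approaches its field-dependent equilibrium, which is the same reason the LANL code uses an adaptive time step.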


Monte Carlo Simulation Acceleration based on Hybrid CPU-GPU Architecture for Nuclear Reactor Analysis

Monte Carlo simulation is ideally suited to solving radiation transport problems in complicated geometries. However, routine analysis for reactor core design requires computation times within minutes on a desktop system, which limits the use of Monte Carlo simulation as a routine analysis method. Recently, interest in adopting GPUs for Monte Carlo acceleration has risen rapidly, driven by the excellent parallelism offered by the latest GPU technologies and by the challenge of performing reactor full-core analysis on a routine basis.

In this research effort, in collaboration with Dr. George Xu's research team at RPI, Monte Carlo codes for fixed-source neutron transport and k-effective problems were developed for CPU and GPU environments to evaluate issues associated with computational speedup. The results suggest that a speedup factor of >30 in Monte Carlo radiation transport of neutrons is within reach using state-of-the-art GPU technologies. For the task of voxelizing an unstructured-mesh phantom geometry, which is more parallel in nature, a speedup of >45 is easily obtained. Successful implementation of Monte Carlo schemes on GPUs will require considerable effort, especially for a production-scale Monte Carlo radiation transport code. Given the prediction that future generations of GPU products will likely bring exponentially improved computing power and performance, innovative hardware and software solutions may make it possible to meet the "Kord Smith Challenge" in the next several years.
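Why Monte Carlo transport maps so well onto GPUs is visible even in the smallest analog kernel: every particle history is independent, so histories can be farmed out to thousands of threads. The sketch below (fixed-source attenuation through a purely absorbing slab, written serially in Python for clarity) shows the kind of per-particle loop that gets ported to a GPU kernel; it is an illustration only, not the codes developed in the project.

```python
import math, random

# Analog Monte Carlo estimate of uncollided transmission through a purely
# absorbing slab. Each history is one exponential flight-distance sample,
# fully independent of all the others.

def transmission(sigma_t, thickness, n_particles, rng):
    """Estimate the probability that a neutron crosses the slab uncollided;
    analytically this is exp(-sigma_t * thickness)."""
    transmitted = sum(
        1 for _ in range(n_particles)
        # distance to first collision, sampled from an exponential
        if -math.log(rng.random()) / sigma_t >= thickness
    )
    return transmitted / n_particles

rng = random.Random(7)
est = transmission(sigma_t=1.0, thickness=2.0, n_particles=200_000, rng=rng)
```

On a GPU, the per-particle body becomes a thread's work item and the tally becomes an atomic or reduced sum; the algorithmic challenges the project studies (thread divergence in branching physics, memory access patterns for cross-section lookups) only appear once the physics inside each history gets realistic.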


Last updated on September 2017