2024 (English). In: Annals of Nuclear Energy, ISSN 0306-4549, E-ISSN 1873-2100, Vol. 208, article id 110746. Article in journal (Refereed). Published.
Abstract [en]
To compute few-group nodal data, lattice codes must first generate multi-group cross-sections for each constituent material in the lattice model. This generation relies on continuous-energy cross-section libraries and is computationally expensive. Moreover, any change in the nuclide compositions or other state parameters requires repeating the process. To reduce the computational demand, we propose the application of a pre-trained representational model. This model, which combines Deep Neural Network (DNN) and Principal Component Analysis (PCA) modules, is particularly beneficial in scenarios where the lattice code must process multi-group data repeatedly. In our previous research, we established that such a model can accurately generate multi-group data for fuel-pellet materials. In the present study, we broaden the scope of the model to a more extensive range of materials typically found in pressurized water reactors, including zirconium-alloy cladding and borated-water moderators. We also show that the model can be trained over a wide spectrum of fuel enrichments. When integrated into lattice calculations, the errors introduced by the deep-learning-based representational model result in less than 1% deviation in k-eff and in the pin-power distribution. We further refined the model to also estimate the neutron fluxes in the fuel pellet and the borated water. This refined model was then used to perform a flux-weighted collapse and to generate few-group cross-section libraries for lattice calculations. The few-group libraries generated in this manner were highly accurate, yielding a low average k-eff error and minimal errors in the pin-power distribution.
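The representational idea described above can be sketched in a few lines: PCA compresses the high-dimensional multi-group cross-section vectors, and a regression model maps state parameters (enrichment, boron concentration, temperature) to the PCA coefficients; a flux-weighted sum then collapses the reconstructed spectrum to few groups. This is a minimal illustration with synthetic data, not the paper's implementation: the group structure, the state parameters, and the linear least-squares fit (standing in for the DNN) are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data (hypothetical): 200 samples of a 70-group
# cross-section vector, each depending smoothly on three state
# parameters: enrichment [%], boron [ppm], fuel temperature [K].
n_samples, n_groups = 200, 70
states = rng.uniform([1.5, 0.0, 500.0], [5.0, 2000.0, 1200.0],
                     size=(n_samples, 3))
basis = rng.normal(size=(3, n_groups))
xs = np.tanh(states / states.max(axis=0)) @ basis \
     + 0.01 * rng.normal(size=(n_samples, n_groups))

# --- PCA via SVD: keep k principal components ---
k = 3
mean = xs.mean(axis=0)
U, S, Vt = np.linalg.svd(xs - mean, full_matrices=False)
components = Vt[:k]                  # (k, n_groups)
coeffs = (xs - mean) @ components.T  # (n_samples, k) training targets

# --- Map state parameters -> PCA coefficients ---
# (A linear least-squares fit stands in for the paper's DNN module.)
A = np.hstack([states, np.ones((n_samples, 1))])
W, *_ = np.linalg.lstsq(A, coeffs, rcond=None)

# --- Reconstruct the multi-group cross-sections for a new state ---
new_state = np.array([[3.2, 800.0, 900.0]])
pred_coeffs = np.hstack([new_state, [[1.0]]]) @ W
xs_pred = (pred_coeffs @ components + mean)[0]   # (n_groups,)

# --- Flux-weighted collapse to two groups (assumed boundary at g=45) ---
flux = np.abs(rng.normal(size=n_groups)) + 0.1   # synthetic spectrum
cut = 45
few_group = np.array([
    (xs_pred[:cut] * flux[:cut]).sum() / flux[:cut].sum(),
    (xs_pred[cut:] * flux[cut:]).sum() / flux[cut:].sum(),
])
print(xs_pred.shape, few_group.shape)
```

The split into a compression stage (PCA) and a regression stage keeps the learned mapping low-dimensional, which is what makes repeated evaluation cheap compared with regenerating the multi-group data from continuous-energy libraries.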
Place, publisher, year, edition, pages
Elsevier BV, 2024
Keywords
Cross section representation, Principal component analysis, Neural network, Deep learning, Lattice codes
National Category
Physical Sciences
Identifiers
urn:nbn:se:kth:diva-351417 (URN)
10.1016/j.anucene.2024.110746 (DOI)
001274336000001 ()
2-s2.0-85198951292 (Scopus ID)
Note
QC 20240812
2024-08-12, 2024-08-12, 2025-09-25. Bibliographically approved.