Hello WarpX Team,

I have some issues with the 3D electrostatic lab-frame solver. I am using the PICMI Python API, so I will reference the respective `picmi` objects. My model consists of two electrodes (a rectangular bar cathode and a planar anode) fully encapsulated in a cubic domain. Both electrodes are modelled through the `picmi.EmbeddedBoundary` class and are assigned different potential levels. I emit electrons with a constant source flux from the embedded cathode surface into the volume. The domain boundaries, as well as the embedded boundaries, are particle-absorbing. I tried two setups with different field boundary conditions for the domain walls, with mixed results.
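For concreteness, here is a rough sketch of how the bar cathode can be described as an implicit function (the dimensions and center below are placeholder values, not my actual geometry; as far as I understand the WarpX docs, the function should be positive inside the body):

```python
# Hypothetical sketch of the bar-cathode implicit function. The half-widths
# and center are placeholders, not my actual geometry. WarpX's embedded-
# boundary parser expects a function of x, y, z that is (per my reading of
# the docs) positive inside the body and negative outside.

def bar_implicit_function(x, y, z, center=(0.0, 0.0, 0.0),
                          half_widths=(0.01, 0.01, 0.05)):
    """Positive inside an axis-aligned rectangular bar, negative outside."""
    cx, cy, cz = center
    ax, ay, az = half_widths
    # Negative of the largest per-axis excess over the bar's half-width.
    return -max(abs(x - cx) - ax, abs(y - cy) - ay, abs(z - cz) - az)

# The same expression as the string that would be passed to
# picmi.EmbeddedBoundary(implicit_function=..., potential=...):
bar_expression = "-max(max(abs(x)-0.01, abs(y)-0.01), abs(z)-0.05)"
```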
1. Multigrid solver with Dirichlet domain boundaries
I started off by assigning a fixed potential to the domain boundaries (i.e. 0 V) and running it with the default MLMG multigrid solver. This setup initializes fine, meaning that I can see the correct potential distribution in the initial step. However, when injecting more electrons into the system, MLMG has trouble converging and hits the iteration limit, as seen in #6019 and #5533. I adjusted the number of cells in each dimension to get an almost perfectly equidistant dx, dy, dz, but this didn't help. I also played around with the solver's `warpx_absolute_tolerance`, which does kind of fix the problem, but I have to go up to quite high absolute tolerances (~1e-5) to ensure MLMG convergence, and I am worried that this might already have a non-negligible effect on the field solutions and therefore on the particle trajectories that I obtain.
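For reference, my solver setup looks roughly like this (a sketch, not a runnable script on its own; `grid` is the Cartesian grid defined elsewhere in the input file, and the tolerance values are the ones I experimented with):

```python
# Sketch of the PICMI solver setup. `required_precision` and
# `maximum_iterations` are picmistandard options; `warpx_absolute_tolerance`
# is the WarpX-specific keyword mentioned above.
solver = picmi.ElectrostaticSolver(
    grid=grid,
    method='Multigrid',
    required_precision=1e-7,        # MLMG relative tolerance
    maximum_iterations=200,
    warpx_absolute_tolerance=1e-5,  # the value I had to raise it to
)
```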
I also observed that by increasing the timestep or cell size, I am able to achieve lower absolute tolerances with MLMG. I guess this is due to the effect described in #5533, where small changes in the charge density distribution lead to initial solver guesses that are so close to the solution that MLMG is not able to reduce the relative residual to the necessary threshold.
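A toy illustration of what I think is happening (plain NumPy on a 1D Poisson system, not the actual WarpX/AMReX solver): when the previous-step potential is reused as the initial guess and the charge density has barely changed, the initial residual already sits near the floating-point round-off floor, so no iterative method can shrink it by a further relative factor:

```python
import numpy as np

# Toy stand-in for the field solve: 1D Poisson matrix A and right-hand side b.
n = 200
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.random.default_rng(0).normal(size=n)

# "Previous step" solution reused as the initial guess: here, the exact
# solution, mimicking a charge density that barely changed between steps.
x0 = np.linalg.solve(A, b)

# The initial relative residual is already at the round-off floor, so a
# convergence test demanding a further relative reduction from this starting
# point cannot be satisfied in finite precision, and an iterative solver
# would hit its iteration limit.
rel_res0 = np.linalg.norm(b - A @ x0) / np.linalg.norm(b)
print(f"initial relative residual: {rel_res0:.2e}")
```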
Since it is not nice to fiddle around with the absolute solver tolerance for each setup, and I also need to look at open domain boundaries, I decided to give the FFT solver a shot. For this I recompiled WarpX with `WarpX_FFT=ON` and the relevant dependencies.
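For completeness, the rebuild looked roughly like this (a sketch; the options other than `WarpX_FFT` are just my usual build settings):

```shell
# Reconfigure and rebuild the Python bindings with FFT support enabled.
cmake -S . -B build -DWarpX_FFT=ON -DWarpX_PYTHON=ON -DWarpX_DIMS=3
cmake --build build -j 8 --target pip_install
```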
2. FFT solver with open boundaries
When trying to run the same simulation setup with open domain boundaries and the FFT solver method, the simulation does run, but I get an initial zero field solution even though I assigned non-zero potentials to the `EmbeddedBoundary` electrodes, which worked with MLMG.
Those are technically two separate problems, but maybe someone knows how to correctly run the FFT solver for open domain boundaries with fixed potentials on the EBs, and/or has an idea on how to improve MLMG convergence (I was thinking about adding some artificial right-hand-side perturbation to the initial solution guess, or similar).
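To make that last idea concrete, here is a toy NumPy sketch (again not WarpX code) of what I mean by perturbing the initial guess: a small artificial perturbation lifts the starting residual off the round-off floor, so a relative convergence criterion becomes attainable again. The perturbation amplitude here is an arbitrary guess:

```python
import numpy as np

# Toy 1D Poisson system standing in for the WarpX field solve.
n = 200
A = 2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
b = np.random.default_rng(1).normal(size=n)
x_prev = np.linalg.solve(A, b)   # previous-step potential, nearly exact

def rel_res(x):
    return np.linalg.norm(b - A @ x) / np.linalg.norm(b)

# Reusing the old solution directly: residual already at the round-off floor.
print(f"unperturbed guess: {rel_res(x_prev):.2e}")

# The idea: add a small artificial perturbation to the initial guess so the
# starting residual is well above the floor. The amplitude (1e-6 of the
# solution norm) is a made-up placeholder, not a tuned value.
eps = 1e-6 * np.linalg.norm(x_prev) / np.sqrt(n)
x_guess = x_prev + eps * np.random.default_rng(2).normal(size=n)
print(f"perturbed guess:   {rel_res(x_guess):.2e}")
```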
As always, many thanks in advance.
Cheers,
Alex