
Substantially slow speed for small eigenmode problems #337

Open
ghazi221b opened this issue Mar 4, 2025 · 5 comments
Labels
usage Related to usage

Comments

@ghazi221b

Description:

I am a new Palace user trying to get better at using the package. I tried to replicate the "Eigenmodes of a Cylindrical Cavity" example from the documentation by making my own mesh in gmsh with 9264 elements and simulating it with Palace. It took 9728 seconds to solve the problem with 1 MPI process and 109 seconds with 128 MPI processes.

These solution times seem quite large, considering I wasn't using AMR or asking for any special post-processing. I just wanted to confirm whether this is the speed other people are getting or whether I am doing something wrong.

To reproduce:

I am attaching the mesh files, the JSON file, and the output logs from both the 1 MPI and 128 MPI runs to help diagnose the issue:
Cylinder_9264_file.json
Cylinder_9264_mesh.txt
Output_1_MPI.txt
Output_128_MPI.txt

Environment:

I installed Palace using the install-from-source steps given in the documentation. I believe my school's HPC cluster uses the following CPU:

[screenshot: CPU details]

First time posting an issue, so please let me know if any other information needs to be provided.

@ghazi221b ghazi221b added the bug Something isn't working label Mar 4, 2025
@hughcars hughcars added the usage Related to usage label Mar 4, 2025
@hughcars
Collaborator

hughcars commented Mar 4, 2025

Hi @ghazi221b,

Looking at that log file, I am not seeing evidence of any bug. There are a few things to consider:

  1. You've increased the element count to 7528, whereas the current example has 80 tetrahedra for the cylinder. Your mesh is significantly larger, and you would reasonably expect it to take longer.
  2. The anisotropy of your elements (measured by the value of kappa in the log file) is ~14, compared to ~2 for the original mesh. This translates into more iterations per linear system in the eigensolve: yours takes ~20 iterations, compared to ~4 for the example.
  3. You are solving on that mesh with a P4 (fourth-order) solution, meaning you have 321856 unknowns, which is quite significant for a single MPI process.
  4. An eigensolve repeatedly solves the linear system, in your case ~64 times. So in those 109 s on 128 ranks, you are solving 64 systems, coming to a little under 2 s per linear solve.

In order to achieve good results from Palace it's important to consider your mesh and whether or not you have appropriate resolution. You are significantly over-refining the mesh, to no additional benefit. This can be seen in the error estimation: the example mesh has ~1.7e-3 reported error with 16544 DOF, while yours has ~2.35e-3 reported with 321856 DOF. So your system is ~20 times larger, ~5 times worse conditioned, and gives less accurate results. I suggest that you start by improving your mesh.
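For reference, a minimal Gmsh .geo sketch of a cylinder with a global mesh size cap and optimization enabled. The dimensions and size values here are hypothetical placeholders, not the example's actual geometry; the point is only that capping the element size and enabling the optimizer tends to improve the kappa reported in the Palace log:

```
// cylinder.geo -- hypothetical sketch; adjust radius/height/sizes to your case
SetFactory("OpenCASCADE");
Cylinder(1) = {0, 0, 0, 0, 0, 0.03, 0.01};  // base point, axis vector, radius

Mesh.MeshSizeMax = 0.004;   // cap element size instead of over-refining everywhere
Mesh.Optimize = 1;          // tetrahedral optimization passes (helps kappa)
Mesh.OptimizeNetgen = 1;    // additional Netgen-based optimization
```

Remeshing with `gmsh -3 cylinder.geo` and re-reading the mesh-quality block in the Palace log is a quick way to check whether the anisotropy improved.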

@hughcars hughcars removed the bug Something isn't working label Mar 4, 2025
@ghazi221b
Author

Thank you. Lowering the solution order to P2 gives me a reasonable number of unknowns and a good solution time.

I did have a few follow-up questions:

  1. Why does my Cylinder_9264_mesh.txt file have 9264 mesh elements while Palace shows 7528? Shouldn't they match?
  2. Are there strategies when meshing with gmsh to ensure a good kappa? For this situation I used the following .geo file to make the geometry:
    Cylinder_gmesh.txt
  3. Is there any HFSS field-calculator-type functionality in Palace? In post-processing, for evaluating the surface participation, you do evaluate some expressions. I was wondering whether it is possible for the user to implement their own expressions, and where they would start to do that.

Thanks again for all the help

@hughcars
Collaborator

hughcars commented Mar 6, 2025

Hi @ghazi221b,

Glad you made some progress. It seems like your issues are primarily due to using gmsh, so I suggest you refer to their documentation.

  1. You are misunderstanding the gmsh file format. If you read their documentation you'll see that the first 1736 of those elements are triangles associated with attributes 4, 5, 6, i.e. they are boundary elements; 9264 - 1736 = 7528 volume elements.
  2. I would suggest consulting the gmsh documentation online; there are many videos on YouTube, for instance, that explain various practical techniques for meshing with gmsh. The specification of mesh size fields within gmsh is a rich topic.
  3. There is no ability to evaluate arbitrary functions of the solution in Palace, because handling the full range of general expressions would seriously compromise the performance of the postprocessing. Instead of supporting general expressions, we focus on doing the most important evaluations with high accuracy and efficiency. There is the probe functionality described in the docs, but this only gives field evaluations at points, with which you can then perform calculations. This can be used for things like line integrals of the solution, akin to those in the HFSS field calculator documentation; in particular a user is free to specify the probe points as the Gauss points of a line quadrature. The other examples in their documentation, surface and volume integrals, we mostly already support, and the maximum-value calculation can be performed within ParaView. The full field is effectively available in ParaView, so ultimately any point-evaluation-based postprocessing is possible; we just do not intend to support this within Palace beyond the current Probe and ParaView functionality.
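To double-check the count in point 1 yourself: a small Python sketch (a hypothetical helper, not part of Palace; MSH 2.2 ASCII format assumed) that tallies boundary triangles against volume tetrahedra in the $Elements block:

```python
# Count boundary (triangle) vs. volume (tetrahedron) elements in a
# Gmsh MSH 2.2 ASCII file. In that format, each line in $Elements is:
#   elm-number elm-type number-of-tags <tags...> <node-ids...>
# where type 2 = 3-node triangle and type 4 = 4-node tetrahedron.
def count_elements(msh_text):
    lines = msh_text.splitlines()
    counts = {"triangles": 0, "tetrahedra": 0, "other": 0}
    try:
        start = lines.index("$Elements")
    except ValueError:
        return counts  # no element section found
    n = int(lines[start + 1])  # number of element records
    for line in lines[start + 2 : start + 2 + n]:
        etype = int(line.split()[1])
        if etype == 2:
            counts["triangles"] += 1
        elif etype == 4:
            counts["tetrahedra"] += 1
        else:
            counts["other"] += 1
    return counts
```

For the mesh in this thread, `counts["triangles"] + counts["tetrahedra"]` would give the 9264 total, with 7528 tetrahedra matching what Palace reports.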

What expression are you attempting to evaluate?
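To illustrate the line-quadrature idea from point 3: a small Python sketch (a hypothetical helper, not part of Palace) that produces Gauss-Legendre points and weights along a 3D segment. The points could be entered as probe locations in the config (check the Probe documentation for the exact JSON schema), and the weights then combine the probed field values into a line integral:

```python
import math

# 3-point Gauss-Legendre rule on the reference interval [-1, 1]
# (exact for polynomials up to degree 5).
_NODES = (-math.sqrt(3.0 / 5.0), 0.0, math.sqrt(3.0 / 5.0))
_WEIGHTS = (5.0 / 9.0, 8.0 / 9.0, 5.0 / 9.0)

def segment_gauss_points(a, b):
    """Return (point, weight) pairs so that sum(w * f(p)) approximates
    the line integral of f along the segment from a to b."""
    half = [(bi - ai) / 2.0 for ai, bi in zip(a, b)]
    mid = [(ai + bi) / 2.0 for ai, bi in zip(a, b)]
    length = 2.0 * math.sqrt(sum(h * h for h in half))
    pts = []
    for t, w in zip(_NODES, _WEIGHTS):
        p = [m + t * h for m, h in zip(mid, half)]
        pts.append((p, w * length / 2.0))  # scale weight by Jacobian
    return pts
```

One would probe the field at each returned point and accumulate `weight * value` to approximate the integral, in the spirit of the HFSS field-calculator line integrals mentioned above.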

@ghazi221b
Author

Hey, thanks again for the help.

I am trying to improve my mesh workflow, and since I have always been more familiar with Cubit I am trying to use the Genesis or Nastran formats with Palace. However, the job doesn't run and I get errors reading the files for both formats:
Output_Nastran.txt
Output_Genesis.txt
Cylinder_Nastran.txt
Cylinder_Nastran.json
(GitHub won't let me attach Genesis files)

The settings I use for the formats are as follows:

Genesis
[screenshot: Genesis export settings]
Nastran
[screenshot: Nastran export settings]

Simultaneously, I have been using gmsh to explore eigenmode analysis with AMR for larger geometries. For this particular geometry, I can't get AMR to work: the first pass runs fine and gives similar results to HFSS, but then no adaptive meshing happens. I believe the issue is that the indicator norm is 'nan', so the AMR is unable to continue.

Eigenmode_AMR.json
Output_Eigenmode_AMR.txt
EM_AMR_mesh.txt

Any help on these issues would be great.

Thank you for providing the detailed response on the field-calculation abilities of Palace; I very much appreciate it. We are looking at the surface participation ratio as well, for which I think there are several methodologies 1, 2. I know they broadly do similar things, but we are interested in seeing which approach is most accurate and how they can be improved upon.
Broadly, I was interested in whether it would be possible to go inside Palace and utilise the MFEM functionality to program a bespoke expression. How feasible would such an approach be, and how much MFEM/Palace expertise would it require? I am not an expert on MFEM, but it was just an idea that perhaps you could shed some light on.

Once again thanks for all the help!

@hughcars
Collaborator

> I am trying to improve my mesh workflow, and since I have always been more familiar with Cubit I am trying to use the Genesis or Nastran formats with Palace. However, the job doesn't run and I get errors reading the files for both formats [...]

I am not familiar with the Genesis or Nastran mesh export mechanisms. The Genesis format I believe comes from Cubit, so I suggest you open an issue with MFEM about reading that mesh, because we are using their capability. For the Nastran file, I've reproduced the failure: it was caused by a small bug in treating the end of the mesh file, and a fix is available at #347 if you give that a try. I managed to read in your mesh there. The issue was that somehow during saving you were placing lots of meta characters on various lines: carriage returns, ^M, etc. I have provisionally fixed it by doing a bit more sanitization of the inputs, but you should check it.
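Until a fix lands, one workaround is to scrub the exported file before handing it to Palace. A hypothetical Python sketch (the filenames are placeholders) that drops carriage returns and other control characters of the kind described above:

```python
# Strip carriage returns and other non-printable control characters
# (the ^M noise mentioned above) from a mesh file's text, keeping
# only newlines, tabs, and printable ASCII.
def sanitize_mesh_text(text):
    keep = []
    for ch in text:
        if ch in ("\n", "\t") or 32 <= ord(ch) < 127:
            keep.append(ch)
    return "".join(keep)

# Usage, with hypothetical filenames:
# with open("Cylinder_Nastran.bdf", "r", errors="replace") as f:
#     clean = sanitize_mesh_text(f.read())
# with open("Cylinder_clean.bdf", "w", newline="\n") as f:
#     f.write(clean)
```

This is just a pre-filter; it does not change the mesh content, only removes stray bytes introduced by the exporter.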

> Simultaneously, I have been using gmsh to explore eigenmode analysis with AMR for larger geometries. For this particular geometry, I can't get AMR to work [...] I believe the issue is that the indicator norm is 'nan', so the AMR is unable to continue.

Looking at the log file here, you have a tetrahedron with an anisotropy of 434569:

            minimum     maximum
 h      1.33945e-05    0.147853
 kappa      1.05063      434569

which I think is causing the error estimation calculation, which involves a mass-matrix inversion using a Jacobi method, to go functionally singular, giving the nan. I suggest you inspect your mesh for slivers and try to fix them. This is partly an issue with the current error-estimation capability, but mostly an issue with your mesh. We have some things we are working on to try to address this type of issue, but there's no quick fix. HFSS uses a very different type of estimator (it's hard to find out exactly what, but it seems like a strong-form residual estimator, for which there's no implicit solve that could throw a nan).
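A cheap way to hunt for such slivers outside of Palace is to compare each tetrahedron's longest edge to its inradius. This is a hypothetical helper, not Palace's kappa computation (which comes from the element Jacobians), but it flags the same kind of degenerate element:

```python
import math

def tet_quality(a, b, c, d):
    """Longest edge / inradius for a tetrahedron with vertices a, b, c, d.
    A regular tet gives 2*sqrt(6) ~ 4.9; slivers give huge values."""
    def sub(p, q):
        return [pi - qi for pi, qi in zip(p, q)]
    def cross(u, v):
        return [u[1]*v[2] - u[2]*v[1],
                u[2]*v[0] - u[0]*v[2],
                u[0]*v[1] - u[1]*v[0]]
    def norm(u):
        return math.sqrt(sum(x * x for x in u))
    ab, ac, ad = sub(b, a), sub(c, a), sub(d, a)
    # volume = |ab . (ac x ad)| / 6
    vol = abs(sum(ab[i] * cross(ac, ad)[i] for i in range(3))) / 6.0
    # inradius = 3 V / (total surface area)
    faces = [(a, b, c), (a, b, d), (a, c, d), (b, c, d)]
    area = sum(norm(cross(sub(q, p), sub(r, p))) / 2.0 for p, q, r in faces)
    inradius = 3.0 * vol / area
    hmax = max(norm(sub(p, q)) for p, q in
               [(a, b), (a, c), (a, d), (b, c), (b, d), (c, d)])
    return hmax / inradius
```

Looping this over the connectivity of a mesh and printing the worst few elements usually points straight at the sliver region that needs remeshing.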

> Thank you for providing the detailed response on the field calculation abilities of Palace. [...] I was interested in whether it would be possible to go inside Palace and utilise the MFEM functionality to program a bespoke expression.

On alternative surface-participation-ratio calculations: all of those are performed in surfacepostoperator.cpp, and you're free to modify the statements in there, which come from J. Wenner et al., "Surface loss simulations of superconducting coplanar waveguide resonators," Appl. Phys. Lett. (2011). You would need to modify the source code, but you ought to be able to get pretty good results by hacking in there, provided you don't alter the formulas so radically that you change the effectiveness of the quadrature.
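For orientation, here is a schematic Python sketch of a thin-layer participation estimate in the spirit of that reference. This is NOT Palace's implementation; in particular it ignores the different dielectric scaling of normal vs. tangential field components inside the layer, and the sampled fields and areas are assumed inputs:

```python
# Schematic thin-layer surface participation estimate. Given |E|^2
# sampled on a surface with quadrature areas dA, a thin layer of
# thickness t and relative permittivity eps_r stores roughly
#   U_layer ~ (eps0 * eps_r / 2) * t * sum(|E|^2 * dA),
# and the participation ratio is p = U_layer / U_total.
EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m

def participation_ratio(eps_r, thickness, E2_samples, areas, total_energy):
    u_layer = 0.5 * EPS0 * eps_r * thickness * sum(
        e2 * da for e2, da in zip(E2_samples, areas))
    return u_layer / total_energy
```

Comparing methodologies then amounts to swapping how `E2_samples` is obtained (which field component, on which side of the interface), which is exactly the part of surfacepostoperator.cpp the comment above suggests modifying.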
