
Update PETSc_jll with more solvers #146

Closed
3 tasks
jkozdon opened this issue Jul 14, 2021 · 8 comments
Comments

@jkozdon
Member

jkozdon commented Jul 14, 2021

JuliaPackaging/Yggdrasil#3249 does not add the following libraries, so this should be revisited at some point

  • add mumps
  • add superlu
  • add superlu_dist

see: JuliaPackaging/Yggdrasil#3249 (comment)

@amartinhuertas
Contributor

Hi @jkozdon! Is there any chance that this issue gets sorted out soon? We are interested in having support for these libraries in PETSc_jll.

@jkozdon
Member Author

jkozdon commented Dec 3, 2021

Hi @amartinhuertas!

I haven't looked at this any further; I'd be happy for someone to take this on!

In theory you can build a custom PETSc instance with this enabled.
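For anyone attempting such a custom build, the relevant PETSc configure flags would look roughly like the following (a sketch only; the prefix and MPI paths are placeholders, and the actual Yggdrasil build script options may differ):

```shell
# Sketch: configure a local PETSc build with the extra solvers enabled.
# --download-scalapack is included because MUMPS depends on ScaLAPACK.
./configure \
    --prefix=$HOME/petsc-custom \
    --with-mpi-dir=/usr/lib/openmpi \
    --download-mumps \
    --download-scalapack \
    --download-superlu \
    --download-superlu_dist
```

If I recall the mechanism correctly, PETSc.jl can then be pointed at the resulting library via the `JULIA_PETSC_LIBRARY` environment variable.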

In all honesty, PETSc.jl development has stalled since my fall teaching started... I'm hoping to get back to it in the new year, when I should have a bit more time. (And at least get #149 merged.)

A bigger problem that needs to be sorted out is the one raised in JuliaPackaging/Yggdrasil#3801, where our current mechanism for supporting multiple PETSc libraries will fail.

@amartinhuertas
Contributor

Thanks for your detailed response, @jkozdon!

On another note, we are also developing our own PETSc wrappers, in particular here: https://github.com/gridap/GridapPETSc.jl. These include some macros to generate the C wrappers (they do not actually rely on the CBinding.jl package), as well as a way to handle the deallocation of PETSc objects in a parallel context that is GC friendly.

@jkozdon
Member Author

jkozdon commented Dec 7, 2021

@amartinhuertas I will have to take a look at what you all have done.

I wonder if the efforts could / should be combined?

It seems to me having two different PETSc wrappers floating around is not such a good thing.

You all also have a more active development team, which this package would certainly benefit from. If you'd like to chat about this possibility sometime, I'd be game.

@amartinhuertas
Contributor

Please note that GridapPETSc.jl is not intended solely to be Julia wrappers for PETSc (although it has its own wrappers to PETSc, just as PETSc.jl does, and thus there is some overlap here). It implements some of the abstract interfaces of our finite element packages Gridap.jl/GridapDistributed.jl, so that we can leverage them with PETSc as a solver library.

Perhaps @fverdugo and @stevengj may also want to contribute to this discussion.

@jkozdon
Member Author

jkozdon commented Dec 7, 2021

> Please note that GridapPETSc.jl is not intended solely to be Julia wrappers for PETSc

That's fair and I understand.

Sounds like the two efforts should remain separate.

I took a look at the GridapPETSc.jl parallel garbage collection. I had thought of doing something similar for PETSc.jl and worried that it could still result in deadlock situations, but rethinking it, I now believe it's OK and not a bad way to go.

I was trying to think of a situation where you would end up accumulating PETSc objects that need to be cleaned up in performance-critical pieces of code, but you shouldn't be destroying objects that require collective clean-up in performance-critical code anyway.

I'll have to consider putting this into PETSc.jl when I get some time to work on it again. Thanks for the suggestion.
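For context, the pattern under discussion can be sketched in a few lines of Julia. This is a model of the idea only, not GridapPETSc.jl's actual implementation: the type and function names here are stand-ins, and the real code would call the collective PETSc `XxxDestroy` routines. The key point is that a finalizer runs at an unpredictable time on a single MPI rank, so it must not make a collective call itself; instead it records the handle for later, collective destruction at a known-safe point.

```julia
# Sketch of GC-friendly deallocation for collectively-destroyed objects.
# `PetscObjLike` and `drain_pending_destroys!` are hypothetical names.

const pending_destroys = Vector{Int}()

mutable struct PetscObjLike
    handle::Int          # stands in for a PETSc C pointer
    function PetscObjLike(h::Int)
        obj = new(h)
        # A collective destroy here could deadlock, since finalizers fire
        # at arbitrary times on individual ranks. Just record the handle.
        finalizer(obj) do o
            push!(pending_destroys, o.handle)
        end
        return obj
    end
end

# Called collectively (by all MPI ranks) at a safe point, e.g. between solves:
function drain_pending_destroys!()
    for h in pending_destroys
        # ... here the real code would call the collective PETSc destroy ...
    end
    empty!(pending_destroys)
end
```

One subtlety this sketch glosses over: ranks must agree on which handles get destroyed in which order, so a real implementation coordinates the drain rather than trusting each rank's local queue ordering.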

@ViralBShah
Member

@boriskaus I believe these solvers are now added to PETSc_jll. Should we close this issue or update its title?

@boriskaus
Collaborator

Yes, these are now included in PETSc_jll. The problem of interacting with the GC remains, but there is a separate issue for that.
