Getting started

Installation

GridapTopOpt.jl and its additional dependencies can be installed in an existing Julia environment using the package manager. This is accessed in the Julia REPL (read-eval-print loop) by pressing ]. We then add the required packages via:

pkg> add GridapTopOpt, Gridap, GridapDistributed, GridapPETSc, GridapSolvers, GridapEmbedded, PartitionedArrays, SparseMatricesCSR
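
Alternatively, the same packages can be added programmatically via the Pkg API. A minimal sketch, assuming they are to be installed into the currently active environment:

using Pkg
# Install GridapTopOpt.jl and its supporting packages into the active environment
Pkg.add(["GridapTopOpt", "Gridap", "GridapDistributed", "GridapPETSc",
         "GridapSolvers", "GridapEmbedded", "PartitionedArrays", "SparseMatricesCSR"])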

Once installed, serial driver scripts can be run immediately, whereas parallel problems also require an MPI installation.
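
For example, assuming the packages above are installed in the active project, a serial driver script can be launched directly from a terminal (the script path below is illustrative):

julia --project=. scripts/main.jl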

MPI

For basic users, MPI.jl provides such an implementation together with mpiexecjl, a Julia wrapper for mpiexec (the MPI executor). These are installed via:

pkg> add MPI
julia> using MPI
julia> MPI.install_mpiexecjl()
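
By default, MPI.install_mpiexecjl() places the wrapper in the bin directory of the first Julia depot (typically ~/.julia/bin). Assuming this default location, the wrapper can be made visible to the shell with, for example:

export PATH="$HOME/.julia/bin:$PATH"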

Once the mpiexecjl wrapper has been added to the system PATH, MPI scripts can be executed in a terminal via

mpiexecjl -n P julia main.jl

where main.jl is a driver script and P denotes the number of processors.
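
A parallel driver script typically wraps the problem setup in the with_mpi function from PartitionedArrays, which distributes the problem over the available MPI ranks. A minimal sketch, assuming 4 processors and leaving the problem definition schematic:

using GridapTopOpt, Gridap, GridapDistributed, PartitionedArrays

with_mpi() do distribute
  # Distribute the problem over the MPI ranks (here assuming P = 4)
  ranks = distribute(LinearIndices((4,)))
  # ... build the distributed model, state equations and optimiser here ...
end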

PETSc

In GridapTopOpt.jl we rely on the GridapPETSc.jl satellite package to interface with the linear and nonlinear solvers provided by the PETSc (Portable, Extensible Toolkit for Scientific Computation) library. For basic users these solvers are available through GridapPETSc.jl with no additional setup.
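
As an illustration, a PETSc-based linear solver is typically set up by wrapping the solve in GridapPETSc.with and passing the desired PETSc options as command-line-style arguments. A minimal sketch, with illustrative solver options:

using GridapPETSc

options = "-ksp_type cg -pc_type gamg -ksp_rtol 1.0e-10"
GridapPETSc.with(args=split(options)) do
  solver = PETScLinearSolver()
  # ... pass `solver` to the state map/optimiser as usual ...
end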

Advanced installation

For more advanced installations, such as using a custom MPI/PETSc installation on an HPC cluster, we refer the reader to the discussion for GridapPETSc.jl and the configuration page for MPI.jl.
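
As a rough sketch of what such a configuration involves: MPI.jl can be pointed at a system MPI library via MPIPreferences, and GridapPETSc.jl locates a custom PETSc build through the JULIA_PETSC_LIBRARY environment variable (read when GridapPETSc.jl is built). The paths and calls below are illustrative only; consult the linked documentation for the authoritative procedure.

using MPIPreferences
# Switch MPI.jl to the system-provided MPI binaries
MPIPreferences.use_system_binary()

# Point GridapPETSc.jl to a custom PETSc build (illustrative path),
# then rebuild GridapPETSc.jl so the setting takes effect
ENV["JULIA_PETSC_LIBRARY"] = "/path/to/petsc/lib/libpetsc.so"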

Usage and tutorials

In order to get familiar with the library we recommend following the numerical examples described in:

Wegert, Z.J., Manyer, J., Mallon, C.N. et al. GridapTopOpt.jl: a scalable Julia toolbox for level set-based topology optimisation. Struct Multidisc Optim 68, 22 (2025). https://doi.org/10.1007/s00158-024-03927-3

Please note that there have been several breaking releases since the above publication was first submitted; see Breaking Releases for a breakdown of these.

In addition to these, several driver scripts are available in /scripts/.., along with some examples in this documentation.

More general tutorials for familiarising oneself with Gridap are available via the Gridap Tutorials.

Known issues

  • PETSc's GAMG preconditioner breaks for split Dirichlet DoFs (e.g., the x component constrained while the y component is free for a single node). There is no simple fix for this; we recommend instead using MUMPS or another preconditioner in this case (see the sketch after this list).
  • Analytic gradient breaks in parallel for integrals of certain measures – Issue #46
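
For reference, the MUMPS-based direct solve suggested in the first item can be requested through the PETSc options passed to GridapPETSc.with, as shown earlier. The option string below is standard PETSc and is not specific to GridapTopOpt.jl:

# Direct LU factorisation via MUMPS instead of GAMG
options = "-ksp_type preonly -pc_type lu -pc_factor_mat_solver_type mumps"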