Conversation
Pull request overview
Adds a new “parametric” AD interface to NLPModels so that model parameters can remain implicit (not lifted to variables), enabling more efficient implicit differentiation and parametric bounds support.
Changes:
- Introduces `ParametricNLPModelMeta` to describe parameter dimension, sparsity, and availability flags for param-derivative routines.
- Adds a new parametric API (`grad_param`, `jac_param_*`, `hess_param_*`, `jpprod`/`jptprod`, `hpprod`/`hptprod`, and param-derivatives for variable/constraint bounds).
- Wires the new API/meta into the module via new `include(...)` statements.
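To make the shape of this API concrete, here is a minimal, self-contained sketch of a toy model using the pattern the PR describes: a metadata record with a parameter dimension and availability flags, and a `grad_param`-style routine that reads parameters stored on the model. The struct fields and the stand-in types are illustrative assumptions, not the actual NLPModels definitions.

```julia
# Minimal stand-in for the metadata described above (illustrative, not the
# real ParametricNLPModelMeta definition).
struct ParamMetaSketch
    nparam::Int          # parameter dimension
    grad_param::Bool     # availability flag for the parameter gradient
end

# Toy model with f(x; p) = 0.5 * sum((x .- p).^2); parameters are stored
# internally rather than appearing in the variable vector.
mutable struct ToyParamModel
    p::Vector{Float64}          # current parameter values, kept implicit
    param_meta::ParamMetaSketch
end

ToyParamModel(p) = ToyParamModel(p, ParamMetaSketch(length(p), true))

# Parameter gradient: ∂f/∂p = -(x - p); note it reads `p` from the model.
grad_param_sketch(m::ToyParamModel, x) = -(x .- m.p)

m = ToyParamModel([1.0, 2.0])
g = grad_param_sketch(m, [3.0, 5.0])
# g == [-2.0, -3.0]
```

The availability flag mirrors the "capabilities" idea: a solver can check the meta before calling a param-derivative routine.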
Reviewed changes
Copilot reviewed 3 out of 3 changed files in this pull request and generated 3 comments.

| File | Description |
|---|---|
| `src/nlp/param_meta.jl` | Adds metadata struct + autogenerated getters for parametric derivative capabilities. |
| `src/nlp/param_api.jl` | Adds the public parametric AD API (allocating wrappers + `*_!` extension points). |
| `src/NLPModels.jl` | Includes the new parametric API and meta files in the module. |
There are currently two implementations: https://github.com/klamike/ExaModels.jl/tree/mk/param_ad and https://github.com/klamike/MadNLP.jl/tree/mk/moi_param
This is ready for review/merge now. I've added docs, tests, and an API for get/set parameter values.
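A get/set-parameter API like the one mentioned here could look as follows. This is a hedged sketch; the function names `set_parameter!`/`get_parameter` and the model type are assumptions and may differ from what the PR actually defines.

```julia
# Toy container standing in for a parametric model (illustrative only).
mutable struct ParamBox
    p::Vector{Float64}   # current parameter values
end

# Setter: copy new values into the model's internal storage in place.
function set_parameter!(m::ParamBox, p::AbstractVector)
    length(p) == length(m.p) || throw(DimensionMismatch("parameter length"))
    m.p .= p
    return m
end

# Getter: return a copy so callers cannot mutate internal state by accident.
get_parameter(m::ParamBox) = copy(m.p)

m = ParamBox(zeros(2))
set_parameter!(m, [4.0, 5.0])
# get_parameter(m) == [4.0, 5.0]
```

Returning a copy from the getter is a design choice: it keeps the internal vector private, at the cost of one allocation per call.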
> Note that `p` does not appear as an explicit argument to any of the functions below.
> Implementations are responsible for storing the current parameter values internally (e.g., as a field of the model struct) and reading them when evaluating the functions.
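The quoted convention, combined with the "allocating wrapper + in-place extension point" layering mentioned in the review table, can be sketched as below. The `grad_param!`/`grad_param` names follow the PR; the model type, objective, and wrapper body are illustrative assumptions.

```julia
# Toy parametric model: f(x; p) = 0.5 * sum((x .- p).^2).
# `p` lives on the model struct and is never passed as an argument.
mutable struct QuadModel
    p::Vector{Float64}
end

# In-place extension point an implementation would overload:
# ∇ₚ f = p - x, read `p` from the model's internal storage.
function grad_param!(m::QuadModel, x::AbstractVector, gp::AbstractVector)
    gp .= m.p .- x
    return gp
end

# Allocating wrapper, as in the API layer: allocate, then delegate.
grad_param(m::QuadModel, x::AbstractVector) = grad_param!(m, x, similar(m.p))

m = QuadModel([1.0, 1.0])
gp = grad_param(m, [0.0, 3.0])
# gp == [1.0, -2.0]
```

Keeping `p` implicit means hot loops (e.g., inside a solver) can call the in-place `grad_param!` with a preallocated buffer and no per-call parameter plumbing.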
I should add a section on the new meta fields.
Friendly ping. Is there any interest in merging this, or should I go back to the separate repo approach?
Closing in favor of the separate repo approach. @dpo let me know how you want to do it. I'm thinking you should create the repo so that it has all the template/codecov/actions set up the way you like?
This is needed to support efficient implicit differentiation, e.g. https://github.com/MadNLP/MadDiff.jl.
The idea is to keep parameters fully "implicit" in the eyes of NLPModels, instead of lifting them to "special variables" as in JuMP/DiffOpt. While the lifted approach allows reusing the existing AD APIs (slicing out the parametric parts), the implicit approach should be more efficient. It also allows variable/constraint bounds to be parametric.
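The lifted-vs-implicit distinction can be illustrated on a toy constraint c(x; p) = x .* p. This sketch is an assumption-laden illustration of the two strategies, not code from either library: "lifted" builds the full Jacobian with respect to z = [x; p] and slices out the parameter columns, while "implicit" evaluates only the parameter block.

```julia
using LinearAlgebra

nx, np = 3, 3
x, p = [1.0, 2.0, 3.0], [4.0, 5.0, 6.0]

# Lifted approach: treat z = [x; p] as variables. For c(x; p) = x .* p,
# ∂c/∂x = Diagonal(p) and ∂c/∂p = Diagonal(x); the full Jacobian wrt z is
# their horizontal concatenation, from which we slice the parameter columns.
J_full = hcat(Diagonal(p), Diagonal(x))
Jp_lifted = Matrix(J_full)[:, nx+1:nx+np]

# Implicit approach: compute only the nx×np parameter block, directly.
Jp_implicit = Matrix(Diagonal(x))

# Both agree, but the implicit path never materializes the variable block.
Jp_lifted == Jp_implicit
```

On this toy example the saving is trivial, but for large sparse models the lifted route forces the AD backend to carry the extra columns through every evaluation before discarding them.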
cc @amontoison
closes #532