Merged
9 changes: 9 additions & 0 deletions conda_package/docs/api.rst
@@ -76,6 +76,15 @@ Mesh conversion
compute_mpas_flood_fill_mask
compute_lon_lat_region_masks

.. currentmodule:: mpas_tools.mesh.spherical

.. autosummary::
:toctree: generated/

recompute_angle_edge
calc_edge_normal_vector
calc_vector_east_north

.. currentmodule:: mpas_tools.merge_grids

.. autosummary::
4 changes: 2 additions & 2 deletions conda_package/docs/authors.rst
@@ -23,5 +23,5 @@ Contributors
* Phillip J. Wolfram
* Tong Zhang

For a list of all the contributions:
https://github.com/MPAS-Dev/MPAS-Tools/graphs/contributors
For a list of all contributions, see the
`contributors graph <https://github.com/MPAS-Dev/MPAS-Tools/graphs/contributors>`_.
2 changes: 1 addition & 1 deletion conda_package/docs/building_docs.rst
@@ -26,4 +26,4 @@ To preview the documentation locally, open the ``index.html`` file in the
   cd _build/html
   python -m http.server 8000

Then, open http://0.0.0.0:8000/master/ in your browser.
Then, open `<http://0.0.0.0:8000/master/>`_ in your browser.
12 changes: 6 additions & 6 deletions conda_package/docs/making_changes.rst
@@ -21,9 +21,9 @@ things like whitespace at the end of lines.
The first time you set up the ``mpas_tools_dev`` environment, you will need to set up
``pre-commit``. This is done by running:

```bash
pre-commit install
```
.. code-block:: bash

   pre-commit install

You only need to do this once when you create the ``mpas_tools_dev``
environment. If you create a new version of ``mpas_tools_dev``, then you will
@@ -43,9 +43,9 @@ PEP8 compliance, as well as sort, check and format imports,
f-strings, and `mypy <https://mypy-lang.org/>`_ to check for consistent variable
types. An example error might be:

```bash
example.py:77:1: E302 expected 2 blank lines, found 1
```
.. code-block:: bash

   example.py:77:1: E302 expected 2 blank lines, found 1

For this example, we would just add an additional blank line after line 77 and
try the commit again to make sure we've resolved the issue.
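Concretely, the fix for this E302 example is to put two blank lines between top-level definitions (a minimal sketch; the function names are hypothetical, for illustration only):

```python
def load_mesh():
    """A hypothetical first top-level function."""
    return 'mesh'


# E302 is satisfied: two blank lines separate top-level definitions
def cull_mesh():
    """A hypothetical second top-level function."""
    return 'culled'
```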
5 changes: 5 additions & 0 deletions conda_package/docs/mesh_conversion.rst
@@ -83,6 +83,11 @@ The converter also generates a ``graph.info`` file for graph partitioning
tools (e.g., Metis). In Python, this file is only written if the
``graphInfoFileName`` argument is provided.

For spherical meshes, the :py:mod:`mpas_tools.mesh.spherical` module provides
Python utilities for recomputing ``angleEdge`` and related local east/north
geometry directly from the mesh coordinates. This can be useful for
verification and diagnostics after conversion.
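As a sketch of the kind of verification this enables: two ``angleEdge`` fields that agree only modulo 2*pi compare as equal under a complex-exponential difference. Plain numpy, with synthetic angles standing in for real mesh data:

```python
import numpy as np

# Synthetic stand-ins for a stored and a recomputed angleEdge field;
# they differ only by whole multiples of 2*pi
angle_stored = np.array([0.1, -3.0, 3.1])
angle_recomputed = angle_stored + 2.0 * np.pi * np.array([0.0, 1.0, -1.0])

# Wrap-safe difference: mapping through a unit complex number makes
# angles that agree modulo 2*pi give a difference of (nearly) zero
angle_diff = np.angle(np.exp(1j * (angle_recomputed - angle_stored)))

assert np.max(np.abs(angle_diff)) < 1.0e-12
```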

.. _cell_culler:

Cell Culler
18 changes: 9 additions & 9 deletions conda_package/docs/releasing.rst
@@ -17,8 +17,8 @@ Version Bump and Dependency Updates
- ``conda_package/mpas_tools/__init__.py``
- ``conda_package/recipe/recipe.yaml``

- Make sure the version follows semantic versioning (see
https://semver.org/).
- Make sure the version follows
`semantic versioning <https://semver.org/>`_.
For release candidates, use versions like ``1.3.0rc1`` (no ``v`` prefix).

2. **Check and Update Dependencies**
@@ -38,7 +38,7 @@ Version Bump and Dependency Updates
- The dependencies in ``recipe.yaml`` are the ones that will be used for the
released package on conda-forge. The dependencies in ``pyproject.toml``
are for PyPI and should be kept in sync as much as possible but are only
there as a sanity check when we run ```pip check``. The ``dev-spec.txt``
there as a sanity check when we run ``pip check``. The ``dev-spec.txt``
file should include all dependencies needed for development and testing,
and ``pixi.toml`` should remain equivalent for pixi users.

@@ -100,8 +100,7 @@ Tagging and Publishing a Release Candidate
- Update dependencies if needed

- Commit, push to a new branch, and open a PR **against the ``dev`` branch**
of the feedstock:
https://github.com/conda-forge/mpas_tools-feedstock
of the `mpas_tools-feedstock <https://github.com/conda-forge/mpas_tools-feedstock>`_.

- Follow any instructions in the PR template and merge once approved

@@ -112,7 +111,7 @@ Publishing a Stable Release

- For stable releases, create a GitHub release page as follows:

- Go to https://github.com/MPAS-Dev/MPAS-Tools/releases
- Go to `the GitHub releases page <https://github.com/MPAS-Dev/MPAS-Tools/releases>`_

- Click "Draft a new release"

@@ -127,8 +126,8 @@

7. **Updating the conda-forge Feedstock for a Stable Release**

- Wait for the ``regro-cf-autotick-bot`` to open a PR at:
https://github.com/conda-forge/mpas_tools-feedstock
- Wait for the ``regro-cf-autotick-bot`` to open a PR at the
`mpas_tools-feedstock repository <https://github.com/conda-forge/mpas_tools-feedstock>`_.

- This may take several hours to a day.

@@ -137,7 +136,8 @@ Publishing a Stable Release
- Merge once CI checks pass

**Note:** If you are impatient, you can accelerate this process by creating
a bot issue at: https://github.com/conda-forge/mpas_tools-feedstock/issues
a bot issue at the
`mpas_tools-feedstock issues page <https://github.com/conda-forge/mpas_tools-feedstock/issues>`_
with the subject ``@conda-forge-admin, please update version``. This
will open a new PR with the version within a few minutes.

23 changes: 18 additions & 5 deletions conda_package/docs/testing_changes.rst
@@ -25,18 +25,30 @@ command:
   cd conda_package
   pixi install
   pixi shell
   rattler-build build -m ci/linux_64_python3.14.____cpython.yaml -r recipe/ --output-dir ../output
   rattler-build build -m ci/linux_64_python3.14.____cpython.yaml -r recipe/ --output-dir output

This writes package artifacts to ``output/`` in the repository root.
This writes package artifacts to ``output/`` under ``conda_package``.

To install the locally built package into the pixi environment, add the local
build output as a channel and then add ``mpas_tools`` from that channel:

.. code-block:: bash

   cd conda_package
   pixi workspace channel add "file://$PWD/../output"
   pixi add --platform linux-64 "mpas_tools [channel='file://$PWD/../output']"
   pixi workspace channel add "file://$PWD/output"
   pixi add --platform linux-64 "mpas_tools [channel='file://$PWD/output']"

.. important::

   pixi, like other conda-family package managers, identifies a package by
   its name, version and build string. If you rebuild a local package without
   changing that identity, pixi may continue using an older cached artifact
   even if the file in ``output/`` has changed.

   If you rebuild ``mpas_tools`` locally and need pixi to pick up the new
   package contents reliably, bump the conda recipe build number in
   ``conda_package/recipe/recipe.yaml`` before rebuilding. For Python-only
   development, ``pixi run install-editable`` is often more convenient.
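For example, the build number lives under the ``build`` section of the recipe (a hypothetical fragment; the exact layout follows the rattler-build recipe schema):

```yaml
# conda_package/recipe/recipe.yaml (fragment)
build:
  number: 1  # bumped from 0 so pixi sees a new package identity
```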

.. warning::

@@ -75,7 +87,8 @@ Then run tools within the pixi shell (for example ``pytest``).

A useful hybrid workflow is to install the latest release conda package
first (to get compiled tools), then install your branch in editable mode on
top for Python development.
top for Python development. This also avoids the need to bump the conda
build number for every local Python-only rebuild.

Legacy Method: Conda Editable Install
*************************************
2 changes: 1 addition & 1 deletion conda_package/docs/visualization.rst
@@ -49,7 +49,7 @@ centers to triangle nodes. ``dsTris`` includes variables ``triCellIndices``,
the cell that each triangle is part of; ``nodeCellIndices`` and
``nodeCellWeights``, the indices and weights used to interpolate from MPAS cell
centers to triangle nodes; Cartesian coordinates ``xNode``, ``yNode``, and
``zNode``; and ``lonNode``` and ``latNode`` in radians. ``lonNode`` is
``zNode``; and ``lonNode`` and ``latNode`` in radians. ``lonNode`` is
guaranteed to be within 180 degrees of the cell center corresponding to
``triCellIndices``. Nodes always have a counterclockwise winding.

105 changes: 105 additions & 0 deletions conda_package/mpas_tools/mesh/spherical.py
@@ -0,0 +1,105 @@
import numpy as np
import xarray as xr

from mpas_tools.transects import lon_lat_to_cartesian


def recompute_angle_edge(ds_mesh):
    """
    Recompute ``angleEdge`` from edge and vertex locations on the sphere.

    Parameters
    ----------
    ds_mesh : xarray.Dataset
        An MPAS spherical mesh dataset containing edge and vertex locations.

    Returns
    -------
    angle_edge : xarray.DataArray
        ``angleEdge`` recomputed from spherical geometry.
    """
    normal_east_north = calc_edge_normal_vector(ds_mesh)
    angle_edge = xr.zeros_like(ds_mesh.angleEdge)
    angle_edge.values = np.arctan2(
        normal_east_north[:, 1], normal_east_north[:, 0]
    )
    return angle_edge


def calc_edge_normal_vector(ds_mesh):
    """
    Compute edge-normal vectors projected onto local east/north coordinates.

    Parameters
    ----------
    ds_mesh : xarray.Dataset
        An MPAS spherical mesh dataset containing edge and vertex locations.

    Returns
    -------
    normal_east_north : numpy.ndarray
        A ``(nEdges, 2)`` array of unit normal vectors in local east/north
        coordinates.
    """
    edge_cartesian = np.array(
        lon_lat_to_cartesian(
            ds_mesh.lonEdge, ds_mesh.latEdge, 1.0, degrees=False
        )
    )

    # verticesOnEdge uses 1-based indexing; convert to 0-based
    vertex_1 = ds_mesh.verticesOnEdge.isel(TWO=0).values - 1
    vertex_2 = ds_mesh.verticesOnEdge.isel(TWO=1).values - 1

    lon_vertex_1 = ds_mesh.lonVertex.isel(nVertices=vertex_1)
    lat_vertex_1 = ds_mesh.latVertex.isel(nVertices=vertex_1)
    lon_vertex_2 = ds_mesh.lonVertex.isel(nVertices=vertex_2)
    lat_vertex_2 = ds_mesh.latVertex.isel(nVertices=vertex_2)

    vertex_1_cartesian = np.array(
        lon_lat_to_cartesian(lon_vertex_1, lat_vertex_1, 1.0, degrees=False)
    )
    vertex_2_cartesian = np.array(
        lon_lat_to_cartesian(lon_vertex_2, lat_vertex_2, 1.0, degrees=False)
    )

    # the edge normal is perpendicular to both the vertex-to-vertex vector
    # and the radial direction at the edge center
    dvertex_cartesian = vertex_2_cartesian - vertex_1_cartesian
    normal_cartesian = np.cross(dvertex_cartesian, edge_cartesian, axis=0)

    edge_east, edge_north = calc_vector_east_north(
        edge_cartesian[0, :], edge_cartesian[1, :], edge_cartesian[2, :]
    )

    normal_east_north = np.zeros((ds_mesh.sizes['nEdges'], 2))
    normal_east_north[:, 0] = np.sum(edge_east * normal_cartesian, axis=0)
    normal_east_north[:, 1] = np.sum(edge_north * normal_cartesian, axis=0)

    # normalize, guarding against degenerate (zero-length) normals
    norm = np.linalg.norm(normal_east_north, axis=1)
    nonzero = norm > 0.0
    normal_east_north[nonzero, :] /= norm[nonzero, np.newaxis]

    return normal_east_north


def calc_vector_east_north(x, y, z):
    """
    Compute local east and north unit vectors on the sphere.

    Parameters
    ----------
    x, y, z : numpy.ndarray
        Cartesian coordinates of points on the unit sphere.

    Returns
    -------
    east, north : tuple of numpy.ndarray
        Local east and north unit vectors, each with shape ``(3, nPoints)``.
    """
    axis = np.array([0.0, 0.0, 1.0])
    xyz = np.stack((x, y, z), axis=1)
    east = np.cross(axis, np.transpose(xyz), axis=0)
    north = np.cross(np.transpose(xyz), east, axis=0)

    east /= np.linalg.norm(east, axis=0)
    north /= np.linalg.norm(north, axis=0)

    return east, north
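A standalone sanity check of the east/north construction above, reimplemented here in plain numpy so it runs without ``mpas_tools`` (points are kept away from the poles, where the cross product with the rotation axis vanishes):

```python
import numpy as np


def _east_north(x, y, z):
    # Mirror of calc_vector_east_north: east = z_hat x r, north = r x east
    axis = np.array([0.0, 0.0, 1.0])
    xyz = np.stack((x, y, z), axis=0)
    east = np.cross(axis, xyz, axis=0)
    north = np.cross(xyz, east, axis=0)
    east /= np.linalg.norm(east, axis=0)
    north /= np.linalg.norm(north, axis=0)
    return east, north


# A few off-pole points on the unit sphere (lon/lat in radians)
lon = np.array([0.0, 1.0, -2.0])
lat = np.array([0.3, -0.7, 1.2])
x = np.cos(lat) * np.cos(lon)
y = np.cos(lat) * np.sin(lon)
z = np.sin(lat)

east, north = _east_north(x, y, z)
r = np.stack((x, y, z), axis=0)

# east and north are unit length and orthogonal to each other and to r
assert np.allclose(np.sum(east * north, axis=0), 0.0, atol=1e-12)
assert np.allclose(np.sum(east * r, axis=0), 0.0, atol=1e-12)
assert np.allclose(np.linalg.norm(north, axis=0), 1.0, atol=1e-12)
```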
18 changes: 18 additions & 0 deletions conda_package/tests/test_conversion.py
@@ -1,9 +1,11 @@
#!/usr/bin/env python

import matplotlib
import numpy as np
import xarray

from mpas_tools.io import write_netcdf
from mpas_tools.mesh.conversion import convert, cull, mask
from mpas_tools.mesh.spherical import recompute_angle_edge

from .util import get_test_data_file

@@ -30,5 +32,21 @@ def test_conversion():
    write_netcdf(dsMask, 'antarctic_mask.nc')


def test_conversion_angle_edge():
    ds_mesh = xarray.open_dataset(
        get_test_data_file('mesh.QU.1920km.151026.nc')
    )
    ds_mesh = convert(dsIn=ds_mesh)

    angle_edge_python = recompute_angle_edge(ds_mesh)
    angle_diff = np.angle(
        np.exp(1j * (angle_edge_python.values - ds_mesh.angleEdge.values))
    )

    assert np.all(np.isfinite(angle_diff))
    assert np.max(np.abs(angle_diff)) < 1.0e-10


if __name__ == '__main__':
    test_conversion()
    test_conversion_angle_edge()