Commit e412eb6

Merge pull request #324 from ecmwf/develop
1.0.27
2 parents 5b5c882 + cd7524a commit e412eb6

12 files changed: +1874 −28 lines changed


docs/Service/Data_Portfolio.md

+81-5
@@ -1,10 +1,10 @@
 # Data Portfolio
 
-Polytope feature extraction only has access to data that is stored on an FDB. The dataset currently available via Polyope feature extraction is the operational forecast. We plan to add Destination Earth Digital Twin data in the future.
+Polytope feature extraction only has access to data that is stored on an FDB. The datasets currently available via Polytope feature extraction are the operational ECMWF forecast, as well as the data produced by the Destination Earth Extremes and Climate digital twins.
 
 ## Operational Forecast Data
 
-The following values available for each field specified are:
+The following key value pairs are available via Polytope:
 
 * `class` : `od`
 * `stream` : `enfo` `oper`
@@ -18,7 +18,7 @@ If `type` is `enfo`:
 
 * `number` : `0/to/50`
 
-If `levtype` is `pl` or `ml` a `levelist` must be provided:
+If `levtype` is `pl` or `ml`, a `levelist` must be provided:
 
 * `levelist` : `1/to/1000`
 
@@ -42,10 +42,86 @@ If `levtype` is `pl` or `ml` a `levelist` must be provided:
 * `crwe`
 * `ttpha`
 
-For `sfc` most `params` will be available but not all.
+For `sfc`, most `params` will be available but not all.
 
-Only data that is contained in the operational FDB can be requested via Polytope feature extraction, the FDB usually only contains the last two days of forecasts.
+Only data that is contained in the operational FDB can be requested via Polytope feature extraction. The FDB usually only contains the last two days of forecasts.
 
 We sometimes limit the size of requests for area features such as bounding box and polygon to maintain quality of service.
 
 Access to operational data is limited by our release schedule.
+
+
+## Extremes DT Data
+
+The following key value pairs are available via Polytope:
+
+* `class` : `d1`
+* `dataset` : `extremes-dt`
+* `stream` : `oper` `wave`
+* `type` : `fc`
+* `levtype` : `sfc` `pl` `hl`
+* `expver` : `0001`
+* `domain` : `g`
+* `step` : `0/to/96`
+
+If `levtype` is `pl`, a `levelist` must be provided:
+
+* `levelist` : `1/to/1000`
+
+If `levtype` is `hl`, a `levelist` must be provided:
+
+* `levelist` : `100`
+
+`pl` and `hl` also only contain a subset of the parameters that are available in grid point data. These are:
+
+* `pl`
+    * `Geopotential`
+    * `Temperature`
+    * `U component of wind`
+    * `V component of wind`
+    * `Specific humidity`
+    * `Relative humidity`
+* `hl`
+    * `100 metre U wind component`
+    * `100 metre V wind component`
+
+For `sfc`, most `params` are available.
+
+For `stream` : `wave`, the following parameters are available:
+
+* `Mean zero-crossing wave period`
+* `Significant height of combined wind waves and swell`
+* `Mean wave direction`
+* `Peak wave period`
+* `Mean wave period`
+
+Only Extremes-DT data from the past 15 days can be accessed by users.
+
+
+## Climate DT Data
+
+The following key value pairs are available via Polytope:
+
+* `class` : `d1`
+* `dataset` : `climate-dt`
+* `activity` : `ScenarioMIP` `story-nudging` `CMIP6`
+* `model` : `IFS-NEMO`
+* `generation` : `1`
+* `realization` : `1`
+* `resolution` : `standard` `high`
+* `time` : `0000/to/2300`
+* `stream` : `clte`
+* `type` : `fc`
+* `levtype` : `sfc` `pl` `o2d`
+* `expver` : `0001`
+* `domain` : `g`
+
+If `levtype` is `pl`, a `levelist` must be provided:
+
+* `levelist` : `1/to/1000`
+
+`pl` is currently being scanned and new parameters will become available as time passes. This is also the case for `o2d`.
+
+For `sfc`, most `params` are available.
+
+Currently, only data for `dates` between `2020` and `2050` is available.
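To make the portfolio above easier to use, here is a minimal sketch of a request assembled from the keys listed in the Operational Forecast Data section. The `date`, `param`, and point values are illustrative placeholders, keys not visible in the truncated hunk (`type`, `levtype`, `expver`, `domain`, `date`, `time`) are assumed to follow the same pattern as the DT sections, and the `feature` block follows the timeseries feature documented in docs/Service/Features/timeseries.md (also updated in this commit). This is a sketch, not a definitive template.

```python
# Sketch of an operational forecast request; placeholder values are marked below.
request = {
    "class": "od",
    "stream": "oper",
    "type": "fc",
    "levtype": "sfc",
    "expver": "0001",
    "domain": "g",
    "date": "20240101",  # placeholder: must lie within the ~2 days of forecasts held in the FDB
    "time": "0000",
    "param": "167",      # placeholder: 2 metre temperature
    "feature": {
        "type": "timeseries",
        "points": [[-9.10, 38.78]],
        "time_axis": "step",
        "range": {"start": 0, "end": 360},
    },
    "format": "covjson",
}
```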
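Similarly, a sketch for the Extremes DT Data section: only the keys listed in that section are used, while `date`, `param`, and the `feature` block are assumptions and placeholders rather than documented defaults.

```python
# Sketch of an Extremes DT request; placeholder values are marked below.
request = {
    "class": "d1",
    "dataset": "extremes-dt",
    "stream": "oper",
    "type": "fc",
    "levtype": "sfc",
    "expver": "0001",
    "domain": "g",
    "date": "20240101",  # placeholder: must fall within the last 15 days
    "time": "0000",
    "param": "167",      # placeholder: 2 metre temperature
    "feature": {
        "type": "timeseries",
        "points": [[-9.10, 38.78]],
        "time_axis": "step",
        "range": {"start": 0, "end": 96},  # steps 0 to 96, per the list above
    },
    "format": "covjson",
}
```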
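And a sketch for the Climate DT Data section. Again, only the keys listed in that section are used; `date`, `param`, and the `feature` block are placeholders, and a real request may need further keys not covered by this page. For climate data the timeseries is generated across `datetime` rather than `step` (see docs/Service/Features/timeseries.md).

```python
# Sketch of a Climate DT request; placeholder values are marked below.
request = {
    "class": "d1",
    "dataset": "climate-dt",
    "activity": "ScenarioMIP",
    "model": "IFS-NEMO",
    "generation": "1",
    "realization": "1",
    "resolution": "standard",
    "stream": "clte",
    "type": "fc",
    "levtype": "sfc",
    "expver": "0001",
    "domain": "g",
    "date": "20300101/to/20300131",  # placeholder: dates must fall between 2020 and 2050
    "time": "0000",
    "param": "167",                  # placeholder: 2 metre temperature
    "feature": {
        "type": "timeseries",
        "points": [[-9.10, 38.78]],
        "time_axis": "datetime",     # climate data: timeseries across datetime
    },
    "format": "covjson",
}
```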

docs/Service/Examples/examples.md

+5-1
@@ -5,4 +5,8 @@
 * <a href="../vertical_profile_example">Vertical Profile</a>
 * <a href="../boundingbox_example">Bounding Box</a>
 * <a href="../trajectory_example">Trajectory</a>
-* <a href="../country_example">Country Cut-Out</a>
+* <a href="../country_example">Country Cut-Out</a>
+
+For examples of Polytope Feature Extraction on Destination Earth Digital Twin data, please visit the following GitHub repo: https://github.com/destination-earth-digital-twins/polytope-examples
+
+It contains examples for both the Climate DT and the Extremes DT.

docs/Service/Examples/timeseries_example.ipynb

+65-8
Large diffs are not rendered by default.

docs/Service/Examples/vertical_profile_example.ipynb

+25-1
@@ -73,6 +73,30 @@
     "chart.fig.update_layout(yaxis2={\"title\": \"hPa\"})\n",
     "chart.show(renderer=\"png\") # Replace with chart.show() in an interactive session!"
    ]
+  },
+  {
+   "cell_type": "markdown",
+   "metadata": {},
+   "source": [
+    "# Convert to Xarray"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": [
+    "da = ds.to_xarray()\n",
+    "print(da)"
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "metadata": {},
+   "outputs": [],
+   "source": []
   }
  ],
  "metadata": {
@@ -91,7 +115,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.10.6"
+   "version": "3.11.8"
   }
  },
 "nbformat": 4,

docs/Service/Features/timeseries.md

+13-6
@@ -21,11 +21,12 @@ request = {
     "feature" : {
         "type" : "timeseries",
         "points": [[-9.10, 38.78]],
-        "axes": "step",
+        "time_axis": "step",
         "range" : {
             "start" : 0,
             "end" : 360,
-        }
+        },
+        "axes" : ["latitude", "longitude"]
     },
     "format": "covjson",
 }
@@ -42,18 +43,18 @@ For a timeseries within the `feature` dictionary three fields are required
 
 * `type`
 * `points`
-* `axes`
+* `time_axis`
 
 For a timeseries `type` must be `timeseries`.
 
 `points` must be a nested list with each point containing a latitude and a longitude.
 
-`axes` refers to the axis on which to generate the timeseries. In this case the timeseries is generated across `step` based on the inputted `range`. However if the data requested was a climate dataset the `axess` may be `datetime` denoting that the timeseries is generated across that axis.
+`time_axis` refers to the axis on which to generate the timeseries. In this case the timeseries is generated across `step` based on the inputted `range`. However, if the data requested is a climate dataset the `time_axis` may be `datetime`, denoting that the timeseries is generated across that axis.
 
 
 ## Optional Fields
 
-`range` is an optional field within `feature`. It refers to the extent of the `axes` on which the timeseries will be generated. In the above case where:
+`range` is an optional field within `feature`. It refers to the extent of the `time_axis` on which the timeseries will be generated. In the above case where:
 
 ```python
 "axes": "step",
@@ -75,7 +76,7 @@ A timeseries across `step` will start at step `0` and end at step `360` with all
 ```
 In this case every second step will be returned if it exists.
 
-As `range` is an optional field it can be left out, however there is not a default value. Instead the user has to include the timeseries `axes` in the main body of the request like below:
+As `range` is an optional field it can be left out; however, there is no default value. Instead the user has to include the timeseries `time_axis` in the main body of the request like below:
 
 ```python
 request = {
@@ -93,7 +94,7 @@ request = {
     "feature" : {
         "type" : "timeseries",
         "points": [[-9.10, 38.78]],
-        "axes": "step",
+        "time_axis": "step",
     },
     "format": "covjson",
 }
@@ -104,3 +105,9 @@ This is equivalent to the first request presented.
 At least one of `range` or `step` must be included in the request, but not both. In this case an error will be provided telling the user that `step` is overspecified.
 
 Conversely at least one of `range` or `step` must be included.
+
+`axes` can also be provided, which defines the spatial axes in which the points are given. For example, if the user provides points in the order `longitude`, `latitude`, they can add `axes` : `["longitude", "latitude"]`.
+
+## Note
+
+Previously the `axes` keyword was used for `time_axis`. We still allow this behavior for backwards compatibility with previous requests.
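A short sketch of the spatial `axes` field introduced in the last hunk above: the same point supplied in longitude, latitude order, with `axes` declaring that ordering. Only the `feature` block is shown; the surrounding request keys are unchanged from the examples in this file.

```python
# Same timeseries feature as above, but with the point given as (longitude, latitude)
# and the ordering declared explicitly via the spatial "axes" field.
feature = {
    "type": "timeseries",
    "points": [[38.78, -9.10]],          # longitude first, then latitude
    "time_axis": "step",
    "range": {"start": 0, "end": 360},
    "axes": ["longitude", "latitude"],   # declares the ordering of each point
}
```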

polytope_feature/datacube/backends/fdb.py

+14-3
@@ -80,10 +80,21 @@ def check_branching_axes(self, request):
                     (upper, lower, idx) = polytope.extents(ax)
                     if "sfc" in polytope.points[idx]:
                         self.fdb_coordinates.pop("levelist", None)
+
+                if ax == "param":
+                    (upper, lower, idx) = polytope.extents(ax)
+                    if "140251" not in polytope.points[idx]:
+                        self.fdb_coordinates.pop("direction", None)
+                        self.fdb_coordinates.pop("frequency", None)
+                    else:
+                        # special param with direction and frequency
+                        if len(polytope.points[idx]) > 1:
+                            raise ValueError(
+                                "Param 251 is part of a special branching of the datacube. Please request it separately."  # noqa: E501
+                            )
         self.fdb_coordinates.pop("quantile", None)
-        # TODO: When do these not appear??
-        self.fdb_coordinates.pop("direction", None)
-        self.fdb_coordinates.pop("frequency", None)
+        self.fdb_coordinates.pop("year", None)
+        self.fdb_coordinates.pop("month", None)
 
         # NOTE: verify that we also remove the axis object for axes we've removed here
         axes_to_remove = set(self.complete_axes) - set(self.fdb_coordinates.keys())
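For readers of the datacube change above, the rule the new branch enforces can be restated standalone. The helper below is an illustrative sketch only, not part of the `polytope_feature` API; the function and argument names are assumptions that simply mirror the logic of `check_branching_axes` for the `param` axis.

```python
def prune_wave_spectra_coordinates(requested_params, fdb_coordinates):
    """Illustrative restatement of the param-140251 branching rule (not library code).

    Param 140251 (2D wave spectra) carries extra direction/frequency axes, so it
    must be requested on its own; for any other params those coordinates are dropped.
    """
    if "140251" in requested_params:
        if len(requested_params) > 1:
            # Mirrors the ValueError raised in check_branching_axes above.
            raise ValueError(
                "Param 140251 is part of a special branching of the datacube. "
                "Please request it separately."
            )
    else:
        fdb_coordinates.pop("direction", None)
        fdb_coordinates.pop("frequency", None)
    return fdb_coordinates
```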

polytope_feature/datacube/transformations/datacube_mappers/datacube_mappers.py

+1
@@ -137,4 +137,5 @@ def unmap_tree_node(self, node, unwanted_path):
     "reduced_ll": "ReducedLatLonMapper",
     "local_regular": "LocalRegularGridMapper",
     "healpix_nested": "NestedHealpixGridMapper",
+    "reduced_gaussian": "ReducedGaussianGridMapper",
 }
