
Commit b421d61

Merge pull request #655 from jbouffard/0.4.0-release
0.4.0 Release
2 parents: 1c87c11 + e11a709

File tree

8 files changed (+146, -10 lines)


.travis.yml (1 addition, 1 deletion)

@@ -65,7 +65,7 @@ script:
 
 before_deploy:
   - export GEOPYSPARK_VERSION_SUFFIX="-${TRAVIS_COMMIT:0:7}"
-  - aws s3 rm s3://geopyspark-dependency-jars/geotrellis-backend-assembly-0.3.0.jar
+  - aws s3 rm s3://geopyspark-dependency-jars/geotrellis-backend-assembly-0.4.0.jar
 
 deploy:
   - provider: script

Makefile (2 additions, 2 deletions)

@@ -6,11 +6,11 @@ rwildcard=$(foreach d,$(wildcard $1*),$(call rwildcard,$d/,$2) $(filter $(subst
 
 JAR-PATH := geopyspark/jars
 
-ASSEMBLYNAME := geotrellis-backend-assembly-0.3.0.jar
+ASSEMBLYNAME := geotrellis-backend-assembly-0.4.0.jar
 BUILD-ASSEMBLY := geopyspark-backend/geotrellis/target/scala-2.11/${ASSEMBLYNAME}
 DIST-ASSEMBLY := ${JAR-PATH}/${ASSEMBLYNAME}
 
-WHEELNAME := geopyspark-0.3.0-py3-none-any.whl
+WHEELNAME := geopyspark-0.4.0-py3-none-any.whl
 WHEEL := dist/${WHEELNAME}
 
 SCALA_SRC := $(call rwildcard, geopyspark-backend/geotrellis/src/, *.scala)

docs/CHANGELOG.rst (136 additions, 0 deletions)

The following 0.4.0 section was added at the top of the changelog:
0.4.0
------

New Features
^^^^^^^^^^^^

Rasterizing an RDD[Geometry]
*****************************

Users can now rasterize an ``RDD[shapely.geometry]`` via the
``rasterize`` method.

.. code:: python3

   # A Python RDD that contains shapely geometries
   geometry_rdd = ...

   gps.rasterize(geoms=geometry_rdd, crs="EPSG:3857", zoom=11, fill_value=1)

ZFactor Calculator
*******************

``zfactor_lat_lng_calculator`` and ``zfactor_calculator`` are two
new functions that calculate the ``zfactor`` for each
``Tile`` in a layer during the ``slope`` or ``hillshade`` operations.
This is better than using a single ``zfactor`` for all ``Tile``\s, as
``Tile``\s at different latitudes require different ``zfactor``\s.

As mentioned above, there are two forms of the calculator:
``zfactor_lat_lng_calculator`` and ``zfactor_calculator``. The former
is used for layers in the LatLng projection, while the
latter is for layers in all other projections.

.. code:: python3

   # Using the zfactor_lat_lng_calculator

   # Create a zfactor_lat_lng_calculator which uses METERS for its calculations
   calculator = gps.zfactor_lat_lng_calculator(gps.METERS)

   # A TiledRasterLayer which contains elevation data
   tiled_layer = ...

   # Calculate the slope of the layer using the calculator
   tiled_layer.slope(calculator)

   # Using the zfactor_calculator

   # We must provide a dict that maps latitude to zfactor for our
   # given projection. Linear interpolation will be used on these
   # values to produce the correct zfactor for each Tile in the
   # layer.

   mapped_factors = {
       0.0: 0.1,
       10.0: 1.5,
       15.0: 2.0,
       20.0: 2.5
   }

   # Create a zfactor_calculator using the given mapped factors
   calculator = gps.zfactor_calculator(mapped_factors)
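The linear-interpolation step described above can be sketched in plain Python. This is an illustrative model only, not GeoPySpark's actual implementation; the helper name ``interpolate_zfactor`` is hypothetical:

```python
# Illustrative sketch: models how a {latitude: zfactor} mapping could be
# linearly interpolated. Not GeoPySpark's actual code; the function name
# `interpolate_zfactor` is hypothetical.

def interpolate_zfactor(mapped_factors, latitude):
    """Linearly interpolate a zfactor for the given latitude,
    clamping to the endpoints outside the known range."""
    points = sorted(mapped_factors.items())
    # Clamp below/above the known latitudes
    if latitude <= points[0][0]:
        return points[0][1]
    if latitude >= points[-1][0]:
        return points[-1][1]
    # Find the bracketing pair and interpolate between them
    for (lat0, z0), (lat1, z1) in zip(points, points[1:]):
        if lat0 <= latitude <= lat1:
            t = (latitude - lat0) / (lat1 - lat0)
            return z0 + t * (z1 - z0)

mapped_factors = {0.0: 0.1, 10.0: 1.5, 15.0: 2.0, 20.0: 2.5}
print(interpolate_zfactor(mapped_factors, 12.5))  # -> 1.75, midway between 1.5 and 2.0
```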

PartitionStrategies
*******************

With this release of GeoPySpark come three different partition
strategies: ``HashPartitionStrategy``, ``SpatialPartitionStrategy``,
and ``SpaceTimePartitionStrategy``. All three of these are used
to partition a layer given their specified inputs.

HashPartitionStrategy
######################

``HashPartitionStrategy`` is a partition strategy that uses
Spark's ``HashPartitioner`` to partition a layer. This can
be used on either ``SPATIAL`` or ``SPACETIME`` layers.

.. code:: python3

   # Creates a HashPartitionStrategy with 128 partitions
   gps.HashPartitionStrategy(num_partitions=128)

SpatialPartitionStrategy
#########################

``SpatialPartitionStrategy`` uses GeoPySpark's ``SpatialPartitioner``
when partitioning a layer. This strategy will try to
partition the ``Tile``\s of a layer so that those which are near each
other spatially will be in the same partition. This will
only work on ``SPATIAL`` layers.

.. code:: python3

   # Creates a SpatialPartitionStrategy with 128 partitions
   gps.SpatialPartitionStrategy(num_partitions=128)
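The changelog does not show ``SpatialPartitioner``'s internals; the grouping idea can be sketched in plain Python with a Z-order (Morton) curve, which tends to place spatially nearby tile keys in the same partition. All names below are hypothetical illustrations, not GeoPySpark APIs:

```python
# Toy sketch of spatial partitioning (not GeoPySpark's actual
# SpatialPartitioner): nearby (col, row) tile keys are mapped to the
# same partition by ordering them along a Z-order (Morton) curve.

def z_order(col, row, bits=16):
    """Interleave the bits of col and row into a single Morton index."""
    z = 0
    for i in range(bits):
        z |= ((col >> i) & 1) << (2 * i)
        z |= ((row >> i) & 1) << (2 * i + 1)
    return z

def assign_partitions(keys, num_partitions):
    """Sort keys along the Z-order curve and split into contiguous
    chunks, so spatially close keys land in the same partition."""
    ordered = sorted(keys, key=lambda k: z_order(*k))
    chunk = max(1, -(-len(ordered) // num_partitions))  # ceiling division
    return {k: i // chunk for i, k in enumerate(ordered)}

keys = [(c, r) for c in range(4) for r in range(4)]
parts = assign_partitions(keys, num_partitions=4)
# Neighboring keys such as (0, 0) and (1, 1) end up in the same partition
```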

SpaceTimePartitionStrategy
###########################

``SpaceTimePartitionStrategy`` uses GeoPySpark's ``SpaceTimePartitioner``
when partitioning a layer. This strategy will try to
partition the ``Tile``\s of a layer so that those which are near each
other spatially and temporally will be in the same partition. This will
only work on ``SPACETIME`` layers.

.. code:: python3

   # Creates a SpaceTimePartitionStrategy with 128 partitions
   # and a temporal resolution of 5 weeks. This means that
   # it will try to group the data in units of 5 weeks.
   gps.SpaceTimePartitionStrategy(time_unit=gps.WEEKS, num_partitions=128, time_resolution=5)
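The temporal grouping described above (units of 5 weeks) can be sketched in plain Python. This is a conceptual illustration only, not GeoPySpark's actual bucketing logic; ``week_bucket`` and the epoch choice are assumptions:

```python
# Toy sketch of the temporal bucketing idea behind
# SpaceTimePartitionStrategy (not GeoPySpark's actual code): timestamps
# are grouped into buckets of `resolution` time units, here weeks.

from datetime import datetime, timedelta

def week_bucket(instant, epoch, resolution=5):
    """Index of the `resolution`-week bucket that `instant` falls into,
    counted from `epoch`."""
    weeks = (instant - epoch) // timedelta(weeks=1)
    return weeks // resolution

epoch = datetime(2017, 1, 1)
a = week_bucket(datetime(2017, 1, 20), epoch)  # week 2 -> bucket 0
b = week_bucket(datetime(2017, 2, 20), epoch)  # week 7 -> bucket 1
```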

Other New Features
*******************

- `tobler method for TiledRasterLayer <https://github.com/locationtech-labs/geopyspark/pull/567>`__
- `slope method for TiledRasterLayer <https://github.com/locationtech-labs/geopyspark/pull/595>`__
- `local_max method for TiledRasterLayer <https://github.com/locationtech-labs/geopyspark/pull/602>`__
- `mask layers by RDD[Geometry] <https://github.com/locationtech-labs/geopyspark/pull/629>`__
- `with_no_data method for RasterLayer and TiledRasterLayer <https://github.com/locationtech-labs/geopyspark/pull/631>`__
- ``partitionBy`` method for ``RasterLayer`` and ``TiledRasterLayer``
- ``get_partition_strategy`` method for ``CachableLayer``
Bug Fixes
^^^^^^^^^

- `TiledRasterLayer reproject bug fix <https://github.com/locationtech-labs/geopyspark/pull/581>`__
- `TMS display fix <https://github.com/locationtech-labs/geopyspark/pull/589>`__
- `CellType representation and conversion fixes <https://github.com/locationtech-labs/geopyspark/pull/606>`__
- `get_point_values will now return the correct number of results for temporal layers <https://github.com/locationtech-labs/geopyspark/pull/620>`__
- `Reading layers and values from Accumulo fix <https://github.com/locationtech-labs/geopyspark/pull/621>`__
- `time_intervals will now enumerate correctly in catalog.query <https://github.com/locationtech-labs/geopyspark/pull/623>`__
- `TileReader will now read the correct attributes file <https://github.com/locationtech-labs/geopyspark/pull/637>`__

The section is followed by the existing entries, beginning with:

0.3.0
------
docs/conf.py (3 additions, 3 deletions)

@@ -32,7 +32,7 @@
 jar = 'geotrellis-backend-assembly-0.2.2.jar'
 
 if not path.isfile(path.join('geopyspark/jars', jar)):
-    url = 'https://github.com/locationtech-labs/geopyspark/releases/download/v0.3.0/'
+    url = 'https://github.com/locationtech-labs/geopyspark/releases/download/v0.4.0/'
     subprocess.call(['curl', '-L', url+jar, '-o', path.join('geopyspark/jars', jar)])
 
 sys.path.insert(0, path.abspath(os.curdir))
@@ -73,9 +73,9 @@
 # built documents.
 #
 # The short X.Y version.
-version = '0.3.0'
+version = '0.4.0'
 # The full version, including alpha/beta/rc tags.
-release = '0.3.0'
+release = '0.4.0'
 
 # The language for content autogenerated by Sphinx. Refer to documentation
 # for a list of supported languages.

geopyspark-backend/project/Version.scala (1 addition, 1 deletion)

@@ -1,5 +1,5 @@
 object Version {
-  val geopyspark = "0.3.0"
+  val geopyspark = "0.4.0"
   val geotrellis = "2.0.0-SNAPSHOT"
   val scala = "2.11.11"
   val scalaTest = "2.2.0"

geopyspark/command/configuration.py (1 addition, 1 deletion)

@@ -10,7 +10,7 @@
 from geopyspark.geopyspark_constants import JAR, CWD
 
 
-JAR_URL = 'https://github.com/locationtech-labs/geopyspark/releases/download/v0.3.0/' + JAR
+JAR_URL = 'https://github.com/locationtech-labs/geopyspark/releases/download/v0.4.0/' + JAR
 DEFAULT_JAR_PATH = path.join(CWD, 'jars')
 CONF = path.join(CWD, 'command', 'geopyspark.conf')

geopyspark/geopyspark_constants.py (1 addition, 1 deletion)

@@ -2,7 +2,7 @@
 from os import path
 
 """GeoPySpark version."""
-VERSION = '0.3.0'
+VERSION = '0.4.0'
 
 """Backend jar name."""
 JAR = 'geotrellis-backend-assembly-' + VERSION + '.jar'

setup.py (1 addition, 1 deletion)

@@ -7,7 +7,7 @@
 
 setup_args = dict(
     name='geopyspark',
-    version='0.3.0',
+    version='0.4.0',
     author='Jacob Bouffard, James McClain',
 
     download_url='http://github.com/locationtech-labs/geopyspark',
