add polars LazyFrame generic type, element-wise checks, add docs #1521

Merged: 10 commits, Mar 11, 2024
4 changes: 2 additions & 2 deletions Makefile
@@ -21,10 +21,10 @@ requirements:
pip install -r requirements-dev.txt

docs-clean:
-	rm -rf docs/**/generated docs/**/methods docs/_build docs/source/_contents
+	rm -rf docs/source/reference/generated docs/**/generated docs/**/methods docs/_build docs/source/_contents

docs: docs-clean
-	python -m sphinx -E "docs/source" "docs/_build" && make -C docs doctest
+	python -m sphinx -W -E "docs/source" "docs/_build" && make -C docs doctest

quick-docs:
python -m sphinx -E "docs/source" "docs/_build" -W && \
7 changes: 5 additions & 2 deletions README.md
@@ -42,8 +42,9 @@ This is useful in production-critical or reproducible research settings. With

1. Define a schema once and use it to validate
[different dataframe types](https://pandera.readthedocs.io/en/stable/supported_libraries.html)
-including [pandas](http://pandas.pydata.org), [dask](https://dask.org),
-[modin](https://modin.readthedocs.io/), and [pyspark](https://spark.apache.org/docs/3.2.0/api/python/user_guide/pandas_on_spark/index.html).
+including [pandas](http://pandas.pydata.org), [polars](https://docs.pola.rs/),
+[dask](https://dask.org), [modin](https://modin.readthedocs.io/),
+and [pyspark](https://spark.apache.org/docs/3.2.0/api/python/user_guide/pandas_on_spark/index.html).
1. [Check](https://pandera.readthedocs.io/en/stable/checks.html) the types and
properties of columns in a `DataFrame` or values in a `Series`.
1. Perform more complex statistical validation like
@@ -100,6 +101,7 @@
pip install pandera[modin] # validate modin dataframes
pip install pandera[modin-ray] # validate modin dataframes with ray
pip install pandera[modin-dask] # validate modin dataframes with dask
pip install pandera[geopandas] # validate geopandas geodataframes
+pip install pandera[polars] # validate polars dataframes
```

</details>
@@ -120,6 +122,7 @@
conda install -c conda-forge pandera-modin # validate modin dataframes
conda install -c conda-forge pandera-modin-ray # validate modin dataframes with ray
conda install -c conda-forge pandera-modin-dask # validate modin dataframes with dask
conda install -c conda-forge pandera-geopandas # validate geopandas geodataframes
+conda install -c conda-forge pandera-polars # validate polars dataframes
```

</details>
2 changes: 1 addition & 1 deletion docs/source/conf.py
@@ -132,7 +132,7 @@
# documentation.

announcement = """
-📢 Pandera 0.16.0 now supports <a href="pyspark_sql.html">Pyspark SQL</a> 🎉.
+📢 Pandera 0.19.0 now supports <a href="polars.html">Polars</a> 🎉.
If you like this project, <a href='https://github.com/unionai-oss/pandera' target='_blank'>give us a star ⭐️! </a>
"""

8 changes: 4 additions & 4 deletions docs/source/dataframe_models.rst
@@ -96,7 +96,7 @@ As you can see in the examples above, you can define a schema by sub-classing
The :func:`~pandera.decorators.check_types` decorator is required to perform validation of the dataframe at
run-time.

-Note that :class:`~pandera.api.pandas.model_components.Field` s apply to both
+Note that :class:`~pandera.api.dataframe.model_components.Field` s apply to both
:class:`~pandera.api.pandas.components.Column` and :class:`~pandera.api.pandas.components.Index`
objects, exposing the built-in :class:`Check` s via key-word arguments.

@@ -716,7 +716,7 @@ Column/Index checks
* Similarly to ``pydantic``, :func:`classmethod` decorator is added behind the scenes
if omitted.
* You still may need to add the ``@classmethod`` decorator *after* the
-  :func:`~pandera.api.pandas.model_components.check` decorator if your static-type checker or
+  :func:`~pandera.api.dataframe.model_components.check` decorator if your static-type checker or
linter complains.
* Since ``checks`` are class methods, the first argument value they receive is a
DataFrameModel subclass, not an instance of a model.
@@ -839,7 +839,7 @@ Aliases
-------

:class:`~pandera.api.pandas.model.DataFrameModel` supports columns which are not valid python variable names via the argument
-`alias` of :class:`~pandera.api.pandas.model_components.Field`.
+`alias` of :class:`~pandera.api.dataframe.model_components.Field`.

Checks must reference the aliased names.

@@ -887,7 +887,7 @@ the class scope, and it will respect the alias.
.. note::

To access a variable from the class scope, you need to make it a class attribute,
-   and therefore assign it a default :class:`~pandera.api.pandas.model_components.Field`.
+   and therefore assign it a default :class:`~pandera.api.dataframe.model_components.Field`.

.. testcode:: dataframe_schema_model

2 changes: 1 addition & 1 deletion docs/source/dtype_validation.rst
@@ -123,7 +123,7 @@ express this same type with the class-based API, we need to use an
dt: Series[Annotated[pd.DatetimeTZDtype, "ns", "UTC"]]

Or alternatively, you can pass in the ``dtype_kwargs`` into
-:py:func:`~pandera.api.pandas.model_components.Field`:
+:py:func:`~pandera.api.dataframe.model_components.Field`:

.. testcode:: dtype_validation

4 changes: 3 additions & 1 deletion docs/source/index.rst
@@ -78,7 +78,7 @@ This is useful in production-critical data pipelines or reproducible research
settings. With ``pandera``, you can:

#. Define a schema once and use it to validate :ref:`different dataframe types <supported-dataframe-libraries>`
-including `pandas <http://pandas.pydata.org>`_, `dask <https://dask.org/>`_,
+including `pandas <http://pandas.pydata.org>`_, `polars <https://docs.pola.rs/>`_, `dask <https://dask.org/>`_,
`modin <https://modin.readthedocs.io/>`_, and
`pyspark.pandas <https://spark.apache.org/docs/3.2.0/api/python/user_guide/pandas_on_spark/index.html>`_.
#. :ref:`Check<checks>` the types and properties of columns in a
@@ -137,6 +137,7 @@ Installing additional functionality:
pip install pandera[modin-ray] # validate modin dataframes with ray
pip install pandera[modin-dask] # validate modin dataframes with dask
pip install pandera[geopandas] # validate geopandas geodataframes
+pip install pandera[polars] # validate polars dataframes

.. tabbed:: conda

@@ -153,6 +154,7 @@
conda install -c conda-forge pandera-modin-ray # validate modin dataframes with ray
conda install -c conda-forge pandera-modin-dask # validate modin dataframes with dask
conda install -c conda-forge pandera-geopandas # validate geopandas geodataframes
+conda install -c conda-forge pandera-polars # validate polars dataframes

Quick Start
-----------
11 changes: 0 additions & 11 deletions docs/source/koalas.rst

This file was deleted.
