pre-commit: add black, isort, pyupgrade, autoflake, flake8, and misc hooks #104

Merged: 4 commits, Apr 5, 2024
12 changes: 12 additions & 0 deletions .flake8
@@ -0,0 +1,12 @@
# flake8 is used to lint Python code, and is set up to run automatically
# via pre-commit.
#
# ref: https://flake8.pycqa.org/en/latest/user/configuration.html
#

[flake8]
# E: style errors
# W: style warnings
# C: complexity
# D: docstring warnings (unused pydocstyle extension)
ignore = E, C, W, D
71 changes: 65 additions & 6 deletions .pre-commit-config.yaml
@@ -1,7 +1,66 @@
# pre-commit is a tool to perform a predefined set of tasks manually and/or
# automatically before git commits are made.
#
# Config reference: https://pre-commit.com/#pre-commit-configyaml---top-level
#
# Common tasks
#
# - Run on all files: pre-commit run --all-files
# - Register git hooks: pre-commit install --install-hooks
#
repos:
- repo: https://github.com/google/go-jsonnet
rev: v0.20.0
hooks:
- id: jsonnet-format
# jsonnet-lint hook doesn't work
# - id: jsonnet-lint

# Autoformat: Python code, syntax patterns are modernized
- repo: https://github.com/asottile/pyupgrade
rev: v3.15.2
hooks:
- id: pyupgrade
args:
- --py38-plus
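For context (not part of the diff), pyupgrade with --py38-plus rewrites older Python idioms into their modern equivalents. A small illustrative sketch of the kind of before/after rewrites involved; the variable names here are made up for the example:

```python
# The "before" forms pyupgrade would rewrite are shown in comments;
# the "after" forms are live code.

values = ["a", "b", "a"]

# before: unique = set(v for v in values)
unique = {v for v in values}  # generator inside set() -> set comprehension

# before: msg = "saw {} items".format(len(unique))
msg = f"saw {len(unique)} items"  # str.format -> f-string
```

This matches the changes visible in the deploy.py diff below, where `sorted(set(m[label] for m in metrics))` becomes `sorted({m[label] for m in metrics})`.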

# Autoformat: Python code
- repo: https://github.com/PyCQA/autoflake
rev: v2.3.1
hooks:
- id: autoflake
# args ref: https://github.com/PyCQA/autoflake#advanced-usage
args:
- --in-place

# Autoformat: Python code
- repo: https://github.com/pycqa/isort
rev: 5.13.2
hooks:
- id: isort

# Autoformat: Python code
- repo: https://github.com/psf/black
rev: 24.3.0
hooks:
- id: black

# Misc...
- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.5.0
# ref: https://github.com/pre-commit/pre-commit-hooks#hooks-available
hooks:
- id: end-of-file-fixer
- id: requirements-txt-fixer
- id: check-case-conflict
- id: check-executables-have-shebangs

# Lint: Python code
- repo: https://github.com/PyCQA/flake8
rev: "7.0.0"
hooks:
- id: flake8

# pre-commit.ci config reference: https://pre-commit.ci/#configuration
ci:
autoupdate_schedule: monthly
1 change: 1 addition & 0 deletions dashboards/cluster.jsonnet
@@ -1,3 +1,4 @@
#!/usr/bin/env -S jsonnet -J ../vendor
// Deploys a dashboard showing cluster-wide information
local grafana = import '../vendor/grafonnet/grafana.libsonnet';
local dashboard = grafana.dashboard;
2 changes: 1 addition & 1 deletion dashboards/jupyterhub.jsonnet
@@ -1,4 +1,4 @@
#!/usr/bin/env jsonnet -J ../vendor
#!/usr/bin/env -S jsonnet -J ../vendor
Member Author commented:

I had to use -S so that env splits the rest of the shebang line into separate arguments, rather than parsing "jsonnet -J ../vendor" as a single command name with spaces in it.

// Deploys one dashboard - "JupyterHub dashboard",
// with useful stats about usage & diagnostics.
local grafana = import 'grafonnet/grafana.libsonnet';
Empty file modified: dashboards/jupyterhub.libsonnet (100755 → 100644)
1 change: 1 addition & 0 deletions dashboards/support.jsonnet
100644 → 100755
@@ -1,3 +1,4 @@
#!/usr/bin/env -S jsonnet -J ../vendor
Member Author commented:

I had to ensure shebangs were added to executable files.

I removed the executable flag from the libsonnet file, which renders blank anyhow.

// Deploys a dashboard showing information about support resources
local grafana = import '../vendor/grafonnet/grafana.libsonnet';
local dashboard = grafana.dashboard;
1 change: 1 addition & 0 deletions dashboards/usage-report.jsonnet
100644 → 100755
@@ -1,3 +1,4 @@
#!/usr/bin/env -S jsonnet -J ../vendor
local grafana = import 'grafonnet/grafana.libsonnet';
local dashboard = grafana.dashboard;
local prometheus = grafana.prometheus;
2 changes: 1 addition & 1 deletion dashboards/user.jsonnet
@@ -1,4 +1,4 @@
#!/usr/bin/env jsonnet -J ../vendor
#!/usr/bin/env -S jsonnet -J ../vendor
local grafana = import 'grafonnet/grafana.libsonnet';
local dashboard = grafana.dashboard;
local singlestat = grafana.singlestat;
93 changes: 57 additions & 36 deletions deploy.py
@@ -1,26 +1,23 @@
#!/usr/bin/env python3
import json
import argparse
import json
import os
from glob import glob
from functools import partial
import subprocess
from urllib.request import urlopen, Request
from urllib.parse import urlencode
from urllib.error import HTTPError
from copy import deepcopy
import re
import ssl
import subprocess
from copy import deepcopy
from functools import partial
from glob import glob
from urllib.error import HTTPError
from urllib.parse import urlencode
from urllib.request import Request, urlopen

# UID for the folder under which our dashboards will be set up
DEFAULT_FOLDER_UID = '70E5EE84-1217-4021-A89E-1E3DE0566D93'


def grafana_request(endpoint, token, path, data=None, no_tls_verify=False):
headers = {
'Authorization': f'Bearer {token}',
'Content-Type': 'application/json'
}
headers = {'Authorization': f'Bearer {token}', 'Content-Type': 'application/json'}
method = 'GET' if data is None else 'POST'
req = Request(f'{endpoint}/api{path}', headers=headers, method=method)

@@ -44,10 +41,7 @@ def ensure_folder(name, uid, api):
except HTTPError as e:
if e.code == 404:
# We got a 404, so the folder doesn't exist yet; create it
folder = {
'uid': uid,
'title': name
}
folder = {'uid': uid, 'title': name}
return api('/folders', folder)
else:
raise
@@ -60,12 +54,18 @@ def build_dashboard(dashboard_path, api, global_dash=False):

# We pass the list of all datasources because the global dashboards
# use this information to show info about all datasources in the same panel
return json.loads(subprocess.check_output(
[
"jsonnet", "-J", "vendor", dashboard_path,
"--tla-code", f"datasources={datasources_names}"
]
).decode())
return json.loads(
subprocess.check_output(
[
"jsonnet",
"-J",
"vendor",
dashboard_path,
"--tla-code",
f"datasources={datasources_names}",
]
).decode()
)


def layout_dashboard(dashboard):
@@ -108,11 +108,7 @@ def deploy_dashboard(dashboard_path, folder_uid, api, global_dash=False):
db = layout_dashboard(db)
db = populate_template_variables(api, db)

data = {
'dashboard': db,
'folderId': folder_uid,
'overwrite': True
}
data = {'dashboard': db, 'folderId': folder_uid, 'overwrite': True}
api('/dashboards/db', data)


@@ -125,15 +121,17 @@ def get_label_values(api, ds_id, template_query):
in a dashboard
"""
# re.DOTALL allows the query to be multi-line
match = re.match(r'label_values\((?P<query>.*),\s*(?P<label>.*)\)', template_query, re.DOTALL)
match = re.match(
r'label_values\((?P<query>.*),\s*(?P<label>.*)\)', template_query, re.DOTALL
)
query = match.group('query')
label = match.group('label')
query = {'match[]': query}
# Send a request to the backing prometheus datastore
proxy_url = f'/datasources/proxy/{ds_id}/api/v1/series?{urlencode(query)}'

metrics = api(proxy_url)['data']
return sorted(set(m[label] for m in metrics))
return sorted({m[label] for m in metrics})
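The re.DOTALL match above pulls the inner query and the label name out of a Grafana label_values(...) template expression. A standalone check of that same pattern, using a made-up example query:

```python
import re

# Same pattern as in get_label_values: capture the query and the label
# from a template expression like label_values(<query>, <label>).
pattern = r'label_values\((?P<query>.*),\s*(?P<label>.*)\)'

template_query = 'label_values(kube_pod_status_phase{phase="Running"}, namespace)'
match = re.match(pattern, template_query, re.DOTALL)

query = match.group('query')  # kube_pod_status_phase{phase="Running"}
label = match.group('label')  # namespace
```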


def populate_template_variables(api, db):
@@ -151,7 +149,9 @@ def populate_template_variables(api, db):
for var in db.get('templating', {}).get('list', []):
datasources = api("/datasources")
if var["type"] == "datasource":
var["options"] = [{"text": ds["name"], "value": ds["name"]} for ds in datasources]
var["options"] = [
{"text": ds["name"], "value": ds["name"]} for ds in datasources
]

# default selection: first datasource in list
if datasources and not var.get("current"):
@@ -188,17 +188,38 @@ def populate_template_variables(api, db):
def main():
parser = argparse.ArgumentParser()
parser.add_argument('grafana_url', help='Grafana endpoint to deploy dashboards to')
parser.add_argument('--dashboards-dir', default="dashboards", help='Directory of jsonnet dashboards to deploy')
parser.add_argument('--folder-name', default='JupyterHub Default Dashboards', help='Name of Folder to deploy to')
parser.add_argument('--folder-uid', default=DEFAULT_FOLDER_UID, help='UID of grafana folder to deploy to')
parser.add_argument('--no-tls-verify', action='store_true', default=False,
help='Whether or not to skip TLS certificate validation')
parser.add_argument(
'--dashboards-dir',
default="dashboards",
help='Directory of jsonnet dashboards to deploy',
)
parser.add_argument(
'--folder-name',
default='JupyterHub Default Dashboards',
help='Name of Folder to deploy to',
)
parser.add_argument(
'--folder-uid',
default=DEFAULT_FOLDER_UID,
help='UID of grafana folder to deploy to',
)
parser.add_argument(
'--no-tls-verify',
action='store_true',
default=False,
help='Whether or not to skip TLS certificate validation',
)

args = parser.parse_args()

grafana_token = os.environ['GRAFANA_TOKEN']

api = partial(grafana_request, args.grafana_url, grafana_token, no_tls_verify=args.no_tls_verify)
api = partial(
grafana_request,
args.grafana_url,
grafana_token,
no_tls_verify=args.no_tls_verify,
)
folder = ensure_folder(args.folder_name, args.folder_uid, api)

for dashboard in glob(f'{args.dashboards_dir}/*.jsonnet'):
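The functools.partial call in main() pre-binds the Grafana endpoint, token, and TLS flag, so later calls only pass the API path (and an optional payload). A minimal sketch of that binding pattern, with a stand-in function instead of the real grafana_request and a hypothetical endpoint and token:

```python
from functools import partial


def request(endpoint, token, path, data=None, no_tls_verify=False):
    # Stand-in for grafana_request: just report what would be called.
    method = 'GET' if data is None else 'POST'
    return f'{method} {endpoint}/api{path}'


# Bind everything except the path, mirroring how deploy.py builds `api`.
api = partial(request, 'https://grafana.example.org', 's3cr3t',
              no_tls_verify=True)

api('/folders')                       # GET request to /api/folders
api('/dashboards/db', data={'x': 1})  # POST request to /api/dashboards/db
```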
2 changes: 1 addition & 1 deletion docs/requirements.txt
@@ -1,3 +1,3 @@
myst-parser[linkify]
sphinx-book-theme
sphinx-autobuild
sphinx-book-theme
2 changes: 1 addition & 1 deletion global-dashboards/README.md
@@ -1,3 +1,3 @@
# Dashboards across all datasources

Contains "global" dashboards with useful stats computed across all datasources.
Contains "global" dashboards with useful stats computed across all datasources.
1 change: 1 addition & 0 deletions noxfile.py
@@ -9,6 +9,7 @@
- Install nox: pip install nox
- Start a live reloading docs server: nox -s docs -- live
"""

import nox

nox.options.reuse_existing_virtualenvs = True
33 changes: 33 additions & 0 deletions pyproject.toml
@@ -0,0 +1,33 @@
# autoflake is used for autoformatting Python code
#
# ref: https://github.com/PyCQA/autoflake#readme
#
[tool.autoflake]
ignore-init-module-imports = true
remove-all-unused-imports = true
remove-duplicate-keys = true
remove-unused-variables = true


# black is used for autoformatting Python code
#
# ref: https://black.readthedocs.io/en/stable/
#
[tool.black]
skip-string-normalization = true
# target-version should be all supported versions, see
# https://github.com/psf/black/issues/751#issuecomment-473066811
target_version = [
"py38",
"py39",
"py310",
"py311",
]


# isort is used for autoformatting Python code
#
# ref: https://pycqa.github.io/isort/
#
[tool.isort]
profile = "black"