Sync staging prod may 1 #681

Merged · 130 commits · May 1, 2025

Commits (130)
b9880a1
[sitemap]: remove diallow all from sitemap
steveoni Oct 23, 2024
d631f49
Merge branch 'dev' of https://github.com/wri/wri-odp into dev
steveoni Nov 12, 2024
fcd12d7
Merge branch 'dev' of https://github.com/wri/wri-odp into dev
steveoni Nov 14, 2024
562a244
Merge branch 'dev' of https://github.com/wri/wri-odp into dev
steveoni Nov 15, 2024
eefd6d9
Merge branch 'dev' of https://github.com/wri/wri-odp into dev
steveoni Nov 21, 2024
f80457c
Add harvest configs and supervisorctl
mpolidori Dec 6, 2024
81484dc
Merge branch 'dev' of https://github.com/wri/wri-odp into dev
steveoni Dec 8, 2024
6faf557
Force rerun deployment
mpolidori Dec 9, 2024
447359a
Merge branch 'dev' of https://github.com/wri/wri-odp into dev
steveoni Dec 11, 2024
4397209
Force rerun deployment
mpolidori Dec 11, 2024
1f98003
Merge branch 'dev' of github.com:wri/wri-odp into add-harvest-configs
mpolidori Dec 11, 2024
fdaf7e3
Merge branch 'dev' of https://github.com/wri/wri-odp into dev
steveoni Dec 13, 2024
d260f43
empty
steveoni Dec 15, 2024
d9b21de
Fix supervisord file permissions
mpolidori Dec 15, 2024
5934805
Add harvester to initial plugins
mpolidori Dec 15, 2024
21108c0
Merge pull request #600 from wri/add-harvest-configs
mpolidori Dec 15, 2024
889a0f0
Trigger CI
luccasmmg Dec 16, 2024
54ceef4
Searchcard (#605)
steveoni Dec 16, 2024
e2603cb
Fix supervisord.conf path
mpolidori Dec 16, 2024
b383498
Merge branch 'dev' of github.com:wri/wri-odp into add-harvest-configs
mpolidori Dec 16, 2024
d1db3c6
Fix supervisord file path in Dockerfile.dev
mpolidori Dec 16, 2024
9f8b1ab
Revert "Merge branch 'dev' of https://github.com/wri/wri-odp into dev…
mpolidori Dec 16, 2024
7100f40
Re apply multiple applications (#609)
mpolidori Dec 16, 2024
d08a34a
Merge branch 'dev' of https://github.com/wri/wri-odp into dev
steveoni Dec 17, 2024
d8c345d
Merge branch 'dev' of github.com:wri/wri-odp into add-harvest-configs
mpolidori Dec 17, 2024
68d46fe
Force rerun deployment
mpolidori Dec 17, 2024
3270279
Add log output after unit tests
mpolidori Dec 17, 2024
046b4e0
Move error logs in GitHub Actions
mpolidori Dec 17, 2024
34b8cf5
Improve group package count api call (#611)
steveoni Dec 17, 2024
ac59f1a
Move log print; Fix timeout
mpolidori Dec 17, 2024
a163520
Merge branch 'dev' of https://github.com/wri/wri-odp into dev
steveoni Dec 18, 2024
a407166
update auto-url field in dataset form
steveoni Dec 18, 2024
5e6408f
Merge pull request #613 from wri/editdataset
luccasmmg Dec 18, 2024
0ac2fb0
ODP-399 (#614)
luccasmmg Dec 18, 2024
e9981cf
scroll to the top on state change (#615)
steveoni Dec 18, 2024
c22b950
Swap back to Azure SSO from Okta SSO
mpolidori Dec 18, 2024
6bb1d57
Merge pull request #616 from wri/azure-returns
luccasmmg Dec 18, 2024
77e3dc2
ODP-397 (#617)
luccasmmg Dec 19, 2024
b0ff795
Fix requesteds by Nesma
luccasmmg Dec 19, 2024
3c42fba
ODP-380 (#620)
luccasmmg Dec 19, 2024
104dd5e
Allow CORS in tests
mpolidori Jan 2, 2025
4cd7590
Remove unnecessary build section from docker-compose.test
mpolidori Jan 2, 2025
7b714f5
Update docs
luccasmmg Jan 6, 2025
9b92be2
update application and others (#624)
steveoni Jan 7, 2025
6cbb712
update images and doc (#623)
steveoni Jan 7, 2025
c1f3d0a
update userdoc search page
steveoni Jan 8, 2025
a2e7dac
Merge pull request #625 from wri/userdocsearch
luccasmmg Jan 8, 2025
3f41113
Applications docs
luccasmmg Jan 9, 2025
86bd3ce
[ODP-400] Homepage search bar improvements (#612)
steveoni Jan 9, 2025
490c5fc
Merge branch 'dev' into sync-dev-staging-jan-13
luccasmmg Jan 13, 2025
9858cf1
update cta button (#627)
steveoni Jan 13, 2025
85e41c9
Fix build
luccasmmg Jan 13, 2025
29d3576
User info (#628)
luccasmmg Jan 15, 2025
3746440
Rm poetry
luccasmmg Jan 15, 2025
da84b72
Merge branch 'dev' of https://github.com/wri/wri-odp into dev
luccasmmg Jan 15, 2025
128e316
QA Fixes
luccasmmg Jan 15, 2025
98e35b8
Fix ortto signup (Only when user accepts)
luccasmmg Jan 15, 2025
7f5cd7a
Fixes onSuccess mutation download batch
luccasmmg Jan 16, 2025
3e87196
Downgrade git
luccasmmg Jan 16, 2025
400243a
QA Fixes
luccasmmg Jan 16, 2025
550a389
Delete test thats no longer relevant
luccasmmg Jan 16, 2025
6938f8c
Fix test cost splitting
luccasmmg Jan 16, 2025
cf18e05
Fix test cost splitting
luccasmmg Jan 17, 2025
3d503bc
Rm console.log
luccasmmg Jan 17, 2025
8e004b1
Change wording for button
luccasmmg Jan 20, 2025
c28c7d5
Marketing + Vulnerability
luccasmmg Jan 20, 2025
dfe4edc
fix issue with weird scrolling behaviour (#629)
steveoni Jan 21, 2025
e993a06
Update to have timestamp
luccasmmg Jan 22, 2025
5b0c7b0
Merge branch 'dev' of https://github.com/wri/wri-odp into dev
luccasmmg Jan 22, 2025
3724221
Fix vulnerability
luccasmmg Jan 22, 2025
a5f6186
[run_unit_tests.sh] Give verbose output
MuhammadIsmailShahzad Jan 22, 2025
883a173
Merge branch 'dev' into sync-dev-staging-jan-22
luccasmmg Jan 22, 2025
a577aab
fix search input alignment
steveoni Jan 23, 2025
ebd5345
update search input fields
steveoni Jan 23, 2025
6eeecc8
Merge pull request #630 from wri/miscfix-odp400
mpolidori Jan 23, 2025
976f664
Kim comments
luccasmmg Jan 27, 2025
794da8d
[main.yml] Temp. stop build logs
MuhammadIsmailShahzad Jan 27, 2025
e94d008
Fix vulnerability
luccasmmg Jan 27, 2025
0e30389
[main.yml] Fix broken build
MuhammadIsmailShahzad Jan 27, 2025
c32c6d0
[main.yml] Fix broken build
MuhammadIsmailShahzad Jan 27, 2025
2ff5d69
[main.yml] Typo fix
MuhammadIsmailShahzad Jan 27, 2025
d56aaf6
[main.yml] Fix image name
MuhammadIsmailShahzad Jan 27, 2025
98333a2
Fix alignment recent added/updated
luccasmmg Jan 27, 2025
92255fd
Fix broken link
luccasmmg Jan 27, 2025
46799cb
ODP-397 Kim request
luccasmmg Jan 27, 2025
4aafeee
update dataset edit url placeholder (#631)
steveoni Jan 27, 2025
8e3d4f8
[docker-compose.test.yml] Temp. use redis 3
MuhammadIsmailShahzad Jan 29, 2025
22ed327
[main.yml] Run build
MuhammadIsmailShahzad Jan 29, 2025
4bdff08
[main.yml] Disable build
MuhammadIsmailShahzad Jan 29, 2025
58156a1
[docker-compose.test.yml] Add redis retry
MuhammadIsmailShahzad Jan 29, 2025
7197ab6
[docker-compose.test.yml] Dont run frontend
MuhammadIsmailShahzad Jan 29, 2025
bfb923c
Merge branch 'dev' into add-harvest-configs
MuhammadIsmailShahzad Jan 29, 2025
7c5a980
[main.yml] Don't run frontend docker
MuhammadIsmailShahzad Jan 29, 2025
6df764b
[docker-compose.test.yml] Remove redis health check
MuhammadIsmailShahzad Jan 29, 2025
8582caf
[docker-compose.test.yml] Remove datapusher temporarily
MuhammadIsmailShahzad Jan 29, 2025
d6623a5
[docker-compose.test.yml] Fix redis container name
MuhammadIsmailShahzad Jan 29, 2025
916d936
[run_unit_tests.sh] Run traceback as long
MuhammadIsmailShahzad Jan 29, 2025
22cdcae
[run_unit_tests.sh] Update test command
MuhammadIsmailShahzad Jan 29, 2025
f0d0ac3
Small UI Fixes
luccasmmg Jan 29, 2025
1fd4416
Fix font size
luccasmmg Jan 29, 2025
c5cce96
Merge branch 'dev' into sync-dev-staging-jan-30
luccasmmg Jan 30, 2025
aeca810
[supervisor.harvest.conf] Temp. run harvester as root
MuhammadIsmailShahzad Jan 31, 2025
ac8fede
[main.yml] Run the build and push the images
MuhammadIsmailShahzad Jan 31, 2025
1888e51
[docker-compose.test.yml] Revert changes
MuhammadIsmailShahzad Jan 31, 2025
587215e
[main.yml] Run tests
MuhammadIsmailShahzad Jan 31, 2025
5edc863
Merge pull request #607 from wri/add-harvest-configs
mpolidori Jan 31, 2025
b51ae00
[supervisor.*.conf] Use env variable to dynamically set user based on…
mpolidori Jan 31, 2025
4da54a2
update filter scrolltop (#633)
steveoni Feb 3, 2025
d1686f9
Kim requests(Download id, new fomratting date)
luccasmmg Feb 3, 2025
a1d3156
Fix fonts
luccasmmg Feb 3, 2025
611d528
Merge branch 'dev' into sync-dev-staging-feb-3
luccasmmg Feb 3, 2025
7fe11c9
Merge pull request #634 from wri/harvest-config-jan-31-2025
mpolidori Feb 3, 2025
77fb2d4
Fix timezone
luccasmmg Feb 3, 2025
6c07711
Merge branch 'dev' of https://github.com/wri/wri-odp into dev
luccasmmg Feb 3, 2025
1728197
Ortto forms
luccasmmg Feb 3, 2025
4c55584
Fix timezone
luccasmmg Feb 3, 2025
b3e28b7
Fix font size
luccasmmg Feb 3, 2025
e6cfdd1
Fix typo
luccasmmg Feb 3, 2025
3db53ab
Trigger CI
luccasmmg Feb 3, 2025
b8c65f2
Merge branch 'dev' into sync-dev-staging-feb-3-2
luccasmmg Feb 3, 2025
68c5423
Fix build
luccasmmg Feb 4, 2025
2cbd839
Merge branch 'dev' into sync-dev-staging-feb-4
luccasmmg Feb 4, 2025
5adf41d
Sync dev staging 02 17 25 (#639)
mpolidori Feb 19, 2025
6356c6b
Sync dev staging 03 03 25 (#644)
mpolidori Mar 3, 2025
bcb11f3
Sync dev staging mar 19 (#653)
luccasmmg Mar 20, 2025
5b29a5e
Sync dev staging mar 26 (#659)
luccasmmg Mar 27, 2025
3a2baac
Sync dev staging apr 14 (#671)
luccasmmg Apr 14, 2025
9fa26b9
Trigger CI
luccasmmg Apr 14, 2025
54dfb1a
Fix ODP-430 on Staging (#677)
luccasmmg Apr 25, 2025
b61f41e
Merge branch 'staging' into sync-staging-prod-may-1
luccasmmg May 1, 2025
Files changed
2 changes: 1 addition & 1 deletion ckan-backend-dev/.env.example
@@ -101,7 +101,7 @@ CKANEXT__S3FILESTORE__HOST_NAME=http://minio:9000

# scheming
CKAN___SCHEMING__DATASET_SCHEMAS=ckanext.wri.schema:ckan_dataset.yaml
CKAN___SCHEMING__ORGANIZATION_SCHEMAS=ckanext.scheming:custom_org_with_address.json
CKAN___SCHEMING__ORGANIZATION_SCHEMAS=ckanext.wri.schema:custom_org_with_address.json
CKAN___SCHEMING__GROUP_SCHEMAS=ckanext.wri.schema:wri_application.json
CKAN___SCHEMING__PRESETS=ckanext.wri.schema:presets.json

8 changes: 8 additions & 0 deletions ckan-backend-dev/ckan/Dockerfile.dev
@@ -137,6 +137,14 @@ COPY setup/supervisord.conf /etc/supervisord.conf
COPY setup/supervisor.worker.conf /etc/supervisord.d/worker.conf
COPY setup/supervisor.harvest.conf /etc/supervisord.d/harvest.conf

RUN SUPERVISOR_USER="root"; \
SUPERVISOR_HOME="/tmp"; \
sed -i "s|SUPERVISOR_USER|${SUPERVISOR_USER}|g" /etc/supervisord.d/harvest.conf && \
sed -i "s|SUPERVISOR_HOME|${SUPERVISOR_HOME}|g" /etc/supervisord.d/harvest.conf && \
sed -i "s|SUPERVISOR_USER|${SUPERVISOR_USER}|g" /etc/supervisord.d/worker.conf && \
sed -i "s|SUPERVISOR_HOME|${SUPERVISOR_HOME}|g" /etc/supervisord.d/worker.conf && \
sed -i "s|SUPERVISOR_HOME|${SUPERVISOR_HOME}|g" /etc/supervisord.conf

RUN chown -R ckan:ckan /etc/supervisord.d
RUN chown -R ckan:ckan /etc/supervisord.conf

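The RUN step above rewrites placeholder tokens in the supervisor configs at build time. A minimal Python sketch of the same substitution, assuming the templates contain literal SUPERVISOR_USER / SUPERVISOR_HOME tokens as the sed calls imply (the template text below is invented for illustration, not the real file contents):

```python
# Illustrative only: mirrors what the sed commands in Dockerfile.dev do.
# The program name and keys in TEMPLATE are made-up examples.
TEMPLATE = """[program:ckan_harvest]
user=SUPERVISOR_USER
directory=SUPERVISOR_HOME
"""

def render_supervisor_conf(template: str, user: str = "root", home: str = "/tmp") -> str:
    """Substitute the SUPERVISOR_USER / SUPERVISOR_HOME placeholders, like the sed calls."""
    return template.replace("SUPERVISOR_USER", user).replace("SUPERVISOR_HOME", home)

print(render_supervisor_conf(TEMPLATE))
```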
@@ -905,4 +905,4 @@ def organization_create(context, data_dict):


result = old_organization_create(context, data_dict)
return result
return result
@@ -6,7 +6,7 @@
import ckanext.s3filestore.uploader as uploader
from ckan.lib.mailer import mail_recipient
from ckan.common import config
from .datapusher_download_zip import fetch_dataset_name
from .datapusher_download_zip import fetch_dataset_name, get_admin_emails_for_dataset

import datetime
import requests
@@ -75,8 +75,21 @@ def download_request(context: Context, data_dict: dict[str, Any]):
"error": "{}",
}

dataset_id = fetch_dataset_name({
"entity_id": res_id,
"entity_type": "resource"
})
admin_email= get_admin_emails_for_dataset(dataset_id)

value = {}
if admin_email:
value["admin_emails"] = admin_email

if email:
task["value"] = json.dumps({"emails": [email]})
value["emails"] = [email]

task["value"] = json.dumps(value)


try:
existing_task = p.toolkit.get_action("task_status_show")(
@@ -105,6 +118,7 @@ def download_request(context: Context, data_dict: dict[str, Any]):

if update_emails:
existing_task_values['emails'] = existing_task_emails
existing_task_values["admin_emails"] = admin_email
existing_task['value'] = json.dumps(existing_task_values)
p.toolkit.get_action("task_status_update")({ "ignore_auth": True }, existing_task)

@@ -182,7 +196,7 @@ def download_request(context: Context, data_dict: dict[str, Any]):
"entity_id": res_id,
"entity_type": "resource"
})
send_error([email], resource_title, dataset_name)
send_error([email]+admin_email, resource_title, dataset_name)
raise p.toolkit.ValidationError(error)

try:
@@ -208,11 +222,11 @@ def download_request(context: Context, data_dict: dict[str, Any]):
"entity_id": res_id,
"entity_type": "resource"
})
send_error([email], resource_title, dataset_name)
send_error([email]+admin_email, resource_title, dataset_name)
raise p.toolkit.ValidationError(error)

value = {"job_id": r.json()["id"]}

value["admin_emails"] = admin_email
if email:
value["emails"] = [email]

@@ -251,6 +265,7 @@ def download_callback(context: Context, data_dict: dict[str, Any]):

value = json.loads(task["value"])
emails = value.get("emails", [])
admin_email = value.get("admin_emails", [])
download_filename = value.get("download_filename")

log.info("Preparing to send email...")
@@ -263,7 +278,7 @@ def download_callback(context: Context, data_dict: dict[str, Any]):
"entity_id": entity_id,
"entity_type": "resource"
})
send_error(emails, download_filename, dataset_name)
send_error(emails+admin_email, download_filename, dataset_name)
log.error(error)


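After this change, the task-status "value" field carries both the requester's email and the dataset's organization-admin emails, and failure notifications go to the union of the two. A rough sketch of the resulting payload and recipient list (the addresses are illustrative; the real lists come from the request and from get_admin_emails_for_dataset, shown further down in this PR):

```python
import json

# Hypothetical example values.
value = {
    "emails": ["requester@example.org"],        # set only when the caller supplied an email
    "admin_emails": ["org-admin@example.org"],  # set only when the dataset's org has admins with emails
}
task_value = json.dumps(value)

# On failure, the callbacks notify requester + org admins together:
stored = json.loads(task_value)
recipients = stored.get("emails", []) + stored.get("admin_emails", [])
```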
@@ -6,7 +6,7 @@
import ckanext.s3filestore.uploader as uploader
from ckan.lib.mailer import mail_recipient
from ckan.common import config
from .datapusher_download_zip import fetch_dataset_name
from .datapusher_download_zip import fetch_dataset_name, get_admin_emails_for_dataset

import datetime
import requests
@@ -143,8 +143,15 @@ def subset_download_request(context: Context, data_dict: dict[str, Any]):
"error": "{}",
}

admin_email= get_admin_emails_for_dataset(dataset_id)
value = {}
if admin_email:
value["admin_emails"] = admin_email

if email:
task["value"] = json.dumps({"emails": [email]})
value["emails"] = [email]

task["value"] = json.dumps(value)

try:
existing_task = p.toolkit.get_action("task_status_show")(
@@ -176,6 +183,7 @@ def subset_download_request(context: Context, data_dict: dict[str, Any]):

if update_emails:
existing_task_values["emails"] = existing_task_emails
existing_task_values["admin_emails"] = admin_email
existing_task["value"] = json.dumps(existing_task_values)
p.toolkit.get_action("task_status_update")(
{"ignore_auth": True}, existing_task
@@ -260,7 +268,7 @@ def subset_download_request(context: Context, data_dict: dict[str, Any]):
"entity_id": id if provider == "datastore" else dataset_id,
"entity_type": "resource" if provider == "datastore" else "dataset"
})
send_error([email], "Subset of data", dataset_name)
send_error([email]+admin_email, "Subset of data", dataset_name)
raise p.toolkit.ValidationError(error)

try:
@@ -286,11 +294,11 @@ def subset_download_request(context: Context, data_dict: dict[str, Any]):
"entity_id": id if provider == "datastore" else dataset_id,
"entity_type": "resource" if provider == "datastore" else "dataset"
})
send_error([email], "Subset of data", dataset_name)
send_error([email]+admin_email, "Subset of data", dataset_name)
raise p.toolkit.ValidationError(error)

value = {"job_id": r.json()["id"]}

value["admin_emails"] = admin_email
if email:
value["emails"] = [email]

@@ -325,6 +333,7 @@ def subset_download_callback(context: Context, data_dict: dict[str, Any]):

value = json.loads(task["value"])
emails = value.get("emails", [])
admin_emails = value.get("admin_emails", [])
download_filename = value.get("download_filename")


@@ -336,7 +345,7 @@ def subset_download_callback(context: Context, data_dict: dict[str, Any]):
"entity_id": entity_id,
"entity_type": data_dict.get("entity_type", "resource")
})
send_error(emails, download_filename, dataset_name)
send_error(emails + admin_emails, download_filename, dataset_name)
log.error(error)


@@ -66,6 +66,31 @@ def build_download_filename(dataset_id: str, context) -> str:
return False



def get_admin_emails_for_dataset(dataset_id: str) -> list[str]:
package = p.toolkit.get_action("package_show")(
{"ignore_auth": True}, {"id": dataset_id}
)
organization = package.get("organization", None)
admin_email = []
if organization:
organization = organization.get("name")
org = p.toolkit.get_action("organization_show")(
{"ignore_auth": True}, {"id": organization, "include_users": True}
)
users = org.get("users", [])
if users:
for user in users:
if user.get("capacity") == "admin":
users_obj = p.toolkit.get_action("user_show")(
{"ignore_auth": True}, {"id": user.get("id")}
)
if users_obj.get("email", False):
admin_email.append(users_obj.get("email"))
return admin_email



def zipped_download_request(context: Context, data_dict: dict[str, Any]):
prefect_url: str = config.get("ckanext.wri.prefect_url")
deployment_name: str = config.get("ckanext.wri.datapusher_deployment_name")
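A minimal usage sketch of the new helper above, imported the same way the sibling download actions do (relative import within the extension package; the dataset id is illustrative):

```python
# Hypothetical call. Internally the helper walks
# package_show -> organization_show(include_users=True) -> user_show and keeps
# only members with capacity == "admin" that have an email on record.
from .datapusher_download_zip import get_admin_emails_for_dataset

admin_emails = get_admin_emails_for_dataset("some-dataset-id")
if admin_emails:
    # These addresses are appended to the recipient list for failure emails.
    print("Will copy download-failure emails to:", admin_emails)
```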
@@ -93,11 +118,21 @@ def zipped_download_request(context: Context, data_dict: dict[str, Any]):
"state": "submitting",
"key": filename,
"value": "{}",
"error": "{}",
"error": "{}",
}

admin_email= get_admin_emails_for_dataset(dataset_id)
log.error(f"admin_email: {admin_email}")


value = {}
if admin_email:
value["admin_emails"] = admin_email

if email:
task["value"] = json.dumps({"emails": [email]})
value["emails"] = [email]

task["value"] = json.dumps(value)

try:
existing_task = p.toolkit.get_action("task_status_show")(
@@ -129,6 +164,8 @@ def zipped_download_request(context: Context, data_dict: dict[str, Any]):

if update_emails:
existing_task_values["emails"] = existing_task_emails
if admin_email:
existing_task_values["admin_emails"] = admin_email
existing_task["value"] = json.dumps(existing_task_values)
p.toolkit.get_action("task_status_update")(
{"ignore_auth": True}, existing_task
@@ -204,7 +241,7 @@ def zipped_download_request(context: Context, data_dict: dict[str, Any]):
task["state"] = "error"
task["last_updated"] = (str(datetime.datetime.utcnow()),)
p.toolkit.get_action("task_status_update")(context, task)
send_error([email], "Zipped data")
send_error([email]+admin_email, "Zipped data")
raise p.toolkit.ValidationError(error)

try:
Expand All @@ -226,14 +263,17 @@ def zipped_download_request(context: Context, data_dict: dict[str, Any]):
task["state"] = "error"
task["last_updated"] = (str(datetime.datetime.utcnow()),)
p.toolkit.get_action("task_status_update")(context, task)
send_error([email], "Zipped data")
send_error([email] + admin_email, "Zipped data")
raise p.toolkit.ValidationError(error)

value = {"job_id": r.json()["id"]}

if email:
value["emails"] = [email]

if admin_email:
value["admin_emails"] = admin_email

value["download_filename"] = download_filename

task["value"] = json.dumps(value)
@@ -265,28 +305,32 @@ def zipped_download_callback(context: Context, data_dict: dict[str, Any]):

value = json.loads(task["value"])
emails = value.get("emails", [])
admin_email = value.get("admin_emails", [])
download_filename = value.get("download_filename")

if state == "complete":
url = data_dict.get("url")
send_email(emails, url, download_filename)
else:
send_error(emails, download_filename)
send_error(emails+admin_email, download_filename)
log.error(error)

def send_error_callback(context: Context, data_dict: dict[str, Any]):
entity_id = data_dict.get("entity_id")
task_type = data_dict.get("task_type")
key = data_dict.get("key")
task = p.toolkit.get_action("task_status_show")(
context,
{"entity_id": entity_id, "task_type": "download_zipped", "key": key},
{"entity_id": entity_id, "task_type": task_type, "key": key},
)

if not task:
raise logic.NotFound("Task not found")

value = json.loads(task["value"])
emails = value.get("emails", [])
admin_emails = value.get("admin_emails", [])
emails += admin_emails
download_filename = value.get("download_filename")
dataset_name = fetch_dataset_name({
"entity_id": entity_id,
@@ -1848,7 +1848,6 @@ def organization_show(context, data_dict):
data_dict = old_organization_show(context, data_dict)
user = context.get("user")


if not authz.is_sysadmin(user):
is_authorized = get_action("organization_list")(context, {"q": data_dict.get("name")})
if not is_authorized:
1 change: 1 addition & 0 deletions ckan-backend-dev/src/ckanext-wri/ckanext/wri/plugin.py
@@ -275,6 +275,7 @@ def get_actions(self):
"download_event_create": download_event_create,
"download_event_list": get_download_events,
"organization_create": organization_create,

"prefect_send_error_callback": send_error_callback,
}

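With the action registered, other code can resolve it through the plugins toolkit. A rough sketch of a call, assuming the payload keys mirror what datapusher/main.py sends in this PR (ids are placeholders, and the real flows pass additional fields such as the task key that are not shown here):

```python
import ckan.plugins.toolkit as tk

# Hypothetical payload; keys mirror the calls made from datapusher/main.py below.
tk.get_action("prefect_send_error_callback")(
    {"ignore_auth": True},
    {
        "task_id": "example-task-id",
        "task_type": "download_zipped",
        "url": "",
        "state": "failed",
        "entity_id": "example-dataset-id",
        "entity_type": "dataset",
    },
)
```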
8 changes: 4 additions & 4 deletions datapusher/main.py
@@ -131,7 +131,6 @@ def convert_store_to_file(
try:
logger = get_run_logger()
ckan_url = config.get("CKAN_URL")

logger.info("Fetching data...")
data = query_datastore(
api_key, ckan_url, sql, provider, rw_id, carto_account, format
@@ -169,7 +168,8 @@ def convert_store_to_file(
"prefect_send_error_callback",
{
"task_id": task_id,
"url": url,
"url": "",
"task_type": "download",
"state": "complete",
"entity_id": resource_id,
"entity_type": "resource",
@@ -195,7 +195,6 @@ async def download_subset_of_data(
try:
logger = get_run_logger()
ckan_url = config.get("CKAN_URL")

logger.info("Fetching data...")
data = []
if provider == "datastore":
@@ -237,6 +236,7 @@ async def download_subset_of_data(
"prefect_send_error_callback",
{
"task_id": task_id,
"task_type": "download_subset",
"url": "",
"state": "failed",
"entity_id": id if provider == "datastore" else dataset_id,
@@ -260,7 +260,6 @@ async def download_resources_zipped(
try:
logger = get_run_logger()
ckan_url = config.get("CKAN_URL")
print("Filename", filename)
with tempfile.TemporaryDirectory() as temp_dir:
tasks = await download_keys(keys, filename, temp_dir)
data = list(tasks)
@@ -295,6 +294,7 @@ async def download_resources_zipped(
{
"task_id": task_id,
"url": "",
"task_type": "download_zipped",
"state": "failed",
"entity_id": dataset_id,
"entity_type": "dataset",
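The net effect of the datapusher changes is that each flow now reports its own task type to the error callback instead of everything being looked up as "download_zipped". A small summary sketch, with the task-type strings taken from the calls above:

```python
# Task types reported by prefect_send_error_callback, as set in the flows above.
FLOW_TASK_TYPES = {
    "convert_store_to_file": "download",
    "download_subset_of_data": "download_subset",
    "download_resources_zipped": "download_zipped",
}
```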
5 changes: 1 addition & 4 deletions deployment/ckan/Dockerfile
@@ -32,6 +32,7 @@ RUN chmod 777 ${APP_DIR}/src/ckanext-s3filestore
# Required by shapely dependency
RUN apk --update add build-base libxslt-dev python3-dev
RUN apk update
RUN apk add git=2.40.4-r0
RUN apk add postgresql15-client=15.11-r0

RUN pip install --force-reinstall -v "Pillow==11.0.0"
@@ -96,10 +97,6 @@ RUN ckan config-tool ${CKAN_INI} "ckan.plugins = ${CKAN__PLUGINS}"
RUN if [ "$GITHUB_ACTIONS" = "true" ]; then ckan config-tool ${CKAN_INI} "ckan.cors.origin_allow_all = True"; fi


RUN if [ "$GITHUB_ACTIONS" = "true" ]; then \
ckan config-tool ${CKAN_INI} "ckan.cors.origin_allow_all = True"; \
fi

USER root

RUN chown -R ckan:ckan ${APP_DIR}/src/