Commit 402e595

Review1 (#181)
* my workflow
* example templates for manuscript.
* manuscript info
* update citations
* cleanup text
* cite
* update readme
* contribute
* add website badge
* update web link
* Update README.md
* Update README.md
* add video link
* auto update
* utility files device specific
* move the pdf generator to workflows
* test error in paper.bib
* test citation style
* add citations
* add a few more citations
* try without space
* add all
* no spaces allowed in citation names
* add demo images
* Update paper.md
* give images some space
* image captions
* add mention of wade
* update authors and acknowledgements
* Updated with Dan's Recs
* Mary Comments #69 (comment)
* SteveO comments #69 (review)
* Walter's comments #69 (review)
* Create config.yml
* Create bug.yml
* Create feature.yml
* hyperlink
* Update bug.yml
* Update feature.yml
* Added in Kristiina and Kris's comments
* Update README.md
* #69 (comment)
* #69 (comment)
* #69 (comment)
* #69 (comment)
* #69 (comment)
* #69 (comment)
* add kris's comments
* add WSL2 link
* Update paper.md
* add acknowledgements
* #69 (comment)
* #69 (comment)
* #69 (comment)
* add submitted badge
* update dois
* Update paper.md #122
* Update paper.md add Elizabeth to ack
* Update paper.md add funder possibility lab
* Update paper.md add funder.
* Update README.md
* add r code for analyzing data
* remove unnecessary code.
* trying to fix the unexpected period issue, not sure where it is coming from.
* revert bib to test
* add dois
* update paper acknowledgments and links.
* add comments.
* remove empty line
* Update about.vue update about
* Update about.vue add tutorial
* Update README.md update video
* Update paper.md update video
1 parent 4db6bfa commit 402e595

File tree

6 files changed: +76 -19 lines changed


README.md (+5 -1)

@@ -1,6 +1,7 @@
 # Trash AI: Web application for serverless image classification of trash
 
 [![Website](https://img.shields.io/badge/Web-TrashAI.org-blue)](https://www.trashai.org)
+[![status](https://joss.theoj.org/papers/6ffbb0f89e6c928dad6908a02639789b/status.svg)](https://joss.theoj.org/papers/6ffbb0f89e6c928dad6908a02639789b)
 
 ### Project Information
 
@@ -14,7 +15,7 @@ Trash AI is a web application where users can upload photos of litter, which wil
 
 #### Demo
 
-[![image](https://user-images.githubusercontent.com/26821843/188515526-33e1196b-6830-4187-8fe4-e68b2bd4019e.png)](https://youtu.be/HHrjUpQynUM)
+[![image](https://user-images.githubusercontent.com/26821843/188515526-33e1196b-6830-4187-8fe4-e68b2bd4019e.png)](https://youtu.be/u0DxGrbPOC0)
 
 ## Deployment
 
@@ -71,6 +72,9 @@ docker rm -v $id
 
 - Runs the complex stuff so you don't have to.
 
+### Tests
+Instructions for automated and manual tests [here](https://github.com/code4sac/trash-ai/tree/production/frontend/__tests__).
+
 ## Contribute
 
 We welcome contributions of all kinds.

docs/localdev.md (+1 -1)

@@ -16,12 +16,12 @@ These values can be adjusted by editing the localdev env file [.env](../localdev
 It's suggested you work in branch `local` by creating your own local branch when developing
 Pushing / merging PR's to any branches with a prefix of `aws/` will trigger deployment actions
 For full functionality you will want to get a Google Maps API key and name it VITE_GOOGLE_MAPS_API_KEY, but it is not required
-=======
 
 Pushing / merging PR's to any branches with a prefix of `aws/` will
 trigger deployment actions, when developing locally, create a new branch
 and submit a pull request to `aws/trashai-staging`
 
+
 ---
 # Set up
 
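A side note on the Maps key mentioned in the hunk above: the doc names the variable (VITE_GOOGLE_MAPS_API_KEY) and points at the localdev `.env` file, but does not show the entry itself. A minimal sketch of what it might look like, assuming a standard dotenv format; the value is a placeholder and the exact path is whatever the doc links to, neither is part of this commit:

```
# Hypothetical entry in the localdev .env file referenced by docs/localdev.md
VITE_GOOGLE_MAPS_API_KEY=<your-google-maps-api-key>
```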

frontend/src/views/about.vue (+6 -3)

@@ -23,6 +23,10 @@
 To get started, visit the Upload Tab or
 <a href="/uploads/0">click here</a>.
 </p>
+<h2>Tutorial</h2>
+<p>
+<iframe width="560" height="315" src="https://www.youtube.com/embed/u0DxGrbPOC0" title="YouTube video player" frameborder="0" allow="accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share" allowfullscreen></iframe>
+</p>
 
 <h2>What is it?</h2>
 <p>
@@ -54,9 +58,8 @@
 <h2>Disclaimer about uploaded images</h2>
 <p>
 The current version of Trash AI and the model we are using is just a
-start! When you upload an image, we are storing the image and the
-classification in an effort to expand the trash dataset and improve
-the model over time.
+start! The tool works best for images of individual pieces of trash imaged less than 1 meter away from the camera.
+We are looking for collaborators who can help us improve this project.
 </p>
 
 <h2>Reporting issues and improvements</h2>

notebooks/data_reader/data_reader.R (+50 -0)

@@ -0,0 +1,50 @@
+#Working directory ----
+setwd("notebooks/data_reader") #Change this to your working directory
+
+#Libraries ----
+library(rio)
+library(jsonlite)
+library(ggplot2)
+library(data.table)
+
+# Data import ----
+json_list <- import_list("example_data_download2.zip")
+
+# Get path of the summary table.
+summary_metadata <- names(json_list)[grepl("summary.json", names(json_list))]
+
+# Get path of the image metadata.
+image_metadata <- names(json_list)[!grepl("(.jpg)|(.png)|(.tif)|(schema)|(summary)", names(json_list))][-1]
+
+# Filter the summary data.
+summary_json <- json_list[[summary_metadata]]
+
+# Flatten the summary data.
+flattened_summary <- data.frame(name = summary_json$detected_objects$name,
+                                count = summary_json$detected_objects$count)
+# Filter the image data.
+image_json <- json_list[image_metadata]
+
+# Flatten the image data.
+flattened_images <- lapply(1:length(image_json), function(i){
+  print(i)
+  data.frame(hash = image_json[[i]]$hash,
+             filename = image_json[[i]]$filename,
+             datetime = if(!is.null(image_json[[i]]$exifdata$DateTimeOriginal)){image_json[[i]]$exifdata$DateTimeOriginal} else{NA},
+             latitude = if(!is.null(image_json[[i]]$exifdata$GPSLatitude)){image_json[[i]]$exifdata$GPSLatitude} else{NA},
+             longitude = if(!is.null(image_json[[i]]$exifdata$GPSLongitude)){image_json[[i]]$exifdata$GPSLongitude} else{NA},
+             score = if(!is.null(image_json[[i]]$metadata$score)){image_json[[i]]$metadata$score} else{NA},
+             label = if(!is.null(image_json[[i]]$metadata$label)){image_json[[i]]$metadata$label} else{NA})
+}) |>
+  rbindlist()
+
+# Test equivalence in counts.
+nrow(flattened_images[!is.na(flattened_images$label),]) == sum(flattened_summary$count)
+
+# Figure creation ----
+ggplot(flattened_summary, aes(y = reorder(name, count), x = count, fill = name)) +
+  geom_bar(stat = "identity") +
+  theme_classic(base_size = 15) +
+  theme(legend.position = "none") +
+  labs(x = "Count", y = "Type")
+
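
For anyone trying the new reader script, here is a short follow-on sketch that is not part of the commit: it assumes data_reader.R above has just been run, only reuses the objects the script creates (flattened_images, flattened_summary), and the CSV file names are placeholders.

```r
# Assumes the data_reader.R script above has been sourced, so flattened_images
# (a data.table built by rbindlist) and flattened_summary already exist.
library(data.table)

# Per-label detection counts from the image-level table; these should match
# the counts in flattened_summary, mirroring the script's equivalence check.
label_counts <- flattened_images[!is.na(label), .N, by = label][order(-N)]
print(label_counts)

# Export the flattened tables for use outside R (placeholder file names).
fwrite(flattened_images, "trash_ai_images.csv")
fwrite(flattened_summary, "trash_ai_summary.csv")
```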

paper.bib (+6 -6)

@@ -88,7 +88,8 @@ @ARTICLE{Hapich:2022
 number = 1,
 pages = "15",
 month = jun,
-year = 2022
+year = 2022,
+doi = "10.1186/s43591-022-00035-1"
 }
 
 @misc{Waterboards:2018,
@@ -108,11 +109,10 @@ @article{vanLieshout:2020
 number = {8},
 pages = {e2019EA000960},
 keywords = {plastic pollution, object detection, automated monitoring, deep learning, artificial intelligence, river plastic},
-doi = {https://doi.org/10.1029/2019EA000960},
+doi = {10.1029/2019EA000960},
 url = {https://agupubs.onlinelibrary.wiley.com/doi/abs/10.1029/2019EA000960},
 eprint = {https://agupubs.onlinelibrary.wiley.com/doi/pdf/10.1029/2019EA000960},
 note = {e2019EA000960 10.1029/2019EA000960},
-abstract = {Abstract Quantifying plastic pollution on surface water is essential to understand and mitigate the impact of plastic pollution to the environment. Current monitoring methods such as visual counting are labor intensive. This limits the feasibility of scaling to long-term monitoring at multiple locations. We present an automated method for monitoring plastic pollution that overcomes this limitation. Floating macroplastics are detected from images of the water surface using deep learning. We perform an experimental evaluation of our method using images from bridge-mounted cameras at five different river locations across Jakarta, Indonesia. The four main results of the experimental evaluation are as follows. First, we realize a method that obtains a reliable estimate of plastic density (68.7\% precision). Our monitoring method successfully distinguishes plastics from environmental elements, such as water surface reflection and organic waste. Second, when trained on one location, the method generalizes well to new locations with relatively similar conditions without retraining (≈50\% average precision). Third, generalization to new locations with considerably different conditions can be boosted by retraining on only 50 objects of the new location (improving precision from ≈20\% to ≈42\%). Fourth, our method matches visual counting methods and detects ≈35\% more plastics, even more so during periods of plastic transport rates of above 10 items per meter per minute. Taken together, these results demonstrate that our method is a promising way of monitoring plastic pollution. By extending the variety of the data set the monitoring method can be readily applied at a larger scale.},
 year = {2020}
 }
 
@@ -145,7 +145,8 @@ @ARTICLE{Lynch:2018
 number = 1,
 pages = "6",
 month = jun,
-year = 2018
+year = 2018,
+doi = "10.1186/s40965-018-0050-y"
 }
 
 
@@ -176,7 +177,7 @@ @article{Majchrowska:2022
 pages = {274-284},
 year = {2022},
 issn = {0956-053X},
-doi = {https://doi.org/10.1016/j.wasman.2021.12.001},
+doi = {10.1016/j.wasman.2021.12.001},
 url = {https://www.sciencedirect.com/science/article/pii/S0956053X21006474},
 author = {Sylwia Majchrowska and Agnieszka Mikołajczyk and Maria Ferlin and Zuzanna Klawikowska and Marta A. Plantykow and Arkadiusz Kwasigroch and Karol Majek},
 keywords = {Object detection, Semi-supervised learning, Waste classification benchmarks, Waste detection benchmarks, Waste localization, Waste recognition},
@@ -193,4 +194,3 @@ @misc{Proença:2020
 year = {2020},
 copyright = {arXiv.org perpetual, non-exclusive license}
 }
-
