
Creating pull request for 10.21105.joss.03652 #2842


Merged
merged 2 commits on Dec 21, 2021
131 changes: 131 additions & 0 deletions joss.03652/10.21105.joss.03652.crossref.xml
@@ -0,0 +1,131 @@
<?xml version="1.0" encoding="UTF-8"?>
<doi_batch xmlns="http://www.crossref.org/schema/4.4.0" xmlns:ai="http://www.crossref.org/AccessIndicators.xsd" xmlns:rel="http://www.crossref.org/relations.xsd" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" version="4.4.0" xsi:schemaLocation="http://www.crossref.org/schema/4.4.0 http://www.crossref.org/schemas/crossref4.4.0.xsd">
<head>
<doi_batch_id>76f42d3b7d43e95bee961ee9f4b1be8d</doi_batch_id>
<timestamp>20211221171744</timestamp>
<depositor>
<depositor_name>JOSS Admin</depositor_name>
<email_address>[email protected]</email_address>
</depositor>
<registrant>The Open Journal</registrant>
</head>
<body>
<journal>
<journal_metadata>
<full_title>Journal of Open Source Software</full_title>
<abbrev_title>JOSS</abbrev_title>
<issn media_type="electronic">2475-9066</issn>
<doi_data>
<doi>10.21105/joss</doi>
<resource>https://joss.theoj.org</resource>
</doi_data>
</journal_metadata>
<journal_issue>
<publication_date media_type="online">
<month>12</month>
<year>2021</year>
</publication_date>
<journal_volume>
<volume>6</volume>
</journal_volume>
<issue>68</issue>
</journal_issue>
<journal_article publication_type="full_text">
<titles>
<title>TX$^2$: Transformer eXplainability and eXploration</title>
</titles>
<contributors>
<person_name sequence="first" contributor_role="author">
<given_name>Nathan</given_name>
<surname>Martindale</surname>
<ORCID>http://orcid.org/0000-0002-5036-5433</ORCID>
</person_name>
<person_name sequence="additional" contributor_role="author">
<given_name>Scott</given_name>
<surname>Stewart</surname>
<ORCID>http://orcid.org/0000-0003-4320-5818</ORCID>
</person_name>
</contributors>
<publication_date>
<month>12</month>
<day>21</day>
<year>2021</year>
</publication_date>
<pages>
<first_page>3652</first_page>
</pages>
<publisher_item>
<identifier id_type="doi">10.21105/joss.03652</identifier>
</publisher_item>
<ai:program name="AccessIndicators">
<ai:license_ref applies_to="vor">http://creativecommons.org/licenses/by/4.0/</ai:license_ref>
<ai:license_ref applies_to="am">http://creativecommons.org/licenses/by/4.0/</ai:license_ref>
<ai:license_ref applies_to="tdm">http://creativecommons.org/licenses/by/4.0/</ai:license_ref>
</ai:program>
<rel:program>
<rel:related_item>
<rel:description>Software archive</rel:description>
<rel:inter_work_relation relationship-type="references" identifier-type="doi">“https://doi.org/10.5281/zenodo.5796089”</rel:inter_work_relation>
</rel:related_item>
<rel:related_item>
<rel:description>GitHub review issue</rel:description>
<rel:inter_work_relation relationship-type="hasReview" identifier-type="uri">https://github.com/openjournals/joss-reviews/issues/3652</rel:inter_work_relation>
</rel:related_item>
</rel:program>
<doi_data>
<doi>10.21105/joss.03652</doi>
<resource>https://joss.theoj.org/papers/10.21105/joss.03652</resource>
<collection property="text-mining">
<item>
<resource mime_type="application/pdf">https://joss.theoj.org/papers/10.21105/joss.03652.pdf</resource>
</item>
</collection>
</doi_data>
<citation_list>
<citation key="ref1">
<unstructured_citation>Transformer eXplainability and eXploration , Martindale, Nathan and Stewart, Scott L., The Transformer eXplainability and eXploration library is intended to aid in the explorability and explainability of transformer classification networks, or transformer language models with sequence classification heads. The basic function of this library is to take a trained transformer and test/train dataset and produce an ipywidget dashboard which can be displayed in a jupyter notebook or in jupyter lab., [Computer Software] https://doi.org/10.11578/dc.20210129.1, 2021, jan, 1</unstructured_citation>
</citation>
<citation key="ref2">
<doi>10.18653/v1/2020.emnlp-demos.15</doi>
</citation>
<citation key="ref3">
<unstructured_citation>Positioning and Power in Academic Publishing: Players, Agents and Agendas, Loizides, Fernando and Scmidt, Birgit, Jupyter Notebooks - a publishing format for reproducible computational workflows, Kluyver, Thomas and Ragan-Kelley, Benjamin and Pérez, Fernando and Granger, Brian and Bussonnier, Matthias and Frederic, Jonathan and Kelley, Kyle and Hamrick, Jessica and Grout, Jason and Corlay, Sylvain and Ivanov, Paul and Avila, Damián and Abdalla, Safia and Willing, Carol and development team, Jupyter, IOS Press, Netherlands, 2016, 87–90, https://eprints.soton.ac.uk/403913/, It is increasingly necessary for researchers in all fields to write computer code, and in order to reproduce research results, it is important that this code is published. We present Jupyter notebooks, a document format for publishing code, results and explanations in a form that is both readable and executable. We discuss various tools and use cases for notebook documents.</unstructured_citation>
</citation>
<citation key="ref4">
<unstructured_citation>PyTorch: An Imperative Style, High-Performance Deep Learning Library, Paszke, Adam and Gross, Sam and Massa, Francisco and Lerer, Adam and Bradbury, James and Chanan, Gregory and Killeen, Trevor and Lin, Zeming and Gimelshein, Natalia and Antiga, Luca and Desmaison, Alban and Kopf, Andreas and Yang, Edward and DeVito, Zachary and Raison, Martin and Tejani, Alykhan and Chilamkurthy, Sasank and Steiner, Benoit and Fang, Lu and Bai, Junjie and Chintala, Soumith, Advances in Neural Information Processing Systems 32, Wallach, H. and Larochelle, H. and Beygelzimer, A. and d’ Alché-Buc, F. and Fox, E. and Garnett, R., 8024–8035, 2019, Curran Associates Inc., http://papers.neurips.cc/paper/9015-pytorch-an-imperative-style-high-performance-deep-learning-library.pdf</unstructured_citation>
</citation>
<citation key="ref5">
<unstructured_citation>Transformers: State-of-the-Art Natural Language Processing, Wolf, Thomas and Debut, Lysandre and Sanh, Victor and Chaumond, Julien and Delangue, Clement and Moi, Anthony and Cistac, Pierric and Rault, Tim and Louf, Rémi and Funtowicz, Morgan and Davison, Joe and Shleifer, Sam and von Platen, Patrick and Ma, Clara and Jernite, Yacine and Plu, Julien and Xu, Canwen and Scao, Teven Le and Gugger, Sylvain and Drame, Mariama and Lhoest, Quentin and Rush, Alexander M., Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing: System Demonstrations, oct, 2020, Online, Association for Computational Linguistics, https://www.aclweb.org/anthology/2020.emnlp-demos.6, 38–45, 10</unstructured_citation>
</citation>
<citation key="ref6">
<doi>10.21105/joss.00861</doi>
</citation>
<citation key="ref7">
<doi>10.18653/v1/P19-3007</doi>
</citation>
<citation key="ref8">
<unstructured_citation>Attention is not explanation, Jain, Sarthak and Wallace, Byron C, arXiv preprint arXiv:1902.10186, 2019</unstructured_citation>
</citation>
<citation key="ref9">
<doi>10.1145/3366424.3383542</doi>
</citation>
<citation key="ref10">
<unstructured_citation>Scikit-learn: Machine Learning in Python, Pedregosa, F. and Varoquaux, G. and Gramfort, A. and Michel, V. and Thirion, B. and Grisel, O. and Blondel, M. and Prettenhofer, P. and Weiss, R. and Dubourg, V. and Vanderplas, J. and Passos, A. and Cournapeau, D. and Brucher, M. and Perrot, M. and Duchesnay, E., Journal of Machine Learning Research, 12, 2825–2830, 2011</unstructured_citation>
</citation>
<citation key="ref11">
<unstructured_citation>Python Software Foundation, Black - The Uncompromising Code Formatter, https://github.com/psf/black, 2021</unstructured_citation>
</citation>
<citation key="ref12">
<unstructured_citation>Project Jupyter Contributors, ipywidgets: Interactive HTML Widgets, https://github.com/jupyter-widgets/ipywidgets, 2021</unstructured_citation>
</citation>
<citation key="ref13">
<unstructured_citation>Sphinx Team, Sphinx, https://github.com/sphinx-doc/sphinx, 2021</unstructured_citation>
</citation>
<citation key="ref14">
<unstructured_citation>Attention is all you need, Vaswani, Ashish and Shazeer, Noam and Parmar, Niki and Uszkoreit, Jakob and Jones, Llion and Gomez, Aidan N and Kaiser, Lukasz and Polosukhin, Illia, arXiv preprint arXiv:1706.03762, 2017</unstructured_citation>
</citation>
</citation_list>
</journal_article>
</journal>
</body>
</doi_batch>
Binary file added joss.03652/10.21105.joss.03652.pdf
Binary file not shown.
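
For reference, a deposit file like the one added above can be sanity-checked locally before it is submitted to Crossref. The snippet below is a minimal sketch, not part of this pull request, assuming Python's standard-library xml.etree module and the file path shown in the diff; it only confirms that the XML parses and echoes a few of the required journal_article fields.

# Minimal sketch: parse the Crossref deposit and print a few key fields.
# The file path is assumed from the diff above; this is not part of the PR.
import xml.etree.ElementTree as ET

# Default namespace declared at the top of the deposit file.
NS = {"cr": "http://www.crossref.org/schema/4.4.0"}

tree = ET.parse("joss.03652/10.21105.joss.03652.crossref.xml")
root = tree.getroot()

doi = root.find(".//cr:journal_article/cr:doi_data/cr:doi", NS)
title = root.find(".//cr:journal_article/cr:titles/cr:title", NS)
citations = root.findall(".//cr:citation_list/cr:citation", NS)

print("article DOI:", doi.text if doi is not None else "missing")
print("title:      ", title.text if title is not None else "missing")
print("citations:  ", len(citations))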