Jonathan Frankle
2020 – today

- 2024
- [c21] Aaron Gokaslan, A. Feder Cooper, Jasmine Collins, Landan Seguin, Austin Jacobson, Mihir Patel, Jonathan Frankle, Cory Stephenson, Volodymyr Kuleshov: Common Canvas: Open Diffusion Models Trained on Creative-Commons Images. CVPR 2024: 8250-8260
- [c20] Zachary Ankner, Naomi Saphra, Davis W. Blalock, Jonathan Frankle, Matthew L. Leavitt: Dynamic Masking Rate Schedules for MLM Pretraining. EACL (2) 2024: 477-487
- [c19] Nikhil Sardana, Jacob Portes, Sasha Doubov, Jonathan Frankle: Beyond Chinchilla-Optimal: Accounting for Inference in Language Model Scaling Laws. ICML 2024
- [i38] Nikhil Sardana, Jonathan Frankle: Beyond Chinchilla-Optimal: Accounting for Inference in Language Model Scaling Laws. CoRR abs/2401.00448 (2024)
- [i37] Devin Kwok, Nikhil Anand, Jonathan Frankle, Gintare Karolina Dziugaite, David Rolnick: Dataset Difficulty and the Role of Inductive Bias. CoRR abs/2401.01867 (2024)
- [i36] Elliot Bolton, Abhinav Venigalla, Michihiro Yasunaga, David Hall, Betty Xiong, Tony Lee, Roxana Daneshjou, Jonathan Frankle, Percy Liang, Michael Carbin, Christopher D. Manning: BioMedLM: A 2.7B Parameter Language Model Trained On Biomedical Text. CoRR abs/2403.18421 (2024)
- [i35] Dan Biderman, Jose Javier Gonzalez Ortiz, Jacob Portes, Mansheej Paul, Philip Greengard, Connor Jennings, Daniel King, Sam Havens, Vitaliy Chiley, Jonathan Frankle, Cody Blakeney, John P. Cunningham: LoRA Learns Less and Forgets Less. CoRR abs/2405.09673 (2024)
- [i34] Cody Blakeney, Mansheej Paul, Brett W. Larsen, Sean Owen, Jonathan Frankle: Does your data spark joy? Performance gains from domain upsampling at the end of training. CoRR abs/2406.03476 (2024)
- 2023
- [b1] Jonathan Frankle: The Lottery Ticket Hypothesis: On Sparse, Trainable Neural Networks. MIT, USA, 2023
- [c18] Mansheej Paul, Feng Chen, Brett W. Larsen, Jonathan Frankle, Surya Ganguli, Gintare Karolina Dziugaite: Unmasking the Lottery Ticket Hypothesis: What's Encoded in a Winning Ticket's Mask? ICLR 2023
- [c17] Jacob Portes, Alexander Trott, Sam Havens, Daniel King, Abhinav Venigalla, Moin Nadeem, Nikhil Sardana, Daya Khudia, Jonathan Frankle: MosaicBERT: A Bidirectional Encoder Optimized for Fast Pretraining. NeurIPS 2023
- [i33] Xingyu Liu, Alex Leonardi, Lu Yu, Chris Gilmer-Hill, Matthew L. Leavitt, Jonathan Frankle: Knowledge Distillation for Efficient Sequences of Training Runs. CoRR abs/2303.06480 (2023)
- [i32] Zachary Ankner, Naomi Saphra, Davis W. Blalock, Jonathan Frankle, Matthew L. Leavitt: Dynamic Masking Rate Schedules for MLM Pretraining. CoRR abs/2305.15096 (2023)
- [i31] Aaron Gokaslan, A. Feder Cooper, Jasmine Collins, Landan Seguin, Austin Jacobson, Mihir Patel, Jonathan Frankle, Cory Stephenson, Volodymyr Kuleshov: CommonCanvas: An Open Diffusion Model Trained with Creative-Commons Images. CoRR abs/2310.16825 (2023)
- [i30] A. Feder Cooper, Katherine Lee, James Grimmelmann, Daphne Ippolito, Christopher Callison-Burch, Christopher A. Choquette-Choo, Niloofar Mireshghallah, Miles Brundage, David Mimno, Madiha Zahrah Choksi, Jack M. Balkin, Nicholas Carlini, Christopher De Sa, Jonathan Frankle, Deep Ganguli, Bryant Gipson, Andres Guadamuz, Swee Leng Harris, Abigail Z. Jacobs, Elizabeth Joh, Gautam Kamath, Mark Lemley, Cass Matthews, Christine McLeavey, Corynne McSherry, Milad Nasr, Paul Ohm, Adam Roberts, Tom Rubin, Pamela Samuelson, Ludwig Schubert, Kristen Vaccaro, Luis Villa, Felix Wu, Elana Zeide: Report of the 1st Workshop on Generative AI and Law. CoRR abs/2311.06477 (2023)
- [i29] Jacob Portes, Alex Trott, Sam Havens, Daniel King, Abhinav Venigalla, Moin Nadeem, Nikhil Sardana, Daya Khudia, Jonathan Frankle: MosaicBERT: A Bidirectional Encoder Optimized for Fast Pretraining. CoRR abs/2312.17482 (2023)
- 2022
- [c16] A. Feder Cooper, Jonathan Frankle, Christopher De Sa: Non-Determinism and the Lawlessness of Machine Learning Code. CSLAW 2022: 1-8
- [c15] Tiffany J. Vlaar, Jonathan Frankle: What Can Linear Interpolation of Neural Network Loss Landscapes Tell Us? ICML 2022: 22325-22341
- [c14] Tian Jin, Michael Carbin, Daniel M. Roy, Jonathan Frankle, Gintare Karolina Dziugaite: Pruning's Effect on Generalization Through the Lens of Training and Regularization. NeurIPS 2022
- [c13] Mansheej Paul, Brett W. Larsen, Surya Ganguli, Jonathan Frankle, Gintare Karolina Dziugaite: Lottery Tickets on a Data Diet: Finding Initializations with Sparse Trainable Networks. NeurIPS 2022
- [i28] Andi Peng, Jessica Zosa Forde, Yonadav Shavit, Jonathan Frankle: Strengthening Subcommunities: Towards Sustainable Growth in AI Research. CoRR abs/2204.08377 (2022)
- [i27] Jacob Portes, Davis W. Blalock, Cory Stephenson, Jonathan Frankle: Fast Benchmarking of Accuracy vs. Training Time with Cyclic Learning Rates. CoRR abs/2206.00832 (2022)
- [i26] Mansheej Paul, Brett W. Larsen, Surya Ganguli, Jonathan Frankle, Gintare Karolina Dziugaite: Lottery Tickets on a Data Diet: Finding Initializations with Sparse Trainable Networks. CoRR abs/2206.01278 (2022)
- [i25] A. Feder Cooper, Jonathan Frankle, Christopher De Sa: Non-Determinism and the Lawlessness of ML Code. CoRR abs/2206.11834 (2022)
- [i24] Mansheej Paul, Feng Chen, Brett W. Larsen, Jonathan Frankle, Surya Ganguli, Gintare Karolina Dziugaite: Unmasking the Lottery Ticket Hypothesis: What's Encoded in a Winning Ticket's Mask? CoRR abs/2210.03044 (2022)
- [i23] Tian Jin, Michael Carbin, Daniel M. Roy, Jonathan Frankle, Gintare Karolina Dziugaite: Pruning's Effect on Generalization Through the Lens of Training and Regularization. CoRR abs/2210.13738 (2022)
- [i22] Cody Blakeney, Jessica Zosa Forde, Jonathan Frankle, Ziliang Zong, Matthew L. Leavitt: Reduce, Reuse, Recycle: Improving Training Efficiency with Distillation. CoRR abs/2211.00683 (2022)
- [i21] Zachary Ankner, Alex Renda, Gintare Karolina Dziugaite, Jonathan Frankle, Tian Jin: The Effect of Data Dimensionality on Neural Network Prunability. CoRR abs/2212.00291 (2022)
- 2021
- [c12] Tianlong Chen, Jonathan Frankle, Shiyu Chang, Sijia Liu, Yang Zhang, Michael Carbin, Zhangyang Wang: The Lottery Tickets Hypothesis for Supervised and Self-Supervised Pre-Training in Computer Vision Models. CVPR 2021: 16306-16316
- [c11] Jonathan Frankle, Gintare Karolina Dziugaite, Daniel M. Roy, Michael Carbin: Pruning Neural Networks at Initialization: Why Are We Missing the Mark? ICLR 2021
- [c10] Jonathan Frankle, David J. Schwab, Ari S. Morcos: Training BatchNorm and Only BatchNorm: On the Expressive Power of Random Features in CNNs. ICLR 2021
- [c9] Jonathan S. Rosenfeld, Jonathan Frankle, Michael Carbin, Nir Shavit: On the Predictability of Pruning Across Scales. ICML 2021: 9075-9083
- [i20] Rajiv Movva, Jonathan Frankle, Michael Carbin: Studying the Consistency and Composability of Lottery Ticket Pruning Masks. CoRR abs/2104.14753 (2021)
- [i19] Tiffany Vlaar, Jonathan Frankle: What can linear interpolation of neural network loss landscapes tell us? CoRR abs/2106.16004 (2021)
- [i18] Jose Javier Gonzalez Ortiz, Jonathan Frankle, Mike Rabbat, Ari S. Morcos, Nicolas Ballas: Trade-offs of Local SGD at Scale: An Empirical Study. CoRR abs/2110.08133 (2021)
- 2020
- [c8] Jonathan Frankle, David J. Schwab, Ari S. Morcos: The Early Phase of Neural Network Training. ICLR 2020
- [c7] Alex Renda, Jonathan Frankle, Michael Carbin: Comparing Rewinding and Fine-tuning in Neural Network Pruning. ICLR 2020
- [c6] Jonathan Frankle, Gintare Karolina Dziugaite, Daniel M. Roy, Michael Carbin: Linear Mode Connectivity and the Lottery Ticket Hypothesis. ICML 2020: 3259-3269
- [c5] Davis W. Blalock, Jose Javier Gonzalez Ortiz, Jonathan Frankle, John V. Guttag: What is the State of Neural Network Pruning? MLSys 2020
- [c4] Tianlong Chen, Jonathan Frankle, Shiyu Chang, Sijia Liu, Yang Zhang, Zhangyang Wang, Michael Carbin: The Lottery Ticket Hypothesis for Pre-trained BERT Networks. NeurIPS 2020
- [i17] Jonathan Frankle, David J. Schwab, Ari S. Morcos: The Early Phase of Neural Network Training. CoRR abs/2002.10365 (2020)
- [i16] Jonathan Frankle, David J. Schwab, Ari S. Morcos: Training BatchNorm and Only BatchNorm: On the Expressive Power of Random Features in CNNs. CoRR abs/2003.00152 (2020)
- [i15] Alex Renda, Jonathan Frankle, Michael Carbin: Comparing Rewinding and Fine-tuning in Neural Network Pruning. CoRR abs/2003.02389 (2020)
- [i14] Davis W. Blalock, Jose Javier Gonzalez Ortiz, Jonathan Frankle, John V. Guttag: What is the State of Neural Network Pruning? CoRR abs/2003.03033 (2020)
- [i13] Riyadh Baghdadi, Abdelkader Nadir Debbagh, Kamel Abdous, Fatima-Zohra Benhamida, Alex Renda, Jonathan Elliott Frankle, Michael Carbin, Saman P. Amarasinghe: TIRAMISU: A Polyhedral Compiler for Dense and Sparse Deep Learning. CoRR abs/2005.04091 (2020)
- [i12] Jonathan S. Rosenfeld, Jonathan Frankle, Michael Carbin, Nir Shavit: On the Predictability of Pruning Across Scales. CoRR abs/2006.10621 (2020)
- [i11] Tianlong Chen, Jonathan Frankle, Shiyu Chang, Sijia Liu, Yang Zhang, Zhangyang Wang, Michael Carbin: The Lottery Ticket Hypothesis for Pre-trained BERT Networks. CoRR abs/2007.12223 (2020)
- [i10] Jonathan Frankle, Gintare Karolina Dziugaite, Daniel M. Roy, Michael Carbin: Pruning Neural Networks at Initialization: Why are We Missing the Mark? CoRR abs/2009.08576 (2020)
- [i9] Tiffany Tianhui Cai, Jonathan Frankle, David J. Schwab, Ari S. Morcos: Are all negatives created equal in contrastive instance discrimination? CoRR abs/2010.06682 (2020)
- [i8] Jonathan Frankle: Revisiting "Qualitatively Characterizing Neural Network Optimization Problems". CoRR abs/2012.06898 (2020)
- [i7] Tianlong Chen, Jonathan Frankle, Shiyu Chang, Sijia Liu, Yang Zhang, Michael Carbin, Zhangyang Wang: The Lottery Tickets Hypothesis for Supervised and Self-supervised Pre-training in Computer Vision Models. CoRR abs/2012.06908 (2020)
2010 – 2019

- 2019
- [c3] Jonathan Frankle, Michael Carbin: The Lottery Ticket Hypothesis: Finding Sparse, Trainable Neural Networks. ICLR 2019
- [i6] Jonathan Frankle, Gintare Karolina Dziugaite, Daniel M. Roy, Michael Carbin: The Lottery Ticket Hypothesis at Scale. CoRR abs/1903.01611 (2019)
- [i5] Jonathan Frankle, David Bau: Dissecting Pruned Neural Networks. CoRR abs/1907.00262 (2019)
- [i4] Jonathan Frankle, Gintare Karolina Dziugaite, Daniel M. Roy, Michael Carbin: Linear Mode Connectivity and the Lottery Ticket Hypothesis. CoRR abs/1912.05671 (2019)
- 2018
- [c2] Jonathan Frankle, Sunoo Park, Daniel Shaar, Shafi Goldwasser, Daniel J. Weitzner: Practical Accountability of Secret Processes. USENIX Security Symposium 2018: 657-674
- [i3] Jonathan Frankle, Michael Carbin: The Lottery Ticket Hypothesis: Training Pruned Neural Networks. CoRR abs/1803.03635 (2018)
- [i2] Jonathan Frankle, Sunoo Park, Daniel Shaar, Shafi Goldwasser, Daniel J. Weitzner: Practical Accountability of Secret Processes. IACR Cryptol. ePrint Arch. 2018: 697 (2018)
- 2016
- [c1] Jonathan Frankle, Peter-Michael Osera, David Walker, Steve Zdancewic: Example-directed synthesis: a type-theoretic interpretation. POPL 2016: 802-815
- 2015
- [i1] Jonathan Frankle: Type-Directed Synthesis of Products. CoRR abs/1510.08121 (2015)