AnanyaKumar/transfer_learning

Framework code with wandb, checkpointing, logging, configs, and experimental protocols. Useful for fine-tuning pretrained models or training from scratch, and for evaluating them on a variety of datasets (transfer learning).
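The basic transfer-learning protocol the repo supports, train on a source task, then evaluate a head on frozen pretrained features, can be sketched in a few lines. This is an illustrative toy, not code from the repository: the "pretrained encoder" here is a fixed random projection, and all names and dimensions are made up.

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for a frozen pretrained encoder: a fixed random projection
# followed by a tanh nonlinearity. (The real repo uses pretrained networks.)
D_IN, D_FEAT = 20, 8
W_pre = rng.normal(size=(D_IN, D_FEAT))

def encode(x):
    """Frozen 'pretrained' feature extractor."""
    return np.tanh(x @ W_pre)

# Toy binary task: the label depends on the first input coordinate.
X_train = rng.normal(size=(200, D_IN))
y_train = (X_train[:, 0] > 0).astype(float)

# Linear probe: fit a linear head on frozen features by least squares,
# regressing onto +/-1 targets.
F = encode(X_train)
head, *_ = np.linalg.lstsq(F, 2 * y_train - 1, rcond=None)

# Evaluate the probe on held-out data from the same distribution.
X_test = rng.normal(size=(200, D_IN))
y_test = (X_test[:, 0] > 0).astype(float)
acc = np.mean((encode(X_test) @ head > 0) == (y_test == 1))
print(f"linear-probe accuracy: {acc:.2f}")
```

Swapping `X_test` for samples from a shifted distribution turns this into the out-of-distribution evaluation the repo's protocols automate.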

Languages: Jupyter Notebook, Python, Shell
As of 27 Apr 2024, the repository has 136 stars and 12 forks.

Code for our paper: Fine-Tuning can Distort Pretrained Features and Underperform Out-of-Distribution. Ananya Kumar, Aditi Raghunathan, Robbie Jones, Tengyu Ma, Percy Liang. Published as an oral at ICLR 2022.

This repository is still being improved and will be updated without backwards compatibility for now.

Setup and installation

The first time you run this project, create a virtualenv in the current directory (the one containing this README) and activate it:

python3 -m venv .env
source .env/bin/activate
pip...
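The paper's headline finding is that full fine-tuning can distort pretrained features, which motivates linear probing then fine-tuning (LP-FT): fit the head on frozen features first, then fine-tune everything. The effect can be illustrated with a toy least-squares model; this is not code from the repository, and the dimensions, learning rate, and variable names are all invented for the sketch.

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy model: predictions are X @ B @ v, where B plays the role of the
# pretrained featurizer and v the linear head. Illustrative only.
d, k, n = 10, 5, 100
B_pre = rng.normal(size=(d, k))          # "pretrained" featurizer
v_true = rng.normal(size=k)              # ground-truth head
X = rng.normal(size=(n, d))
y = X @ B_pre @ v_true + 0.1 * rng.normal(size=n)

def loss_grad(B, v):
    """Gradients of (1/2n)||X B v - y||^2 wrt featurizer B and head v."""
    r = X @ B @ v - y                    # residuals
    gB = X.T @ np.outer(r, v) / n
    gv = (X @ B).T @ r / n
    return gB, gv

lr, steps = 0.01, 200

# (a) Full fine-tuning from a randomly initialized head.
B_ft, v_ft = B_pre.copy(), rng.normal(size=k)
for _ in range(steps):
    gB, gv = loss_grad(B_ft, v_ft)
    B_ft -= lr * gB
    v_ft -= lr * gv

# (b) LP-FT: linear-probe first (B frozen), then fine-tune both.
v_lp, *_ = np.linalg.lstsq(X @ B_pre, y, rcond=None)
B_lpft, v_lpft = B_pre.copy(), v_lp.copy()
for _ in range(steps):
    gB, gv = loss_grad(B_lpft, v_lpft)
    B_lpft -= lr * gB
    v_lpft -= lr * gv

# With a well-initialized head, early residuals are small, so the gradient
# on B is small and the pretrained features are distorted less.
dist_ft = np.linalg.norm(B_ft - B_pre)
dist_lpft = np.linalg.norm(B_lpft - B_pre)
print(f"feature distortion  FT: {dist_ft:.3f}  LP-FT: {dist_lpft:.3f}")
```

In this toy run the LP-FT featurizer stays closer to `B_pre` than the fully fine-tuned one, mirroring the paper's argument that a badly initialized head drives large, feature-distorting updates early in fine-tuning.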