
Commit e290bde

deploy: 10a5ed9
1 parent 5965634 commit e290bde

193 files changed: +17116 −6153 lines


index.html

Lines changed: 6 additions & 0 deletions
@@ -144,20 +144,24 @@ <h4 id="-browse-papers-by-tag">🏷 Browse Papers by Tag</h4>
 <tag><a href="/tags.html#feature location">feature location</a></tag>
 <tag><a href="/tags.html#fuzzing">fuzzing</a></tag>
 <tag><a href="/tags.html#generalizability">generalizability</a></tag>
+<tag><a href="/tags.html#generation">generation</a></tag>
 <tag><a href="/tags.html#GNN">GNN</a></tag>
 <tag><a href="/tags.html#grammar">grammar</a></tag>
 <tag><a href="/tags.html#human evaluation">human evaluation</a></tag>
 <tag><a href="/tags.html#information extraction">information extraction</a></tag>
+<tag><a href="/tags.html#instruction tuning">instruction tuning</a></tag>
 <tag><a href="/tags.html#interpretability">interpretability</a></tag>
 <tag><a href="/tags.html#language model">language model</a></tag>
 <tag><a href="/tags.html#large language models">large language models</a></tag>
+<tag><a href="/tags.html#LLM">LLM</a></tag>
 <tag><a href="/tags.html#logging">logging</a></tag>
 <tag><a href="/tags.html#memorization">memorization</a></tag>
 <tag><a href="/tags.html#metrics">metrics</a></tag>
 <tag><a href="/tags.html#migration">migration</a></tag>
 <tag><a href="/tags.html#naming">naming</a></tag>
 <tag><a href="/tags.html#natural language generation">natural language generation</a></tag>
 <tag><a href="/tags.html#natural language processing">natural language processing</a></tag>
+<tag><a href="/tags.html#notebook">notebook</a></tag>
 <tag><a href="/tags.html#optimization">optimization</a></tag>
 <tag><a href="/tags.html#pattern mining">pattern mining</a></tag>
 <tag><a href="/tags.html#pretraining">pretraining</a></tag>
@@ -181,6 +185,8 @@ <h4 id="-browse-papers-by-tag">🏷 Browse Papers by Tag</h4>
 <tag><a href="/tags.html#topic modelling">topic modelling</a></tag>
 <tag><a href="/tags.html#traceability">traceability</a></tag>
 <tag><a href="/tags.html#Transformer">Transformer</a></tag>
+<tag><a href="/tags.html#Transformers">Transformers</a></tag>
+<tag><a href="/tags.html#translation">translation</a></tag>
 <tag><a href="/tags.html#types">types</a></tag>
 <tag><a href="/tags.html#variable misuse">variable misuse</a></tag>
 <tag><a href="/tags.html#verification">verification</a></tag>

paper-abstracts.json

Lines changed: 17 additions & 0 deletions
Large diffs are not rendered by default.

papers.html

Lines changed: 4015 additions & 3760 deletions
Large diffs are not rendered by default.
Lines changed: 1 addition & 1 deletion
@@ -1 +1 @@
-[["ling2016latent", "Latent Predictor Networks for Code Generation"], ["karpathy2015visualizing", "Visualizing and Understanding Recurrent Networks"], ["yin2017syntactic", "A Syntactic Neural Model for General-Purpose Code Generation"], ["xu2020incorporating", "Incorporating External Knowledge through Pre-training for Natural Language to Code Generation"]]
+[["yin2022natural", "Natural Language to Code Generation in Interactive Data Science Notebooks"], ["ling2016latent", "Latent Predictor Networks for Code Generation"], ["karpathy2015visualizing", "Visualizing and Understanding Recurrent Networks"], ["yin2017syntactic", "A Syntactic Neural Model for General-Purpose Code Generation"]]
Lines changed: 1 addition & 1 deletion
@@ -1 +1 @@
-[["xu2022systematic", "A Systematic Evaluation of Large Language Models of Code"], ["svyatkovskiy2020fast", "Fast and Memory-Efficient Neural Code Completion"], ["zhang2023repocoder", "RepoCoder: Repository-Level Code Completion Through Iterative Retrieval and Generation"], ["wei2023typet5", "TypeT5: Seq2seq Type Inference using Static Analysis"]]
+[["xu2022systematic", "A Systematic Evaluation of Large Language Models of Code"], ["shrivastava2023repofusion", "RepoFusion: Training Code Models to Understand Your Repository"], ["ding2023static", "A Static Evaluation of Code Completion by Large Language Models"], ["svyatkovskiy2020fast", "Fast and Memory-Efficient Neural Code Completion"]]
Lines changed: 1 addition & 1 deletion
@@ -1 +1 @@
-[["bhatia2016automated", "Automated Correction for Syntax Errors in Programming Assignments using Recurrent Neural Networks"], ["hellendoorn2018deep", "Deep Learning Type Inference"], ["wu2021prototransformer", "ProtoTransformer: A Meta-Learning Approach to Providing Student Feedback"], ["santos2018syntax", "Syntax and Sensibility: Using language models to detect and correct syntax errors"]]
+[["bhatia2016automated", "Automated Correction for Syntax Errors in Programming Assignments using Recurrent Neural Networks"], ["wu2021prototransformer", "ProtoTransformer: A Meta-Learning Approach to Providing Student Feedback"], ["hellendoorn2018deep", "Deep Learning Type Inference"], ["santos2018syntax", "Syntax and Sensibility: Using language models to detect and correct syntax errors"]]
Lines changed: 1 addition & 1 deletion
@@ -1 +1 @@
-[["feng2020codebert", "CodeBERT: A Pre-Trained Model for Programming and Natural Languages"], ["hellendoorn2015will", "Will they like this? Evaluating Code Contributions With Language Models"], ["raychev2016learning", "Learning Programs from Noisy Data"], ["saberi2023model", "Model-Agnostic Syntactical Information for Pre-Trained Programming Language Models"]]
+[["feng2020codebert", "CodeBERT: A Pre-Trained Model for Programming and Natural Languages"], ["hellendoorn2015will", "Will they like this? Evaluating Code Contributions With Language Models"], ["saberi2023model", "Model-Agnostic Syntactical Information for Pre-Trained Programming Language Models"], ["raychev2016learning", "Learning Programs from Noisy Data"]]
Lines changed: 1 addition & 0 deletions
@@ -0,0 +1 @@
+[["garg2022deepperf", "DeepPERF: A Deep Learning-Based Approach For Improving Software Performance"], ["wang2023codet5", "CodeT5+: Open Code Large Language Models for Code Understanding and Generation"], ["gupta2023grace", "Grace: Language Models Meet Code Edits"], ["li2023hitchhiker", "The Hitchhiker's Guide to Program Analysis: A Journey with Large Language Models"]]
Lines changed: 1 addition & 1 deletion
@@ -1 +1 @@
-[["shrivastava2020repository", "Repository-Level Prompt Generation for Large Language Models of Code"], ["ye2020leveraging", "Leveraging Code Generation to Improve Code Retrieval and Summarization via Dual Learning"], ["lomshakov2023fine", "Fine-Tuning Large Language Models for Answering Programming Questions with Code Snippets"], ["hindle2012naturalness", "On the Naturalness of Software"]]
+[["ye2020leveraging", "Leveraging Code Generation to Improve Code Retrieval and Summarization via Dual Learning"], ["shrivastava2020repository", "Repository-Level Prompt Generation for Large Language Models of Code"], ["hindle2012naturalness", "On the Naturalness of Software"], ["bui2021efficient", "Self-Supervised Contrastive Learning for Code Retrieval and Summarization via Semantic-Preserving Transformations"]]
Lines changed: 1 addition & 1 deletion
@@ -1 +1 @@
-[["fried2022incoder", "InCoder: A Generative Model for Code Infilling and Synthesis"], ["kocetkov2022stack", "The Stack: 3TB of permissively licensed source code"], ["mir2021manytypes4py", "ManyTypes4Py: A Benchmark Python Dataset for Machine Learning-based Type Inference"], ["bavarian2022efficient", "Efficient Training of Language Models to Fill in the Middle"]]
+[["li2023starcoder", "StarCoder: may the source be with you!"], ["fried2022incoder", "InCoder: A Generative Model for Code Infilling and Synthesis"], ["kocetkov2022stack", "The Stack: 3TB of permissively licensed source code"], ["mir2021manytypes4py", "ManyTypes4Py: A Benchmark Python Dataset for Machine Learning-based Type Inference"]]
