Top Related Projects
Simple, Pythonic, text processing--Sentiment analysis, part-of-speech tagging, noun phrase extraction, translation, and more.
💫 Industrial-strength Natural Language Processing (NLP) in Python
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
Stanford NLP Python library for tokenization, sentence segmentation, NER, and parsing of many human languages
Quick Overview
The MLNLP-World/Paper-Writing-Tips repository is a comprehensive collection of resources and guidelines for writing academic papers in the fields of Machine Learning (ML) and Natural Language Processing (NLP). It aims to help researchers and students improve their paper writing skills by providing tips, templates, and best practices.
Pros
- Offers a wide range of tips covering various aspects of paper writing, from structure to language
- Includes templates and examples for different sections of academic papers
- Regularly updated with contributions from the community
- Provides guidance specific to ML and NLP fields
Cons
- May not cover all specific requirements for every conference or journal
- Some tips might be subjective or not universally applicable
- Lacks interactive elements or tools for direct implementation of the tips
- Could benefit from more extensive examples of well-written papers
As this is not a code library, we'll skip the code examples and getting started instructions sections.
Competitor Comparisons
Simple, Pythonic, text processing--Sentiment analysis, part-of-speech tagging, noun phrase extraction, translation, and more.
Pros of TextBlob
- Provides a simple API for common natural language processing (NLP) tasks
- Includes built-in models for sentiment analysis and part-of-speech tagging
- Offers easy-to-use text processing functions like noun phrase extraction and word inflection
Cons of TextBlob
- Limited to basic NLP tasks, not suitable for advanced research or complex language models
- May not be as up-to-date with the latest NLP techniques compared to more specialized libraries
- Lacks specific features for academic paper writing or formatting
Code Comparison
TextBlob:
from textblob import TextBlob
text = "TextBlob is simple to use."
blob = TextBlob(text)
print(blob.sentiment)
Paper-Writing-Tips:
# Title of Your Paper
## Abstract
Your abstract goes here.
## Introduction
Start your introduction...
While TextBlob focuses on providing code for NLP tasks, Paper-Writing-Tips offers markdown templates and guidelines for academic paper structure. The repositories serve different purposes, with TextBlob being a practical NLP tool and Paper-Writing-Tips being a resource for improving academic writing skills.
💫 Industrial-strength Natural Language Processing (NLP) in Python
Pros of spaCy
- Comprehensive NLP library with production-ready capabilities
- Extensive documentation and community support
- Optimized for performance and efficiency in processing large volumes of text
Cons of spaCy
- Steeper learning curve for beginners compared to Paper-Writing-Tips
- Focused on NLP tasks rather than academic writing guidance
- Requires more computational resources and setup
Code Comparison
Paper-Writing-Tips (no code examples available)
spaCy:
import spacy
nlp = spacy.load("en_core_web_sm")
doc = nlp("This is a sample sentence.")
for token in doc:
    print(token.text, token.pos_, token.dep_)
Summary
While Paper-Writing-Tips is a collection of guidelines for academic writing, spaCy is a full-fledged NLP library. Paper-Writing-Tips offers valuable advice for researchers and students, whereas spaCy provides tools for text processing and analysis. The choice between them depends on whether you need writing guidance or NLP capabilities.
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
Pros of transformers
- Comprehensive library for state-of-the-art NLP models
- Extensive documentation and community support
- Regularly updated with new models and features
Cons of transformers
- Steeper learning curve for beginners
- Larger codebase and dependencies
- Focused on model implementation rather than research writing
Code comparison
transformers:
from transformers import AutoTokenizer, AutoModel
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
Paper-Writing-Tips:
# Tips for Writing ML/NLP Papers
1. Start with a clear outline
2. Use concise and precise language
3. Include relevant visualizations
The transformers repository provides a powerful toolkit for working with NLP models, while Paper-Writing-Tips offers guidance on academic writing in the ML/NLP field. transformers is more code-focused, providing implementations of various models, while Paper-Writing-Tips is a collection of markdown files with writing advice. The code examples reflect this difference, with transformers showing model usage and Paper-Writing-Tips presenting markdown-formatted tips.
Stanford NLP Python library for tokenization, sentence segmentation, NER, and parsing of many human languages
Pros of Stanza
- Comprehensive NLP toolkit with support for multiple languages
- Well-documented API and extensive examples for easy integration
- Actively maintained with regular updates and improvements
Cons of Stanza
- Focused on NLP tasks, not specifically tailored for academic paper writing
- Steeper learning curve for non-technical users
- Requires more computational resources due to its comprehensive nature
Code Comparison
Paper-Writing-Tips is primarily a collection of markdown files with writing advice, so there's no relevant code to compare. However, here's a sample of how to use Stanza for basic NLP tasks:
import stanza
nlp = stanza.Pipeline('en')
doc = nlp("Hello world!")
for sentence in doc.sentences:
    print([word.text for word in sentence.words])
Summary
Stanza is a powerful NLP toolkit suitable for various language processing tasks, while Paper-Writing-Tips is a curated collection of advice for academic writing. Stanza offers more technical capabilities but requires programming knowledge, whereas Paper-Writing-Tips provides accessible guidance for improving writing skills without any coding requirements.
Paper Writing Tips
Project Motivation
Many beginners make the same small mistakes when drafting a paper. To save everyone time and help you locate these small problems quickly, this project collects the experience from our own paper writing together with that of the teachers and students around us. We hope it is helpful; our own level is limited, so please forgive any omissions. Thank you.
Features of this project:
- Writing Essentials: common mistakes, each paired with an example, that you can skim quickly before starting to write.
- **Final-Draft Checklist**: examples that make it easy to check quickly whether your own paper contains a given mistake.
- Voices from the Community: a collection of publicly available writing resources (not exhaustive; additions welcome) for systematic study.
Disclaimer
All techniques listed in this project are for reference only and are not guaranteed to be correct. The focus is on top-conference papers, and actual requirements should always take precedence. Good technique may keep your writing free of obvious slips, but an excellent paper still needs repeated polishing. All content comes solely from the authors' personal experience, public internet sources, the day-to-day research practice of our team, and the advice of senior colleagues around us. For any problem, feel free to open an Issue or a PR. The badges used in this project come from the internet; if one infringes your image copyright, please contact us and we will remove it. Thank you.
Contributions Welcome
Paper Writing Tips is an ongoing project; omissions are inevitable, and PRs and issue discussions of any kind are welcome.
Note
In what follows, items marked "Attention" are suggestions that the organizers currently consider (fairly) clearly debatable.
Writing Essentials
Formulas and Symbols
1. Use lowercase Latin letters for scalar symbols
- Key point: to avoid confusing the letter l with the digit 1, l can be replaced by \ell.
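A minimal LaTeX sketch of the substitution (the subscript is illustrative):

```latex
% The letter l is easily mistaken for the digit 1
$l_1$-norm      % ambiguous at a glance
$\ell_1$-norm   % \ell is unambiguous
```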
2. Use \boldsymbol for structured values (Attention)
- Key point: structured values include sentence sequences, trees, and graphs (the repository's accompanying figure illustrates only the sentence-sequence case).
3. A set of \boldsymbol values can use \mathcal (Attention)
4. Vectors are lowercase bold; matrices are uppercase bold
- Key point: use \mathbf for Latin letters and \boldsymbol for Greek letters.
5. Use \mathbb for number sets such as the reals and the integers
6. Keep the symbols for elements and their sets in correspondence
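The notation conventions above can be collected into one sketch (all symbol names here are illustrative):

```latex
$a, b$                                    % scalars: lowercase Latin letters
$\mathbf{x}, \mathbf{W}$                  % vector (lowercase bold) and matrix (uppercase bold) via \mathbf
$\boldsymbol{\alpha}$                     % bold Greek letters via \boldsymbol
$\boldsymbol{x} \in \mathcal{X}$          % a structured value and the set it belongs to
$\mathbf{W} \in \mathbb{R}^{m \times n}$  % number sets via \mathbb
```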
7. Keep the writing style formal; avoid abbreviations
- Contractions such as don't should be expanded to do not.
- The possessive 's should be rewritten with of wherever possible.
8. Latin terms
- e.g., stands for for example,
- i.e., stands for that is,
- et al. stands for and others of the same kind,
- etc. stands for and others, (not used when listing people; use et al. there)
- When et al. or etc. ends a sentence, do not add an extra period.
9. English quotation marks
- Use `` and '' (the key positions were shown in the repository's figure) for left and right quotation marks respectively, never other symbols or Chinese quotation marks.
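In LaTeX source the rule looks like this (the sentence is illustrative):

```latex
We call this the ``teacher'' model.   % correct: `` and '' give proper curly quotes
We call this the "teacher" model.     % wrong: straight double quotes render badly
```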
10. The non-breaking space "~"
- Use ~ for a non-breaking space; it prevents unexpected line breaks, for example:
Figure~\ref{} shows the model performance.
Table~\ref{} shows dataset details.
We use BERT~\cite{bert} model.
Section~\ref{} concludes this paper.
11. URL links
- Use the \url{} command; it requires loading the package:
\usepackage{hyperref}
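Usage might look like this (the address is a placeholder):

```latex
% preamble
\usepackage{hyperref}
% body: \url typesets the address in a suitable font and makes it clickable
Our code is available at \url{https://example.com/repo}.
```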
12. Quotation marks convey "so-called", not quotation (Attention)
- For quoted expressions, consider italics via \textit{} rather than quotation marks.
13. Variable names of more than one letter
- Variables or symbols longer than one letter in formulas, such as softmax, proj, or enc, should be set in a text font, i.e. with the \textrm or \textit command.
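A small sketch of the difference (the formula itself is illustrative):

```latex
$p = \textrm{softmax}(W h)$   % upright: reads as one operator name
$p = softmax(W h)$            % italic math: reads as the product s*o*f*t*m*a*x
```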
14. 使ç¨å½æ°å½ä»¤
许å¤å½æ°åç¬¦å·æç°æçå½ä»¤ï¼ä¾å¦ï¼\arg{}
ï¼\max{}
ï¼\sin{}
ï¼\tanh{}
ï¼\inf
ï¼ \det{}
ï¼ \exp{}
.
15. Brackets in formulas should be marked with \left and \right
- E.g. \left( \right), \left\{ \right\}, \left< \right>, \left| \right|, and so on.
- A divider inside brackets is produced with \middle.
- The LaTeX code:
\begin{gather}
  \bold{s} = \left(\sum_{i=0}^{N-1}{\alpha_{i} \bold{h}_i}\right) + \bold{h}_N \\
  \bold{s} = (\sum_{i=0}^{N-1}{\alpha_{i} \bold{h}_i}) + \bold{h}_N
\end{gather}
\begin{gather}
  \left\{ x \middle| x\ne\frac{1}{2}\right\} \\
  \{ x | x\ne\frac{1}{2}\}
\end{gather}
16. Use align for a group of equations, aligned at the equals signs
- The LaTeX code:
\begin{gather}
  E = m c^2 \\
  C = B \log_2\left(1+\frac{S}{N}\right)
\end{gather}
\begin{align}
  E &= m c^2 \\
  C &= B \log_2\left(1+\frac{S}{N}\right)
\end{align}
17. Number only the equations you refer to (Attention)
- Recommended: number only referenced equations; remove a number with \nonumber.
- The LaTeX code:
\begin{equation}
  E = m c^2
\end{equation}
\begin{equation}
  E = m c^2 \nonumber
\end{equation}
Tables and Figures
18. 使ç¨Booktabsç»å¶æ´å¥½ççè¡¨æ ¼
-
ç»å¶è¡¨æ ¼æ¶ï¼ä½¿ç¨ \usepackage{booktabs}ï¼ä»èåå© \toprule, \bottomrule, \midrule, \cmidrule å½ä»¤ï¼ç»åºå¥½ççåé线ã
-
Latex代ç å¦ä¸ï¼
% Example of a table with booktabs from https://nhigham.com/2019/11/19/better-latex-tables-with-booktabs/. % First version of table. \begin{table}[htbp] \centering \begin{tabular}{|l|c|c|c|c|c|l|} \hline & \multicolumn{3}{c|}{E} & \multicolumn{3}{c|}{F}\\ \hline & $mv$ & Rel.~err & Time & $mv$ & Rel.~err & Time \\\hline A & 11034 & 1.3e-7 & 3.9 & 15846 & 2.7e-11 & 5.6 \\ B & 21952 & 1.3e-7 & 6.2 & 31516 & 2.7e-11 & 8.8 \\ C & 15883 & 5.2e-8 & 7.1 & 32023 & 1.1e-11 & 1.4 \\ D & 11180 & 8.0e-9 & 4.3 & 17348 & 1.5e-11 & 6.6 \\ \hline \end{tabular} \caption{Without booktabs.} \label{tab:without-booktabs} \end{table} % Second version of table, with booktabs. \begin{table}[htbp] \centering \begin{tabular}{lcccccl}\toprule & \multicolumn{3}{c}{E} & \multicolumn{3}{c}{F} \\\cmidrule(lr){2-4}\cmidrule(lr){5-7} & $mv$ & Rel.~err & Time & $mv$ & Rel.~err & Time\\\midrule A & 11034 & 1.3e-7 & 3.9 & 15846 & 2.7e-11 & 5.6 \\ B & 21952 & 1.3e-7 & 6.2 & 31516 & 2.7e-11 & 8.8 \\ C & 15883 & 5.2e-8 & 7.1 & 32023 & 1.1e-11 & 1.4\\ D & 11180 & 8.0e-9 & 4.3 & 17348 & 1.5e-11 & 6.6 \\\bottomrule \end{tabular} \caption{With booktabs.} \label{tab:with-booktabs} \end{table}
19. Referencing sections, tables, and figures
- Define sections, tables, and figures with \label{...}, then reference them with \ref{...} for automatic, clickable cross-references.
- A subfigure can be referenced as Figure~\ref{fig:figure}(a).
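A minimal sketch (the label and file names are illustrative):

```latex
\begin{figure}
  \centering
  \includegraphics{model.pdf}   % illustrative file name
  \caption{Model overview.}
  \label{fig:model}
\end{figure}

As Figure~\ref{fig:model}(a) shows, the encoder processes the input first.
```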
20. Do not restate a figure or table caption in the main text
- The caption says what the table is.
- The body text says what the table shows.
21. "Three-line table" advice: avoid vertical rules wherever possible (Attention)
22. Adjusting table size
- Center with \centering; adjust the font size with \small, \scriptsize, \footnotesize, or \tiny
- Adjust column spacing with \setlength{\tabcolsep}{8pt}
- Fix a column width with p{2cm}
- Merge cells with \multirow and \multicolumn
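These adjustments combine roughly like so (the table content is illustrative; \multirow needs the multirow package):

```latex
\usepackage{multirow}           % in the preamble, for \multirow

\begin{table}[t]
  \centering
  \small                        % shrink the font (\footnotesize, \scriptsize, \tiny also work)
  \setlength{\tabcolsep}{4pt}   % tighten column spacing
  \begin{tabular}{l p{2cm} c}   % p{2cm} fixes the middle column's width
    \multirow{2}{*}{Model} & \multicolumn{2}{c}{Result} \\
                           & Dev & Test \\
    A                      & 1.0 & 2.0 \\
  \end{tabular}
\end{table}
```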
23. Vector graphics: images should be vector graphics (e.g. PDF)
- Draw with software such as Adobe Illustrator or OmniGraffle and export as a vector image
- Draw with Matplotlib and save via plt.savefig('draw.pdf')
- Draw directly in LaTeX with pgfplots
24. The font size inside a figure should fall between the body-text size and the caption size
- Keep font sizes consistent within a figure
25. Text annotations in figures should be roughly the same size as the body text
- Text inside a figure should not be too large
26. Design figures to survive black-and-white printing
- Be grayscale-friendly: never make color the only feature distinguishing lines; combine solid/dashed strokes, light/dark shades, different line styles, and so on.
27. Keep figure style clean and attractive
- Do not use too many different colors, and avoid overly bright ones
- Prefer concise diagrams and minimize textual description (legends excepted)
- Use one consistent format for modules with the same function
- Arrows should tend to flow in a single direction
Wording and Diction
28. Mind the part of speech of hyphenated compounds
- In general, if the last word of a hyphenated compound is a noun, the compound acts as an adjective;
- if the last word is a verb, the compound acts as a verb.
29. Easily-misjudged parts of speech
- First, Secondly: both adverbs
- training, test, validation: all nouns
30. Abbreviations and conventional usage
- Follow convention and stay consistent with the original authors: CNN, LSTM, FEVER, ConceptNet, SQuAD, BiDAF, FEVER score, Wikipedia.
- On first occurrence, give the full name first and the abbreviation after (or the abbreviation first, followed by a citation as annotation): graph attention network (GAT), pre-trained language model (PLM), BERT~\citep{BERT}.
- Field names, task names, metrics, and the like generally need no capitalization, e.g. natural language processing, question answering, accuracy, macro-F1 score.
31. Mind singular and plural forms
- Especially irregular plurals and uncountable nouns.
32. The choice between a and an follows the vowel sound
33. Using the
- Note: a singular countable noun generally does not stand alone without an article; either add the for a specific referent, or use the plural for a generic one.
34. Tense: mainly the simple present (Attention)
35. Avoid absolute statements.
- Use straightforward instead of obvious
- Use generally, usually, or often instead of always
- Use rare instead of never
- Use alleviate or relieve instead of avoid or eliminate
36. Avoid vague wording such as meaning, semantic, or better.
Taking better as an example: when claiming something is better, it is not enough to just say so; give the corresponding explanation and reasons.
Sentence Construction
37. Avoid overusing pronouns (it, they, etc.); abbreviated model names are hardly longer and are clearer.
38. Avoid merely sticking on labels, for example when arguing that something works well.
Where exactly does the proposed method make a difference, and what causes this result?
39. One sentence, one point. Prefer simple sentences and use long compound sentences sparingly.
40. Do not mix up observations/findings, hypotheses, methods, and effects.
Paragraph Layout
41. If the last line of a paragraph fills less than a quarter of the line, consider deleting or adding words. (Attention)
- Optional: try adding \looseness=-1 at the end of the paragraph; sometimes this squeezes the stray words of the last line up a line without deleting any text.
References
42. Check whether each citation functions as a sentence constituent
- Key point: use \citep{} for parenthetical citations, and \citet{} when the citation is a main constituent of the sentence, such as the subject or object.
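For instance, under an ACL-style (natbib-compatible) template, with a hypothetical bib key bert:

```latex
Pre-training improves accuracy~\citep{bert}.   % renders as: ... accuracy (Author et al., Year).
\citet{bert} propose a bidirectional encoder.  % renders as: Author et al. (Year) propose ...
```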
43. Cite the published version rather than the arXiv version whenever possible.
- It looks more professional.
44. Keep the format of citation entries consistent
- E.g. whether conference names are abbreviated and whether the time and venue are included should be consistent across all references.
Final-Draft Checklist
Check: one week (or one day) before the deadline
On scientific English writing conventions:
1. Spell-checking software can help find typos and unidiomatic expressions.
2. Do not use contractions such as didn't, can't, don't, isn't, or aren't; never contract with an apostrophe. Avoid the possessive 's entirely; to express a similar meaning, use of. Avoid quotation marks as far as possible.
3. Define every abbreviation (model names, terms, etc.) at the very first place it is used.
4. Keep model-name capitalization consistent, e.g. BERT and ELECTRA; avoid mixing Bert, Electra, and electra.
5. Consider italics for example sentences and examples.
6. Turning \begin{itemize}\item lists into ordinary paragraphs (with a hand-placed bullet $\bullet$ at the start of each) can make the page more compact; wasting too much space may invite accusations of padding.
7. Footnote placement: in general, a footnote goes after the first non-opening punctuation mark (i.e. not a left quotation mark or left bracket) at the relevant spot, with no space between the \footnote command and the punctuation before it.
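A sketch of the placement rule (the sentence and footnote text are illustrative):

```latex
We use the standard split,\footnote{Statistics are given in the appendix.} and ...
% wrong: We use the standard split \footnote{...}, and ...
% (space before the footnote, punctuation after it)
```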
8. The difference between a and an lies in pronunciation: an LSTM cell, an F/H/L/M/N/S/X, a U.
9. Keep the capitalization style of headings consistent across levels, e.g. capitalize every word's initial, or only the first word's.
10. Use babel for syllable-based hyphenation patterns at line breaks: \usepackage[english]{babel}
On figures:
11. Fonts inside figures should be uniform and match the body-text size.
12. Avoid blank space at the sides of a figure; keep it compact.
13. Figures usually go at the top or middle of a page, not at the bottom.
14. Modules of the same type should stay within one color family; use a single color per type for the fill or the border.
15. Within one color family, a darker or brighter module signals that it is more important. If that is not the intent, balance the color assignment across modules, e.g. keep grayscale values roughly equal.
16. Do not use too many different colors in one figure; preferably no more than about five.
17. Use vector graphics for figures.
18. A figure exists to be more intuitive and more concise than text; organize it with sensible graphic elements as much as possible rather than heavy textual labeling, and finish it with a minimal, unified, consistent set of settings, which rarely looks bad.
19. Down to line styles and colors: first, keep them uniform (lower descriptive complexity); second, graphic elements sharing a meaning or category should use similar or identical line styles and colors (cognitive intuitiveness).
20. Keep arrow directions as uniform as possible and avoid back-and-forth turns. In flowcharts, avoid isolated components (arrow markers with no source or destination).
On citations:
21. Choosing the citation command:
- Parenthetical citations: use \cite.
- Citations within the text:
  - ACL/NAACL/EMNLP templates use \citet{...}
  - COLING templates use \newcite{...}
  - AAAI/IJCAI templates use \citeauthor{...} \shortcite{...}
  - IEEE templates: \citeauthor{...}~(\citeyear{...})
- Effect: (Zhang et al. 2020) vs. Zhang et al. (2020)
22. If space is tight, you may abbreviate conference and journal names in citations.
- See the tool SimBiber.
23. When managing your bib file, keep venue names consistently either in full or abbreviated; check years, volumes, page numbers, and so on; do not rely entirely on the information Google Scholar provides (it may be incomplete or badly formatted).
- See the tool Rebiber.
24. Define sections, tables, and figures with \label, then reference them with \ref for automatic cross-references.
25. Leave a space between a citation and the preceding text; do not attach it directly to the final letter.
26. Do not cite different versions of the same paper, e.g. both the arXiv preprint and the formal conference version.
On formulas:
27. A formula is part of its sentence, so commas and periods can be placed inside it, especially in multi-line displays.
28. For the text after a formula: if it continues the sentence that the formula completes, do not capitalize its first letter and let it follow the formula directly; if it starts a new sentence or paragraph, break the line after the closing \end, capitalize the first letter, and begin the new sentence.
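Both cases in one sketch:

```latex
\begin{equation}
  E = m c^2 ,   % comma: the sentence continues after the display
\end{equation}
where $m$ is the mass and $c$ the speed of light.

\begin{equation}
  E = m c^2 .   % period: the sentence ends with the formula
\end{equation}
The next sentence then starts with a capital letter.
```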
Pre-submission checks:
29. Check that the paper is anonymous and contains no personal or institutional information.
30. Check whether the paper exceeds the page limit (so you are not frantically resizing figures and tables at the last moment).
31. Check that the title and abstract match what you enter in the submission system's form fields.
32. Check the submitted data and code: they must not contain personal or institutional information; in particular, hard-coded model or data paths in the code need to be removed, and watch out for hidden folders such as .git.
33. Overleaf can be slow to reach right before some conference deadlines; keep a LaTeX backup.
34. Number the historical versions of the paper by time, to avoid submitting something other than the final version.
35. Submit a version of the paper and its appendix one day before the deadline, in case the server crashes at deadline time.
36. After submission, keep watching the conference website and your registration email so you hear promptly about any deadline extension.
Voices from the Community
- Prof. Yang Liu (Tsinghua University), CWMT-2014 talk: methods and techniques for writing machine-translation research papers
- Prof. Xipeng Qiu (Fudan University), CCL-2018 talk: how to write a research paper end to end
- Prof. Zhiyuan Liu (Tsinghua University): how to write a competent NLP paper
- Prof. Xin Zhao (Renmin University of China): how to write a good international academic paper as a beginner
- Prof. Wanxiang Che (Harbin Institute of Technology): how to give an excellent academic talk
- The Chinese University of Hong Kong, Shenzhen: curated writing advice & rebuttal template
- Prof. Whitesides (Harvard University): how to write an academic paper, taught from the perspective of outlining
- Prof. Graham Neubig (Carnegie Mellon University): How to Read/Write an International Conference Paper & paper style guide
- Simon Peyton Jones (MSR researcher): How to Write a Great Research Paper
- Jindong Wang (MSRA researcher): sharing experience on writing papers in LaTeX
- Dr. Yijia Liu (Harbin Institute of Technology), NLPCC-2018 talk: readability principles in paper writing
- Dr. Zhuosheng Zhang (Shanghai Jiao Tong University): curated writing advice
- Dr. Freda Shi (TTIC): how to write AI papers elegantly (in TeX)
- A tool-recommendation post: 11 handy research tools for maximum productivity
- Chris Dyer and colleagues (DeepMind): advice on formulas in NLP papers
- Bo Chang (Google Brain): LaTeX Tips
- How to make an abstract attractive? A Nature abstract template worth keeping
- An important reference on notation
- acl-org/aclpubcheck: Tools for checking ACL paper submissions
- dspinellis/latex-advice: Advice for writing LaTeX documents
- Graham Neubig: How to Read/Write an International Conference Paper
- Karl Whelan: Writing Tips for PhD Theses
- Karl Whelan: Tips for Preparing and Publishing Research Papers
Organizers
Thanks to the following members for organizing and guiding this project.
Contributors
Thanks to the following members for their support of and contributions to this project.