Reported by Mo Wang

ACL 2020 Accepted Paper List Released, Acceptance Rate 25.2% — Did You Make the Cut?

ACL 2020, a top conference in natural language processing, will be held online from July 5 to 10. The acceptance notifications went out some time ago, but the organizers did not release the full paper list at that time. The list of accepted papers has now been published, so let's take a look at which papers made it in.

ACL received 3,088 submissions this year, a slight increase over last year's 2,906. ACL 2020 accepted 779 papers in total — 571 long papers and 208 short papers — for an acceptance rate of 25.2%.

Many familiar names appear in the accepted paper list:

Christopher D. Manning (Professor at Stanford University and Director of the Stanford AI Lab):

  • Finding Universal Grammatical Relations in Multilingual BERT

  • Optimizing the Factual Correctness of a Summary: A Study of Summarizing Radiology Reports

  • Syn-QG: Syntactic and Shallow Semantic Rules for Question Generation

Yoshua Bengio (Canadian computer scientist and Professor at the Université de Montréal):

  • Exploiting Syntactic Structure for Better Language Modeling: A Syntactic Distance Approach

Yoav Goldberg (Senior Lecturer in the Department of Computer Science at Bar-Ilan University, Israel):

  • A Formal Hierarchy of RNN Architectures

  • Null It Out: Guarding Protected Attributes by Iterative Nullspace Projection

  • Simple, Interpretable and Stable Method for Detecting Words with Usage Change across Corpora

  • Unsupervised Domain Clusters in Pretrained Language Models

  • A Two-Stage Masked LM Method for Term Set Expansion

  • Towards Faithfully Interpretable NLP Systems: How should we define and evaluate faithfulness?

Noah A. Smith (Professor of Computer Science and Engineering at the University of Washington):

  • A Formal Hierarchy of RNN Architectures

  • A Mixture of h − 1 Heads is Better than h Heads

  • Don't Stop Pretraining: Adapt Language Models to Domains and Tasks

  • Improving Transformer Models by Reordering their Sublayers

  • Social Bias Frames: Reasoning about Social and Power Implications of Language

  • The Right Tool for the Job: Matching Model and Instance Complexities

  • Recollection versus Imagination: Exploring Human Memory and Cognition via Neural Language Models

Percy Liang (Associate Professor of Computer Science at Stanford University and member of the Stanford AI Lab):

  • Robust Encodings: A Framework for Combating Adversarial Typos

  • Selective Question Answering under Domain Shift

  • Enabling Language Models to Fill in the Blanks

  • ExpBERT: Representation Engineering with Natural Language Explanations

  • Shaping Visual Representations with Language for Few-Shot Classification

Sebastian Ruder (Research Scientist at DeepMind):

  • A Call for More Rigor in Unsupervised Cross-lingual Learning

  • On the Cross-lingual Transferability of Monolingual Representations

Ming Zhou (Assistant Managing Director of Microsoft Research Asia and President of the Association for Computational Linguistics (ACL)):

  • A Graph-based Coarse-to-fine Method for Unsupervised Bilingual Lexicon Induction

  • Curriculum Pre-training for End-to-End Speech Translation

  • Document Modeling with Graph Attention Networks for Multi-grained Machine Reading Comprehension

  • Evidence-Aware Inferential Text Generation with Vector Quantised Variational AutoEncoder

  • Graph Neural News Recommendation with Unsupervised Preference Disentanglement

  • Improving Neural Machine Translation with Soft Template Prediction

  • LogicalFactChecker: Leveraging Logical Operations for Fact Checking with Graph Module Network

  • MIND: A Large-scale Dataset for News Recommendation

  • MuTual: A Dataset for Multi-Turn Dialogue Reasoning

  • Reasoning Over Semantic-Level Graph for Fact Checking

  • A Retrieve-and-Rewrite Initialization Method for Unsupervised Machine Translation

  • A Simple and Effective Unified Encoder for Document-Level Machine Translation

Tie-Yan Liu (Assistant Managing Director of Microsoft Research Asia):

  • A Study of Non-autoregressive Model for Sequence Generation

  • SEEK: Segmented Embedding of Knowledge Graphs

  • SimulSpeech: End-to-End Simultaneous Speech to Text Translation

Qun Liu (Chief Scientist of Speech and Language Computing at Huawei Noah's Ark Lab):

  • Perturbed Masking: Parameter-free Probing for Analyzing and Interpreting BERT

  • Probabilistically Masked Language Model Capable of Autoregressive Generation in Arbitrary Word Order

  • Word-level Textual Adversarial Attacking as Combinatorial Optimization

Chengqing Zong (Research Professor at the Institute of Automation, Chinese Academy of Sciences):

  • Attend, Translate and Summarize: An Efficient Method for Neural Cross-Lingual Summarization

Maosong Sun (Professor in the Department of Computer Science and Technology, Tsinghua University):

  • Continual Relation Learning via Episodic Memory Activation and Reconsolidation

  • Fine-grained Fact Verification with Kernel Graph Attention Network

  • How Does NLP Benefit Legal System: A Summary of Legal Artificial Intelligence

  • Word-level Textual Adversarial Attacking as Combinatorial Optimization

Zhiyuan Liu (Associate Professor in the Department of Computer Science and Technology, Tsinghua University):

  • Continual Relation Learning via Episodic Memory Activation and Reconsolidation

  • Expertise Style Transfer: A New Task Towards Better Communication between Experts and Laymen

  • Fine-grained Fact Verification with Kernel Graph Attention Network

  • Grounded Conversation Generation as Guided Traverses in Commonsense Knowledge Graphs

  • How Does NLP Benefit Legal System: A Summary of Legal Artificial Intelligence

  • Word-level Textual Adversarial Attacking as Combinatorial Optimization

  • MOOCCube: A Large-scale Data Repository for NLP Applications in MOOCs

Minlie Huang (Associate Professor in the Department of Computer Science and Technology, Tsinghua University):

  • A Self-Training Method for Machine Reading Comprehension with Soft Evidence Extraction

  • KdConv: A Chinese Multi-domain Dialogue Dataset Towards Multi-turn Knowledge-driven Conversation

  • Multi-Agent Task-Oriented Dialog Policy Learning with Role-Aware Reward Decomposition

Xiaojun Wan (Research Professor at the Institute of Computer Science and Technology, Peking University):

  • Automatic Generation of Citation Texts in Scholarly Papers: A Pilot Study

  • Heterogeneous Graph Transformer for Graph-to-Sequence Learning

  • Jointly Learning to Align and Summarize for Neural Cross-Lingual Summarization

  • Learning to Ask More: Semi-Autoregressive Sequential Question Generation under Dual-Graph Interaction

  • Multi-Granularity Interaction Network for Extractive and Abstractive Multi-Document Summarization

  • Semantic Parsing for English as a Second Language

  • Multimodal Transformer for Multimodal Machine Translation

Xipeng Qiu (Professor in the School of Computer Science, Fudan University):

  • Extractive Summarization as Text Matching

  • Heterogeneous Graph Neural Networks for Extractive Document Summarization

  • Improving Image Captioning with Better Use of Caption

  • FLAT: Chinese NER Using Flat-Lattice Transformer

Song Han (Assistant Professor in the Department of Electrical Engineering and Computer Science, MIT):

  • HAT: Hardware-Aware Transformers for Efficient Natural Language Processing

We welcome readers with papers accepted to ACL 2020 to leave a comment, and Synced (机器之心) will continue to recommend more high-quality papers.

For the full ACL 2020 accepted paper list, see: https://acl2020.org/program/accepted/#long-papers

Related Profiles

Song Han

Song Han received his PhD from the Department of Electrical Engineering at Stanford University in 2017, advised by Professor Bill Dally, Chief Scientist of NVIDIA. His research spans deep learning and computer architecture: his Deep Compression model-compression technique won the Best Paper Award at ICLR 2016, and his ESE sparse neural network inference engine won the Best Paper Award at FPGA 2017, both with far-reaching industry impact. His research results have been widely adopted at NVIDIA, Google, and Facebook. During his PhD he founded DeePhi Tech, and he is now an Assistant Professor in the Department of Electrical Engineering and Computer Science at MIT.

Zhiyuan Liu

Zhiyuan Liu is an Associate Professor and doctoral advisor in the Department of Computer Science and Technology at Tsinghua University. His main research interests are representation learning, knowledge graphs, and social computing. He received his PhD from Tsinghua University in 2011 and has published more than 60 papers at leading international AI journals and conferences such as ACL, IJCAI, and AAAI, with over 2,100 citations on Google Scholar. He leads several projects funded by the National Natural Science Foundation of China. His honors include Tsinghua University's Outstanding Doctoral Dissertation Award, the Chinese Association for Artificial Intelligence's Outstanding Doctoral Dissertation Award, Tsinghua University's Outstanding Postdoctoral Fellow Award, and the Chinese Information Processing Society's Youth Innovation Award, and he was selected for the Young Elite Scientists Sponsorship Program of the China Association for Science and Technology and the CCF-Intel Young Faculty Researcher Program. He serves as an executive member and deputy director of the Youth Working Committee of the Chinese Information Processing Society, a member and secretary of its Social Media Processing Technical Committee, a young associate editor of the SCI journal Frontiers of Computer Science, and an area chair for ACL, COLING, and IJCNLP.

Tie-Yan Liu

Tie-Yan Liu received his PhD from the Department of Electronic Engineering at Tsinghua University. He is a Principal Researcher at Microsoft Research Asia, where he leads the research group on Internet economics and computational advertising. He is a senior member of the ACM, the IEEE, and the China Computer Federation (CCF), and an adjunct professor at the University of Science and Technology of China and Nankai University.

Maosong Sun

Maosong Sun is a professor and doctoral advisor. He previously served as chair of the Department of Computer Science and Technology at Tsinghua University, and is currently Deputy Director of the Ministry of Education's Research Center for Online Education, Party Secretary of the Department of Computer Science and Technology at Tsinghua University, and Director of Tsinghua University's Research Center for Massive Open Online Education.
