Wei Xu     

Postdoctoral Fellow
Computer and Information Science Department
University of Pennsylvania
Levine Hall, Room 361
3330 Walnut St, Philadelphia, PA 19104

I am a postdoc at the University of Pennsylvania, working with Chris Callison-Burch. My research lies at the intersection of machine learning, natural language processing, and social media. I am particularly interested in designing learning algorithms that glean semantic and structured knowledge from massive social media and web data. My work enables deeper analysis of text meaning and better natural language generation.

I graduated with a PhD in Computer Science from New York University, where I was advised by Ralph Grishman. My thesis, Data-driven Approaches for Paraphrasing Across Language Variations, had Bill Dolan, Satoshi Sekine, Luke Zettlemoyer, and Ernest Davis on the committee. I earned my bachelor's and master's degrees in Computer Science from Tsinghua University in Beijing, China.

I am currently on the job market, looking for a faculty position starting Fall 2016.
What's New
I designed and taught a new course at the University of Pennsylvania:
    Social Media and Text Analytics, Summer 2015

Research Highlights

Joint Word-Sentence Models

I build probabilistic graphical models to extract semantic and structured knowledge from large volumes of data. I designed the first successful models for extracting paraphrases from Twitter that scale up to billions of sentences. These web-scale paraphrases enable natural language systems to handle errors (e.g. “everytime” ↔ “every time”), lexical variations (e.g. “oscar nom’d doc” ↔ “Oscar-nominated documentary”), rare words (e.g. “NetsBulls series” ↔ “Nets and Bulls games”), and language shifts (e.g. “is bananas” ↔ “is great”). Such lexically divergent paraphrases are difficult to capture with conventional similarity-based approaches. I invented the multi-instance learning paraphrase (MultiP) model [TACL2014], which jointly infers latent word-sentence relations and relaxes the reliance on human annotation. It is the current state of the art, outperforming deep learning and latent-space methods.
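The core multi-instance learning intuition can be illustrated with a toy sketch: a sentence pair is labeled a paraphrase if at least one of its latent word-pair instances is. The word pairs and scores below are made up for illustration and are not the model's learned parameters.

```python
# Toy illustration of the at-least-one assumption in multi-instance
# learning for paraphrase detection: aggregate latent word-pair
# decisions into a sentence-pair label.

def sentence_label(word_pair_scores, threshold=0.5):
    """Return True if any word pair scores above the threshold,
    i.e. the sentence pair is predicted to be a paraphrase."""
    return any(score > threshold for score in word_pair_scores)

# Hypothetical word-pair scores for "oscar nom'd doc" vs.
# "Oscar-nominated documentary"; one strongly aligned word pair
# suffices to flag the whole sentence pair.
scores = {("nom'd", "nominated"): 0.9,
          ("doc", "documentary"): 0.8,
          ("oscar", "doc"): 0.1}
print(sentence_label(scores.values()))  # True
```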

Statistical Text-to-Text Generation Framework

Many text-to-text generation problems can be thought of as sentential paraphrasing or monolingual machine translation. These tasks face an exponential search space larger than that of bilingual translation, but a much smaller optimal solution space due to task-specific requirements. I advocate a statistical text-to-text framework built on top of statistical machine translation (SMT) technology. My recent work uncovered multiple serious problems in text simplification research between 2010 and 2014 [TACL2015], and set a new state of the art by designing novel objective functions for optimizing syntax-based SMT with large-scale paraphrases [to appear]. I also conducted the first study on stylistic paraphrasing [COLING2012] (e.g. historic → modern, colloquial → formal).

Professional Service

Area Chair:   EMNLP (2016)
Publicity Chair:   NAACL (2016)
Session Chair:   EMNLP (2015), NAACL (2015), AAAI (2015), ACL (2014)
Organizer:
     - ACL 2015 Workshop on Noisy User-generated Text (W-NUT)
     - SemEval 2015 shared task: Paraphrases and Semantic Similarity in Twitter (PIT)
Program Committee:
     ACL (2015, 2014, 2013), NAACL (2015), EMNLP (2015, 2014), COLING (2014)
     WWW (2016, 2015), AAAI (2016, 2015, 2012), KDD (2015)
     WWW Workshop on #Microposts (2016)
     ACL Workshop on Social Factors in Natural Language Processing (2016)
     EACL Workshop on Language Analysis in Social Media (2014)
Journal Reviewer:
     Transactions of the Association for Computational Linguistics (TACL)

Collaborators
I am a big believer in collaboration and have been happy to work and co-author with:
    Colin Cherry (National Research Council Canada)
    Martin Chodorow (CUNY)
    Bill Dolan (Microsoft Research)
    Yangfeng Ji (Gatech)
    Raphael Hoffmann (U of Washington → AI2 Incubator)
    Wenjie Li (Hong Kong Polytechnic University)
    Adam Meyers (NYU)
    Courtney Napoles (JHU)
    Daniel Preoţiuc-Pietro (UPenn)
    Alan Ritter (U of Washington → Ohio State U)
    Joel Tetreault (ETS → Yahoo!)
    Lyle Ungar (UPenn)
    Luke Zettlemoyer (U of Washington)
    Le Zhao (CMU → Google)
    and many others ...

Places I interned at and visited as a PhD student:
    2012-2013, University of Washington, Seattle, WA
    Summer 2011, Microsoft Research, Redmond, WA
    Summer 2010, Amazon.com, Seattle, WA
    Spring/Fall 2010, ETS, Princeton, NJ

I am always happy to work with undergraduate and graduate students. If you are a student at Penn and want to do some research, email me!

All of my past advisees have published a paper with me:
    Quanze Chen (undergraduate UPenn)
    Bin Fu (undergraduate Tsinghua → PhD CMU → Google NYC)
    Mingkun Gao (master's UPenn → PhD UIUC)
    Ray Lei (undergraduate UPenn)
    Maria Pershina (PhD NYU)

My current advisees:
    Siyu Qiu (master's UPenn)


When I have spare time, I enjoy the arts, traveling, snowboarding, rock climbing, sailing, and windsurfing.

I also made a list of the best dressed NLP researchers (2014).