Instituto Superior Técnico, Av. Rovisco Pais,
1094-001 Lisboa, Portugal.
Tel. +351-218417000.
Fax. +351-218499242.

Wednesday September 13, 2000

Shared ICGI-LLL-CoNLL session (GA)
 9.00  Opening
 9.15  On the Complexity of Consistent Identification of Some Classes of Structure Languages
 9.45  The Acquisition of Word Order by a Computational Learning System
10.15  Corpus-Based Grammar Specialization
10.45  Coffee

Parallel sessions: CoNLL (R 02.3) / ICGI (R 01.1)
11.30  CoNLL: Memory-Based Learning for Article Generation
       ICGI: Computational Complexity of Problems on Probabilistic Grammars and Transducers
12.00  CoNLL: Using Induced Rules as Complex Features in Memory-Based Language Learning
       ICGI: Probabilistic k-Testable Tree Languages
12.30  CoNLL: Pronunciation by Analogy in Normal and Impaired Readers
       ICGI: Counting Extensional Differences in BC-Learning
13.00  Lunch

CoNLL (R 02.3)
14.00  Learning Distributed Linguistic Classes
14.30  Knowledge-Free Induction of Morphology Using Latent Semantic Analysis
15.00  Modeling the Effect of Cross-Language Ambiguity on Human Syntax Acquisition
15.30  Tea
16.00  CoNLL Poster Session
       - Using Perfect Sampling in Parameter Estimation of a Whole Sentence Maximum Entropy Language Model
       - Experiments on Unsupervised Learning for Extracting Relevant Fragments from Spoken Dialog Corpus
       - Generating Synthetic Speech Prosody with Lazy Learning in Tree Structures
       - Inducing Syntactic Categories by Context Distribution Clustering
       - ALLiS: a Symbolic Learning System for Natural Language Learning
       - Combining Text and Heuristics for Cost-Sensitive Spam Filtering
       - Genetic Algorithms for Feature Relevance Assignment in Memory-Based Language Processing
       - Shallow Parsing by Inferencing with Classifiers
       - Minimal Commitment and Full Lexical Disambiguation: Balancing Rules and Hidden Markov Models
       - Learning IE Rules for a Set of Related Concepts
       - A Default First Order Family Weight Determination Procedure for WPDV Models
       - A Comparison of PCFG Models
19.00  JOINT BANQUET

Thursday September 14, 2000

Shared LLL-CoNLL session (R 02.3)
 9.30  Incorporating Linguistics Constraints into Inductive Logic Programming
10.00  Increasing our Ignorance of Language: Identifying Language Structure in an Unknown 'Signal'
10.30  Coffee
11.00  Invited Talk: Learning in Natural Language: Theory and Algorithmic Approaches
       Dan Roth (University of Illinois at Urbana-Champaign)

Parallel sessions: CoNLL (R 02.3) / LLL (R 01.1)
12.00  CoNLL: A Comparison between Supervised Learning Algorithms for Word Sense Disambiguation
              Gerard Escudero, Lluís Màrquez, German Rigau
       LLL: Recognition and Tagging of Compound Verb Groups in Czech
12.30  CoNLL: The Role of Algorithm Bias vs Information Source in Learning Algorithms for Morphosyntactic Disambiguation
              Guy De Pauw, Walter Daelemans
13.00  Lunch
14.00  CoNLL: Incorporating Position Information into a Maximum Entropy/Minimum Divergence Translation Model
       LLL: Invited Talk
14.30  CoNLL: Overfitting Avoidance for Stochastic Modeling of Attribute-Value Grammars
15.00  CoNLL: SIGNLL Business Meeting
       LLL: Learning from Parsed Sentences with INTHELEX
15.30  Tea
16.00  CoNLL: Shared Task Session
       - Introduction to the CoNLL-2000 Shared Task: Chunking
       - System Descriptions:
         Phrase Parsing with Rule Sequence Processors: an Application to the Shared CoNLL Task
         A Context Sensitive Maximum Likelihood Approach to Chunking
         Improving Chunking by Means of Lexical-Contextual Information in Statistical Language Models
         Single-Classifier Memory-Based Phrase Chunking
         Shallow Parsing as Part-of-Speech Tagging
         Chunking with Maximum Entropy Models
         Learning Syntactic Structures with XML
         Hybrid Text Chunking
         Text Chunking by System Combination
         Chunking with WPDV Models
         Use of Support Vector Learning for Chunk Identification
       - Discussion and Closing
       LLL:
       - Inductive Logic Programming for Corpus-Based Acquisition of Semantic Lexicons
         Pascale Sébillot, Pierrette Bouillon, Cécile Fabre
       - Learning from a Substructural Perspective
       - Closing Session and Community Meeting