
[ntcir:208] Final CFP for NTCIR-6:

     Call for Participation to NTCIR-6
Evaluation of Information Access Technologies: IR, QA and
  Cross-lingual Information Access Technologies

Final Meeting: 15-18 May 2007, NII, Tokyo, Japan

** Registration due: June 23, 2006 **

** The dry run for PATENT and the formal run for CLIR will start on
July 1, so please register as soon as possible and prepare well if
you have not done so already!



NTCIR is a series of evaluation workshops that promotes research
in information access technologies, including text retrieval,
question answering, and cross-lingual information access, by
providing an infrastructure for evaluation and research (test
collections, evaluation metrics, and methodologies) and a forum
for researchers.

NTCIR has focused on East Asian language documents but has
attracted international participation: the previous NTCIR had
participants from 15 countries and areas, including Australia,
Canada, China (PRC), Germany, Hong Kong, Ireland, Japan, Korea,
the Netherlands, Singapore, Spain, Switzerland, Taiwan (ROC),
the UK, and the United States.


NTCIR-6 has selected the following four research areas as "tasks"
and one area as a "pilot workshop"; other "pilot tasks" may be
started at any time during NTCIR-6.

Most of the tasks will return the initial evaluation results by
Dec. 1, 2006, so that the participants can submit papers to major
international conferences on IR and NLP based on the experiments
done at NTCIR-6.

1. Cross-Lingual Information Retrieval Task (CLIR)
Multi- and Bi-lingual CLIR, and
Single language IR
Languages: Traditional Chinese, Korean, and Japanese.
Simplified Chinese may be added. To conclude the CLIR evaluation
on news documents, the four CLIR test collections of NTCIR-3
through -6 will be used, and cross-collection analysis will be
performed. New metrics for graded relevance judgments will be
used. Discussion of evaluation metrics and methodologies is
also welcome.
URL: http://homepage3.nifty.com/kz_401/index.htm
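The CFP does not name the graded-relevance metrics; purely as an illustration of how such a metric rewards ranking highly relevant documents first, here is a minimal sketch of nDCG, one common example (the grade scale below is an assumption):

```python
import math

def dcg(grades):
    """Discounted cumulative gain over a ranked list of relevance grades."""
    return sum(g / math.log2(i + 2) for i, g in enumerate(grades))

def ndcg(ranked_grades, all_grades):
    """nDCG: DCG of the system ranking divided by DCG of the ideal ranking."""
    ideal = sorted(all_grades, reverse=True)[:len(ranked_grades)]
    best = dcg(ideal)
    return dcg(ranked_grades) / best if best > 0 else 0.0

# Hypothetical grades: 2 = highly relevant, 1 = partially relevant, 0 = not.
# Swapping the partially relevant and non-relevant documents still scores
# below the ideal ordering, unlike binary-relevance metrics.
print(ndcg([2, 0, 1], [2, 1, 0]))
```

Unlike binary precision/recall, a graded metric like this distinguishes systems that retrieve the same documents in different orders.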

2. Cross-Language Question Answering Task (CLQA)
Focuses on named entities, one of the key problems in
cross-lingual information access in the Asian context.
Languages: Traditional Chinese, English, and Japanese,
including C-C and J-J monolingual runs. Korean and Simplified
Chinese are under consideration; volunteers to help organize
the Korean part are welcome.
URL: http://clqa.jpn.org/

3. Patent Retrieval Task (PATENT)
Retrieval task: "invalidity search" using Japanese patents
(ten years of patent application full texts) and USPTO
patents, drawing on the four patent collections of NTCIR-3
through -6.
Classification task: multi-viewpoint categorization. The
purpose is to categorize target patent applications based
on the F-term classification system.
URL: http://if-lab.slis.tsukuba.ac.jp/fujii/ntcpat/index-en.html

4. Question Answering Challenge (QAC)
Question answering beyond factoid questions, and its
evaluation methodology. Runs on Japanese news documents. The
task targets any kind of question and will try to use real
questions collected from Web knowledge services and QA demo
sites. For evaluation, the application of summarization- or
MT-oriented metrics such as BE, ROUGE, and BLEU has been
proposed, and any attempt at new metrics is welcome!
URL: http://www.nlp.is.ritsumei.ac.jp/qac/index-e.htm
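To illustrate how a summarization-oriented metric like ROUGE can score a non-factoid answer against a reference, here is a minimal sketch of ROUGE-N recall (simplified assumptions: a single reference answer and whitespace tokenization; the example sentences are invented):

```python
from collections import Counter

def ngrams(tokens, n):
    """All contiguous n-grams of a token list."""
    return [tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1)]

def rouge_n(candidate, reference, n=1):
    """ROUGE-N recall: clipped n-gram overlap / reference n-gram count."""
    cand = Counter(ngrams(candidate, n))
    ref = Counter(ngrams(reference, n))
    overlap = sum(min(c, ref[g]) for g, c in cand.items())
    return overlap / max(sum(ref.values()), 1)

# Hypothetical reference and system answers.
ref = "the answer was announced in 1997".split()
cand = "it was announced in 1997".split()
print(rouge_n(cand, ref, 1))  # unigram recall
print(rouge_n(cand, ref, 2))  # bigram recall
```

Such n-gram overlap metrics reward answers that share content words with the reference even when they are not exact string matches, which is why their application to non-factoid QA is being considered.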

5. Pilot Tasks: Opinion Extraction (OPINION)
Extracting opinion-related elements from news texts in Chinese
and Japanese; English may be included.
URL will be announced later

Any other attempts to test problems solvable in a short time,
or feasibility studies for future tasks; to be announced
later. Multilingual multi-document summarization and
evaluation of Web search engines are also under
consideration, but not yet decided.

The call for participation in the pilot tasks has not been
issued yet; it will be announced through this mailing list later.

6. Pilot Workshop: Multimodal Summarization of Trend Information (MuST)
Focuses on numerical information in text: extracting numerical
information representing trends from multiple documents, then
analyzing, summarizing, and displaying it visually. We have
constructed an annotated corpus and will share it; research
groups from various disciplines will use the corpus to pursue
various directions of research. Currently Japanese documents only.

The CFP for MuST has not been issued yet. It will be announced later.


(1) Cross-Collection Evaluation:

In IR-related tasks, multiple test collections covering the
same document genres and the same user tasks will be used, and
variability across the collections will be analyzed, in order
to investigate methods for more stable and reliable testing.

(2) Raw Submission Data and Evaluation Results:

All submitted runs, and their results as evaluated with the
metrics set by each task, will be available to the active
participants of that task. The purpose is to invite all task
participants to discuss, analyze, and examine the task's
evaluation metrics and results.

Evaluation is a critical issue for all researchers, so please
examine how the evaluation is done and how the metrics behave,
and whether there are methods to overcome the limitations of
current evaluation practice. With your cooperation, we hope to
obtain a fruitful examination of the evaluation results and metrics.


In conjunction with the NTCIR-6 final meeting, a separate
conference called the "Pre-Meeting Workshop" will be held; it was
called the "Open Submission Session" at NTCIR-4 and -5. This is a
REFEREED international conference on research in information
access and its evaluation.
Detailed information will be announced later.


Extended Registration Deadline: June 23, 2006
Document Set Release: June 1, 2006
Dry Run: from July 2006 to Sept. 2006
Formal Run: from July 2006 to Dec. 2006
Evaluation Results Return: by Feb. 1, 2007
Paper for the Proceedings Due : March 1, 2007
Final Meeting, Tokyo, Japan: May 15-18, 2007



** User agreement forms for NTCIR-6 participants are available;
the signed user agreement forms are required for data delivery.

Noriko Kando at kando (at) nii.ac.jp

Your participation is more than welcome. Please join us and let's work together!

Noriko Kando
NTCIR Project