2004
Journal article
Unknown
Cross-language evaluation forum: objectives, results, achievements
Braschler M., Peters C.
The Cross-Language Evaluation Forum (CLEF) is now in its fourth year of activity. We summarize the main lessons learned during this period, outline the state of the art of the research reported in the CLEF experiments, and discuss the contribution that this initiative has made to research and development in the multilingual information access domain. We also make proposals for future directions in system evaluation aimed at meeting emerging needs.
Source: Information retrieval (Boston) 7 (2004): 7–31.
See at:
CNR ExploRA
2005
Journal article
Unknown
Cross-language information retrieval: the way ahead
Gey F. C., Kando N., Peters C.
This introductory paper not only covers the research content of the articles in this special issue of IP&M but also attempts to characterize the state of the art in the Cross-Language Information Retrieval (CLIR) domain. We present our view of some major directions for CLIR research in the future. In particular, we find that insufficient attention has been given to the Web as a resource for multilingual research, and to languages which are spoken by hundreds of millions of people in the world but have been mainly neglected by the CLIR research community. In addition, we find that most CLIR evaluation has focussed narrowly on the news genre to the exclusion of other important genres such as scientific and technical literature.
Source: Information processing & management 41(3) (2005): 415–431.
See at:
CNR ExploRA
2005
Contribution to book
Unknown
Comparative evaluation of cross-language information retrieval systems
Peters C.
With the increasing importance of the 'Global Information Society' and as the world's depositories of online collections proliferate, there is a growing need for systems that enable access to information of interest wherever and however it is stored, regardless of form or language. In recognition of this, five years ago, the DELOS Network for Digital Libraries launched the Cross-Language Evaluation Forum (CLEF), with the objective of promoting multilingual information access by providing the research community with an infrastructure for testing and evaluating systems operating in multilingual contexts and a common platform for the comparison of methodologies and results. In this paper, we outline the various activities initiated by CLEF over the years in order to meet the emerging needs of the application communities, and trace the impact of these activities on advances in multilingual system development.
Source: From Integrated Publication and Information Systems to Virtual, edited by Matthias Hemmje, Claudia Niederée, Thomas Risse, pp. 152–161. Berlin: Springer, 2005
See at:
CNR ExploRA
2006
Journal article
Unknown
CLEF 2005: ad hoc track overview
Di Nunzio G., Ferro N., Jones G. J. F., Peters C.
We describe the objectives and organization of the CLEF 2005 ad hoc track and discuss the main characteristics of the tasks offered to test monolingual, bilingual, and multilingual textual document retrieval. The performance achieved for each task is presented and a statistical analysis of results is given. The mono- and bilingual tasks followed the pattern of previous years but included target collections for two new-to-CLEF languages: Bulgarian and Hungarian. The multilingual tasks concentrated on exploring the reuse of existing test collections from an earlier CLEF campaign. The objectives were to attempt to measure progress in multilingual information retrieval by comparing the results for CLEF 2005 submissions with those of participants in earlier workshops, and also to encourage participants to explore multilingual list merging techniques.
Source: Lecture notes in computer science 4022 (2006): 11–36.
See at:
CNR ExploRA
2004
Contribution to book
Unknown
CLEF 2003 methodology and metrics
Braschler M., Peters C.
We describe the overall organization of the CLEF 2003 evaluation campaign, with a particular focus on the cross-language ad hoc and domain-specific retrieval tracks. The paper discusses the evaluation approach adopted, describes the tracks and tasks offered and the test collections used, and provides an outline of the guidelines given to the participants. It concludes with an overview of the techniques employed for results calculation and analysis for the monolingual, bilingual, multilingual, and GIRT tasks.
Source: Comparative Evaluation of Multilingual Information Access Systems:, edited by Carol Peters, Julio Gonzalo, Martin Braschler, et al., pp. 7–20. Berlin: Springer, 2004
See at:
CNR ExploRA
2001
Journal article
Unknown
Cross-language system evaluation: the CLEF campaigns
Peters C., Braschler M.
The goals of the CLEF (Cross-Language Evaluation Forum) series of evaluation campaigns for information retrieval systems operating on European languages are described. The difficulties of organizing an activity which aims at an objective evaluation of systems running on and over a number of different languages are examined. The discussion includes an analysis of the first results and proposals for possible developments in the future.
Source: Journal of the American Society for Information Science and Technology (Print) 52 (2001): 1067–1072.
See at:
CNR ExploRA
2007
Contribution to book
Open Access
CLEF 2006: ad hoc track overview
Di Nunzio G., Ferro N., Mandl T., Peters C.
We describe the objectives and organization of the CLEF 2006 ad hoc track and discuss the main characteristics of the tasks offered to test monolingual, bilingual, and multilingual textual document retrieval systems. The track was divided into two streams. The main stream offered mono- and bilingual tasks using the same collections as CLEF 2005: Bulgarian, English, French, Hungarian and Portuguese. The second stream, designed for more experienced participants, offered the so-called
Source: Evaluation of Multilingual and Multi-modal Information Retrieval, pp. 21–34, 2007
DOI: 10.1007/978-3-540-74999-8
See at:
hdl.handle.net | NARCIS | NARCIS | CNR ExploRA
2007
Conference article
Open Access
The future of large-scale evaluation campaigns for information retrieval in Europe
Agosti M., Di Nunzio G., Ferro N., Harman D., Peters C.
A Workshop on "The Future of Large-scale Evaluation Campaigns" was organised jointly by the University of Padua and the DELOS Network of Excellence and held in Padua, Italy, in March 2007. The aim was to perform a critical assessment of the scientific results of such initiatives and to formulate recommendations for the future. This poster summarises the outcome of the discussion with respect to the major European initiative in this area: the Cross-Language Evaluation Forum.
Source: Research and Advanced Technology for Digital Libraries. 11th European Conference, ECDL 2007, pp. 509–512, Budapest, Hungary, 16-21 September 2007
DOI: 10.1007/978-3-540-74851-9_54
See at:
www.dei.unipd.it | doi.org | www.scopus.com | CNR ExploRA
2007
Contribution to journal
Unknown
Preface - Evaluation of Multilingual and Multi-modal Information Retrieval
Peters C., Clough P., Gey F., Karlgren J., Magnini B., Oard D., De Rijke M., Stempfhuber M.
This book constitutes the thoroughly refereed post-proceedings of the 7th Workshop of the Cross-Language Evaluation Forum, CLEF 2006, held in Alicante, Spain, September 2006. The revised papers presented together with an introduction were carefully reviewed and selected for inclusion in the book. The papers are organized in topical sections on Multilingual Textual Document Retrieval, Domain-Specific Information Retrieval, i-CLEF, QA@CLEF, ImageCLEF, CLSR, WebCLEF and GeoCLEF.
See at:
CNR ExploRA
2004
Journal article
Unknown
Editorial activity - CLEF 2003
Gonzalo J., Peters C.
Provides information on the organisation of the CLEF 2003 evaluation campaign.
Source: Lecture notes in computer science 3237 (2004): 1–6.
See at:
CNR ExploRA
2003
Contribution to book
Restricted
CLEF 2002 Methodology and Metrics
Braschler M., Peters C.
We give a detailed presentation of the organization of the CLEF 2002 evaluation campaign, focusing mainly on the core tracks. This includes a discussion of the evaluation approach adopted, explanations of the tracks and tasks and the underlying motivations, a description of the test collections, and an outline of the guidelines for the participants. The paper concludes with indications of the techniques used for results calculation and analysis.
Source: Advances in Cross-Language Information Retrieval, edited by Peters C.; Braschler M.; Gonzalo J.; Kluck M., pp. 512–525, 2003
DOI: 10.1007/978-3-540-45237-9_44
See at:
doi.org | link.springer.com | CNR ExploRA
2008
Conference article
Open Access
CLEF 2007: ad hoc track overview
Di Nunzio G., Ferro N., Mandl T., Peters C.
We describe the objectives and organization of the CLEF 2007 Ad Hoc track and discuss the main characteristics of the tasks offered to test monolingual and cross-language textual document retrieval systems. The track was divided into two streams. The main stream offered mono- and bilingual tasks on target collections for central European languages (Bulgarian, Czech and Hungarian). Similarly to last year, a bilingual task that encouraged system testing with non-European languages against English documents was also offered; this year, particular attention was given to Indian languages. The second stream, designed for more experienced participants, offered mono- and bilingual "robust" tasks with the objective of privileging experiments which achieve good stable performance over all queries rather than high average performance. These experiments re-used CLEF test collections from previous years in three languages (English, French, and Portuguese). The performance achieved for each task is presented and discussed.
Source: 8th Workshop of the Cross-Language Evaluation Forum, CLEF 2007, pp. 13–32, Budapest, Hungary, 19-21 September 2007
DOI: 10.1007/978-3-540-85760-0_2
See at:
doras.dcu.ie | doi.org | link.springer.com | www.scopus.com | CNR ExploRA
2008
Conference article
Restricted
What happened in CLEF 2007
Peters C.
The organization of the CLEF 2007 evaluation campaign is described and details are provided concerning the tracks, test collections, evaluation infrastructure, and participation. The main results are commented on, and future developments in the organization of CLEF are discussed.
Source: 8th Workshop of the Cross-Language Evaluation Forum, CLEF 2007, pp. 1–12, Budapest, Hungary, 19-21 September 2007
DOI: 10.1007/978-3-540-85760-0_1
See at:
doi.org | link.springer.com | CNR ExploRA
2008
Contribution to journal
Restricted
CLEF 2007
Peters C., Jijkoun V., Mandl T., Mueller H., Oard D. W., Petras V., Santos D.
This book constitutes the thoroughly refereed post-proceedings of the 8th Workshop of the Cross-Language Evaluation Forum, CLEF 2007, held in Budapest, Hungary, September 2007. The revised papers presented together with an introduction were carefully reviewed and selected for inclusion in the book. The papers are organized in topical sections on Multilingual Textual Document Retrieval, Domain-Specific Information Retrieval, Multiple Language Question Answering, Cross-Language Retrieval in Image Collections, Cross-Language Speech Retrieval, Multilingual Web Retrieval, and Cross-Language Geographical Retrieval.
DOI: 10.1007/978-3-540-85760-0_1
See at:
doi.org | link.springer.com | CNR ExploRA
2010
Contribution to book
Restricted
What happened in CLEF 2009
Peters C.
The organization of the CLEF 2009 evaluation campaign is described and details are provided concerning the tracks, test collections, evaluation infrastructure, and participation. The aim is to provide the reader of these proceedings with a complete picture of the entire campaign, covering both text and multimedia retrieval experiments. In the final section, the main results achieved by CLEF in the first ten years of activity are discussed and plans for the future of CLEF are presented.
Source: Multilingual Information Access Evaluation I. Text Retrieval Experiments. CLEF 2009, edited by Carol Peters, Giorgio Maria Di Nunzio, Mikko Kurimo, Djamel Mostefa, Anselmo Penas, Giovanna Roda, pp. 1–12, 2010
DOI: 10.1007/978-3-642-15754-7_1
DOI: 10.1007/978-3-642-15751-6_1
See at:
doi.org | doi.org | www.springerlink.com | CNR ExploRA
2009
Contribution to book
Restricted
What Happened in CLEF 2008
Peters C.
The organization of the CLEF 2008 evaluation campaign is described and details are provided concerning the tracks, test collections, evaluation infrastructure, and participation. The main results are commented on, and future developments in the organization of CLEF are discussed.
Source: pp. 1–14, 2009
DOI: 10.1007/978-3-642-04447-2_1
See at:
doi.org | link.springer.com | CNR ExploRA
2009
Contribution to journal
Restricted
Preface to Evaluating Systems for Multilingual and Multimodal information Access
Peters C., Deselaers T., Ferro N., Gonzalo J., Jones G. J., Kurimo M., Mandl T., Penas A., Petras V.
This book constitutes the thoroughly refereed proceedings of the 9th Workshop of the Cross-Language Evaluation Forum, CLEF 2008, held in Aarhus, Denmark, in September 2008. The 130 revised and extended papers presented were carefully reviewed and selected for inclusion in the book. They are completed by an introduction on CLEF 2008. As usual, the seven main evaluation tracks in CLEF 2008 aimed to test the performance of a wide range of multilingual information access systems or system components. The papers are organized in topical main sections on Multilingual Textual Document Retrieval (Ad Hoc), Mono- and Cross-Language Scientific Data Retrieval (Domain-Specific), Interactive Cross-Language Retrieval (iCLEF), Multiple Language Question Answering (QA@CLEF), Cross-Language Retrieval in Image Collections (ImageCLEF), Multilingual Web Track (WebCLEF), Cross-Language Geographical Retrieval (GeoCLEF), Cross-Language Video Retrieval (VideoCLEF), Multilingual Information Filtering (INFILE@CLEF), and Morpho Challenge at CLEF 2008.
DOI: 10.1007/978-3-642-04447-2
See at:
doi.org | link.springer.com | CNR ExploRA