FAQ
This FAQ answers common questions about ARC’s competitions, team roles, participation expectations, and how DS@GT ARC operates. If you are new to ARC, start with the Join page.
ARC’s annual cycle centers on CLEF, but we also participate in venues such as TREC, MediaEval, and NTCIR. Most questions below use CLEF as the primary example because it drives our publication schedule.
1. What is CLEF?
CLEF (Conference and Labs of the Evaluation Forum) is an annual independent peer-reviewed conference focused on information access systems in multilingual and multimodal contexts. It provides infrastructure for testing, tuning, and evaluating these systems, and for creating reusable test collections through community-based evaluation labs.
2. What lab/task should I pursue?
Each lab and task addresses a problem in a particular domain. Are you more interested in natural language processing? Computer vision? Biodiversity conservation? Medical applications? Multi-modality? You may find a summary of each lab in the last question of this FAQ. Review the task overview papers in the most recent CLEF working notes: Working Notes of CLEF 2025 (CEUR-WS Vol-4038).
3. How many tasks can I participate in?
There is no limit on the number of tasks you can participate in. However, we strongly recommend that first-time members take on no more than one task.
4. What’s the difference between a Lab Lead and a Lab member?
A Lab Lead is the main person responsible for delivering the task, including:
- Entering and participating in the competition/task
- Interviewing potential participants to join the team
- Conducting and hosting weekly team meetings
- Communicating team progress to team members and at the DS@GT ARC monthly meetings
- Delegating work within the team
- Submitting the working note paper for peer review and publication
A Lab member is responsible for the following:
- Making a real contribution to the competition/task
- Attending weekly meetings and reporting progress
- Attending and presenting at DS@GT ARC monthly group meetings
- Writing portions of the working note paper related to the work done
Lab Leads and members are expected to commit time, have strong programming skills, and be genuinely curious about the lab and task.
5. How do I become a Lab Lead for a task?
Lab Lead selection typically happens before the spring competition cycle begins. If you are interested in leading a future task, monitor the Join page and reach out to Anthony Miyaguchi (acmiyaguchi at gatech.edu) or Murilo Gustineli (murilogustineli at gatech.edu) with an overview of your proposed solution for the particular task.
6. What is the time commitment required to participate?
The time commitment varies depending on your role and the effort you want to put in. However, to make a meaningful contribution, you should expect to dedicate around 100–150 hours throughout the project. Think of it as the equivalent of a 2–3 unit course, requiring consistent effort. Lab Leads require additional time to manage their tasks and coordinate with team members. This is the type of experience where you get out what you put in. Ultimately, your level of involvement is up to you, but consistent effort is key to gaining valuable experience and making an impact.
7. Can two teams participate in the same task?
No. A person can be part of one or more teams, but each team participates in only one task.
8. Why can’t I edit the meeting documents?
Meeting documents and collaboration resources are typically shared with active participants after team formation. If you think you should already have access, ask in the DS@GT Slack or use the Contact page.
9. Is this opportunity only available for current students, or can alumni participate as well?
This opportunity is not limited to current students; GT alumni are also welcome to join our group! However, participants must be members of the Data Science @ Georgia Tech (DS@GT) club and have paid their membership dues. To join:
- Join the DS@GT Slack channel through @datasciencegt | Linktree
- Join the `applied-research-competitions` channel in the DS@GT Slack
- Pay the membership dues: $25 for one year or $15 for one semester
- Payment links can be found at the bottom of the Linktree page
10. How can I earn academic credit for participating in CLEF?
If you are an OMSCS student, there are two primary ways to earn academic credit through CLEF participation:
- CS 8903 - Special Problems
- CS 8803 O24 - Intro to Research
CS 8903 – Special Problems
This is a supervised research course that requires special permission to enroll. To take this course, you need to:
- Develop a research proposal statement. Here’s an example of a research statement from a previous semester: Spring 2024 – Statement of Research, Murilo Gustineli
- Secure a faculty member from the College of Computing (CoC) who agrees to supervise your research. Finding a professor can be challenging due to limited availability, and enrollment is contingent upon obtaining their permission. You can find more information on the Graduate Forms & Procedures page of the College of Computing.
CS 8803 O24 – Intro to Research
This course offers a general introduction to research methods and computer science research. Unlike CS 8903, you can register for this course as part of your regular course selection in the OMSCS program.
11. What kinds of labs are available through CLEF?
The table below summarizes the labs in the current CLEF 2026 program. Lab lineups shift year to year, so visit the CLEF 2026 labs page for each lab’s current tasks, call for participation, and official lab pages.
| Lab | Focus |
|---|---|
| BioASQ | Large-scale biomedical semantic indexing and question answering |
| CheckThat! | Claim verification and combating disinformation |
| ELOQUENT | Evaluation of generative language model quality |
| eRisk | Early risk detection on the internet (health and safety) |
| EXIST | Sexism identification in social networks |
| FinMMEval | Multilingual and multimodal evaluation for financial AI |
| HIPE | Named entity recognition and linking in multilingual historical documents |
| ImageCLEF | Multimodal data annotation, indexing, and retrieval |
| JOKER | Humor detection, search, and translation |
| LifeCLEF | Biodiversity monitoring with AI-powered tools |
| LongEval | Longitudinal evaluation of information retrieval model performance |
| PAN | Stylometry and digital text forensics |
| qCLEF | Quantum computing for information retrieval and recommender systems |
| SimpleText | Scientific text simplification |
| TalentCLEF | Skill and job-title intelligence for human capital management |
| Touché | Argumentation retrieval and generation systems |
Past CLEF working notes on CEUR-WS: CLEF 2025 (Vol-4038) · CLEF 2024 (Vol-3740) · CLEF 2023 (Vol-3497).