The SCORE 2013 Contest: Call for Participation

The third edition of the Student Contest on Software Engineering (SCORE) will be part of the 35th International Conference on Software Engineering (ICSE 2013).

SCORE is a worldwide competition for undergraduate and master's level students. It emphasizes the engineering aspects of software development, not limited to programming. Student teams participating in the contest will devise and implement data-intensive applications that exploit Open Data.

To take part in the competition, teams must register and follow the contest rules as outlined below. Teams submit full software products, covering the whole software development process. After a careful evaluation carried out by the SCORE Program Committee, several finalist teams will be invited to ICSE 2013 in San Francisco to present their projects and receive their awards at the conference.

Spread the Word!

Please consider promoting the SCORE contest. A flyer is available for download.

Contest Rules

[see also: FAQ]

Project topics - Data-intensive Applications exploiting Open Data

In contrast to previous years, participating teams are given more freedom in choosing their projects. Rather than specifying concrete project ideas, the SCORE Program Committee hopes to foster creativity among the participants by providing a theme for this year's contest. This year's goal is therefore to build data-intensive applications that leverage Open Data sources, with an emphasis on providing valuable services for citizens. Examples of Open Data sources are the Census Bureau APIs provided by the U.S. Department of Commerce, the Swiss public transport API, and the UCI KDD Archive; participants are free, however, to choose different ones. We further encourage the use of the Open Data Protocol, if possible.
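
As an illustration only, the minimal sketch below (in Python, using the requests library) queries one of the Open Data sources named above, the Swiss public transport API. The endpoint, query parameters, and JSON field names are assumptions taken from that API's public documentation rather than anything prescribed by the contest, so they should be verified before being relied upon.

    # Hedged example: query the Swiss public transport API (transport.opendata.ch)
    # for connections between two cities. Endpoint, parameters, and field names
    # are assumptions based on the API's public documentation, not contest rules.
    import requests

    response = requests.get(
        "http://transport.opendata.ch/v1/connections",
        params={"from": "Lausanne", "to": "Zurich", "limit": 3},
        timeout=10,
    )
    response.raise_for_status()

    # Print departure/arrival times of the returned connections, guarding
    # against missing fields since the response format is an assumption.
    for connection in response.json().get("connections", []):
        departure = connection.get("from", {}).get("departure")
        arrival = connection.get("to", {}).get("arrival")
        print("%s -> %s (duration %s)" % (departure, arrival, connection.get("duration")))

A similar approach applies to any other Open Data source a team may choose.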

Alternatively, teams can decide to make use of data stored in Open Source repositories, such as GitHub or SourceForge, to come up with tools that help developers build better software systems in the future.

Please note that, while the originality of each idea will certainly be taken into account during evaluation by the program committee, the main focus of the participants should be to produce state-of-the-art software by following a sound engineering process.

The scope and size of the chosen project should be adequate for the size of the development team. Last year's projects can serve as a reference for what the SCORE Program Committee expects as deliverables.

Registration and proposals

Teams that wish to participate should submit a brief description of their idea using EasyChair. To allow us to plan the evaluation phase, this abstract should be submitted before November 30th, 2012 (preferably earlier) and should include:

  • The project title and a short description of the idea.
  • The (tentative) team members (names, email addresses, institutions).
  • A list of core technologies that will be used for implementation (e.g., programming languages, frameworks, etc.).
  • The name and email address of the contact person.
  • If the project is being performed in conjunction with an academic course, then the name of the course, and the name and email address of the course instructor. The course instructor must be made aware that students are participating in SCORE, and the SCORE committee may cc the instructor in certain communications.
  • If the project is based on any commercial or open-source projects, the submission should reference these projects and explain how the proposed work differs from them.

Project development

Since SCORE is a Software Engineering contest, participating teams are required to undertake, at least partially, all aspects of the engineering process, including planning, requirements, design, implementation, and testing. Requirements should be described adequately, e.g., by means of user stories, use cases and scenarios. The outcome of the design phase should be a document that at least describes the architecture model. Implementation has to follow the principles of modern software engineering (this includes proper source code documentation). The tests should include unit, integration and acceptance tests. Special attention will be given to traceability matrices that link requirements from requirements specifications to design artifacts, source code, and test cases. Ideally, the team members should keep these matrices updated as they design and develop software. The description of the format for these traceability matrices is given below.

However, a project need not cover all aspects with the same level of detail. Teams can focus on some aspects of the project (e.g., requirements elicitation and analysis, architectural design, testing) and devote more time and space to them in their reports, provided that basic project management, requirements gathering, design, implementation, and quality assurance are performed. Fully implementing the application is an option but is not required; if a full implementation is not produced, then at least an executable prototype that demonstrates the feasibility of the design and the basic functionality must be delivered.

Teams are free to choose their own development approach and to organize the process accordingly. However, they should provide adequate process documentation in the end. We encourage teams to use continuous integration and to monitor code quality, e.g., with Sonar. Such efforts will be regarded favorably by the program committee.

Team composition

Participating teams must be composed exclusively of students at the undergraduate or graduate level. Every team must have no fewer than three members. Teams are strongly advised to have no more than five members.

Teams may be formed and projects may be developed as part of a software engineering course. Also, they can be composed of students from different institutions. [see: For Software Engineering Instructors]

Every team must designate two contact persons, to whom communications and enquiries will be addressed. These contact persons must be members of the team, or faculty members supervising the students (for example if the project is carried out in the context of a software engineering course). However, the faculty member may not actively participate in the development of the project with the supervised team.

Creating Traceability Matrices

Once requirements are defined, team members should proceed with creating traceability matrices as Excel spreadsheets. The first three columns (i.e., A, B, and C) contain information about requirements and functionalities from the document that describes the application: the first column contains the title/name of a requirement, the second column contains the number of the paragraph in the document where this requirement is located, and the third column contains the requirement's location within this paragraph -- a path to a document plus the offset to the requirement in bytes. This is a part of the project where team members can be creative in terms of defining the location.

The next columns specify design artifacts (e.g., an actor from a use case), elements of the source code (e.g., classes, methods), and test cases, each identified by a unique name; that is, every column corresponds to one artifact. Cells contain either zero or one: zero means the artifact is not related to the requirement in that row, while one indicates an existing relation, for example, that running a test case executes application code that is linked to the requirement. Please note that each test case column, as well as each requirement row, should contain at least one "1". That is, each test case should be related to at least one requirement, and each requirement should be tested by at least one test case.
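
To make the expected layout concrete, the minimal sketch below (in Python) builds a small matrix with the three requirement columns described above, followed by one column per artifact, and checks the two coverage rules (every requirement tested, every test case linked to a requirement). The requirement names, artifact columns, and file name are hypothetical, and a real matrix would normally be maintained directly as an Excel spreadsheet; the script only illustrates the format.

    # Illustrative sketch of the traceability-matrix layout described above and of
    # the check that each requirement is covered by at least one test case and
    # each test case traces back to at least one requirement. All names are
    # hypothetical examples, not prescribed by the contest.
    import csv

    # Columns A-C: requirement title, paragraph number, and location
    # (path to the document plus a byte offset to the requirement).
    # Remaining columns: one per design artifact, source-code element, or test case.
    artifact_cols = ["UC-Login (use case)", "AuthService (class)",
                     "test_login (test)", "test_reset (test)"]
    header = ["Requirement", "Paragraph", "Location"] + artifact_cols

    rows = [
        ["User login",     "2.1", "docs/requirements.pdf+1024", 1, 1, 1, 0],
        ["Password reset", "2.2", "docs/requirements.pdf+2048", 0, 1, 0, 1],
    ]

    test_cols = [i for i, name in enumerate(artifact_cols, start=3) if "(test)" in name]

    # Every requirement must be exercised by at least one test case ...
    for row in rows:
        assert any(row[i] == 1 for i in test_cols), "untested requirement: %s" % row[0]

    # ... and every test case must trace back to at least one requirement.
    for i in test_cols:
        assert any(row[i] == 1 for row in rows), "unlinked test case: %s" % artifact_cols[i - 3]

    # Write the matrix as CSV, which Excel can open and save as a spreadsheet.
    with open("traceability.csv", "w", newline="") as f:
        csv.writer(f).writerows([header] + rows)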

Conflicts of interest

Projects developed by teams including students from a given university or institution will not be evaluated or refereed by people from that same university or institution (even if the latter are not the reference contact(s) of the project). ACM conflict of interest rules generally apply to the SCORE contest.

Copyright issues

Unless exceptions are explicitly stated in advance, all artifacts produced by the teams will be treated confidentially by the PC during the evaluation phase, but a copyright release will be requested from the teams selected to submit a full deliverable of their project (see the evaluation procedure below).

Paid work

In principle, projects undertaken for SCORE must not be performed as part of paid industrial work. If this is nevertheless the case, team members must submit a letter from the company or organization that paid for the project, stating that it allows the team to submit the project to SCORE and that no participants other than the listed team members worked on the project.

Submission and evaluation of deliverables

Abstract and Summary report

To take part in the SCORE contest, teams have to submit an abstract before the November 30th, 2012 deadline. Before the 30 January 2013 deadline, a report of approximately 20 pages has to be submitted. The report should describe the various artifacts produced during development. The project reports of the finalist teams of the 2009 and 2011 editions of SCORE, which are available online in the SCORE project repository, may serve as a guideline.

Each submitted report will be evaluated by at least 2 members of the SCORE Program Committee. Evaluation will be based on standard quality criteria for software development, which will be detailed in later announcements.

Final deliverable

Selected teams will be required to submit a final deliverable.

All deliverables should be fully functioning project installations in a VirtualBox virtual machine. Each team will be given a Dropbox folder to which they will upload their virtual machine, together with instructions on how to run acceptance tests.

The final deliverable VM must include implementation code and other development outcomes (work products such as specifications, tests, verification experiments, etc.). The teams are responsible for delivering all the material that is necessary to run and fully evaluate their product (this includes any non-standard, non-free, and/or non-publicly available development tools, libraries, run-time environments, etc.).

Evaluation will be based on the quality of all aspects of the project (process followed, development outcomes, etc.).

The program committee will select a small number of overall SCORE finalists based on the final deliverables. One or more representatives from these teams will be invited to present their projects at the ICSE 2013 conference. Award winners will be selected during the conference.

Financial support to finalists

We anticipate that the ICSE 2013 conference will provide a financial award to help offset travel expenses to one member per team for finalist teams. Free registration to the main conference will be offered to a limited number of team members of finalist teams. Full details about financial support to finalist teams will be posted when the overall conference budget is finalized and in any case before the deadline for the submission of the summary report (30 January 2013).

Timeline (important dates)

The key dates and periods of the SCORE contest are given below. All deadlines are 23:59 anywhere on earth (AoE).

  • July 2012: Publication of the CFP on the SCORE website.
  • July 2012-November 2012: Registration opens for teams intending to participate in the contest. When registering, teams will have to submit an abstract to EasyChair and name two contact persons (see the "Registration and proposals" section, as well as the "Team composition" section above).
  • December 2012: Teams may start to submit summary reports (submissions open).
  • 30 December 2012: Registration for participation closes.
  • 30 January 2013: Submission for the summary reports closes.
  • 15 March 2013: Selection of the best teams, from which the finalists of the contest will be chosen. These teams will be asked to submit a final deliverable virtual machine to their respective Dropbox folders, which will be the basis for the selection of the finalists.
  • 31 March 2013: Deadline for the submission of the final deliverable.
  • 30 April 2013: Announcement of the finalists who will be invited to ICSE 2013.
  • ICSE 2013 (May 18-26, 2013): Final evaluation and presentation of the awards.

The submission deadlines are designed to allow teams in one-semester software engineering courses to participate in the contest.

Awards

All finalists will be recognized at the main conference.

Formal Methods Award. Special recognition will be granted by the Formal Methods Europe group for outstanding exploitation of formal methods in a project. There are no constraints on the type of formal method to be applied, nor on whether formal methods should be applied throughout the whole life cycle, in single phases, or to specific goals. See the dedicated Web page on using Formal Methods in your project for further details.

Distributed Development Award. Special recognition will be granted for outstanding work by a geographically distributed team. Teams competing in SCORE that would like to be considered for this award must include, in their summary report, a section explaining the issues related to distributed development with which they had to deal, and the approaches they took to tackle them.

Other special recognitions may be announced in the upcoming months.

To be considered for a special recognition, a project must exhibit the overall quality necessary to be selected as a finalist.

After the SCORE competition ends, the summary reports of the projects that are selected as finalists will be published on the SCORE contest web site.