Date: Mon, 25 Nov 2013 21:37:11 +0100
From: Runtime Verification
To: Runtime Verification
Subject: [Caml-list] 1st Intl. Competition of Software for Runtime Verification: call for participation

[Apologies for duplicates]

*1st Intl. Competition of Software for Runtime Verification (CSRV-2014)*
*held with RV 2014 in Toronto, Canada*
*http://rv2014.imag.fr/monitoring-competition*

CSRV-2014 is the *1st International Software Runtime Verification Competition*, held as part of the 14th International Conference on Runtime Verification. The event will take place in September 2014 in Toronto, Canada. CSRV-2014 will draw attention to the invaluable effort of software developers and researchers who contribute to this field by providing the community with new or updated tools, libraries and frameworks for the instrumentation and runtime verification of software.

Runtime verification is a technique for analyzing software at execution time: information is extracted from a running system and checked to determine whether the observed behaviors satisfy or violate the properties of interest. During the last decade, many important tools and techniques have been developed and successfully employed. However, there is a pressing need to compare such tools and techniques, since we currently lack a common benchmark suite as well as scientific evaluation methods to validate and test new prototype runtime verification tools.
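Since this announcement goes to Caml-list, a small OCaml sketch may help fix ideas: a runtime monitor is, in essence, a function from observed events to verdicts. Everything below is purely illustrative and not competition material; the event type and the safety property ("no read after close") are hypothetical examples.

(* Illustrative sketch only: a monitor for the hypothetical safety
   property "no Read on a file after it has been Closed". *)

type event = Open of string | Read of string | Close of string

type verdict = Pass | Violation of string

(* The monitor tracks which files have been closed and flags any
   Read that observes a closed file. *)
let monitor () =
  let closed = Hashtbl.create 16 in
  fun ev ->
    match ev with
    | Open f -> Hashtbl.remove closed f; Pass
    | Close f -> Hashtbl.replace closed f (); Pass
    | Read f ->
        if Hashtbl.mem closed f
        then Violation (Printf.sprintf "read after close on %s" f)
        else Pass

(* Feeding the monitor a (hypothetical) event stream. *)
let () =
  let check = monitor () in
  let trace =
    [ Open "log.txt"; Read "log.txt"; Close "log.txt"; Read "log.txt" ]
  in
  List.iter
    (fun ev ->
       match check ev with
       | Pass -> ()
       | Violation msg -> print_endline ("property violated: " ^ msg))
    trace

The same checking function can be driven online (events delivered as the program runs) or offline (events replayed from a recorded trace), which is exactly the distinction between the competition tracks below.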
The main aims of the CSRV-2014 competition are to:

- Stimulate the development of new, efficient and practical runtime verification tools and the maintenance of the already developed ones.
- Produce a benchmark suite for runtime verification tools, by sharing case studies and programs that researchers and developers can use in the future to test and validate their prototypes.
- Discuss the metrics employed for comparing the tools.
- Compare the tools by running them on different benchmarks and evaluating them using different criteria.
- Enhance the visibility of the presented tools among the different communities (verification, software engineering, cloud computing and security) involved in software monitoring.

Please direct any enquiries to the competition co-organizers (csrv14.chairs@imag.fr):

- Ezio Bartocci (Vienna University of Technology, Austria), ezio.bartocci@tuwien.ac.at;
- Borzoo Bonakdarpour (University of Waterloo, Canada), borzoo@cs.uwaterloo.ca;
- Yliès Falcone (Université Joseph Fourier, France), ylies.falcone@ujf-grenoble.fr.

*CSRV-2014 Jury*
The CSRV jury will include a representative of each participating team and some representatives of the Demonstration Tools Committee of the Runtime Verification Conference.

*Call for Participation*
The main goal of the CSRV-2014 competition is to compare tools for runtime verification. We invite and encourage participation with benchmarks and tools for the competition. The competition will consist of three main tracks based on the input language used:

- Track on monitoring Java programs (online monitoring);
- Track on monitoring C programs (online monitoring);
- Track on monitoring of traces (offline monitoring; a toy replay sketch follows the phases below).

The competition will follow three phases:

- Benchmark/specification collection phase - participants are invited to submit their benchmarks (C or Java programs and/or traces). The organizers will collect them in a common, publicly available repository. Participants will then train their tools on the shared benchmarks;
- Monitor collection phase - participants are invited to submit their monitors. Participants whose tools/monitors (see more information in the following section) meet the qualification requirements will qualify for the evaluation phase;
- Evaluation phase - the qualified tools will be evaluated by running the benchmarks and will be ranked using different criteria (e.g., memory utilization/overhead, CPU utilization/overhead, ...). The final results will be presented at the RV 2014 conference.

Please refer to the dedicated pages for more details on the three phases.
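To make the offline track concrete: a recorded trace can be replayed through the same monitor used online. The trace format assumed below (one "action,file" pair per line) is purely an illustration, not a format prescribed by the competition; the sketch reuses the event type and monitor defined earlier.

(* Hypothetical trace format: one "action,file" pair per line,
   e.g. "open,log.txt". Reuses event/monitor from the sketch above. *)
let event_of_line line =
  match String.split_on_char ',' line with
  | [ "open"; f ] -> Some (Open f)
  | [ "read"; f ] -> Some (Read f)
  | [ "close"; f ] -> Some (Close f)
  | _ -> None

(* Replay a recorded trace file through the monitor: offline monitoring. *)
let replay path =
  let check = monitor () in
  let ic = open_in path in
  (try
     while true do
       match event_of_line (input_line ic) with
       | Some ev ->
           (match check ev with
            | Pass -> ()
            | Violation msg -> print_endline ("violation: " ^ msg))
       | None -> ()
     done
   with End_of_file -> ());
  close_in ic

For instance, replay "mytrace.csv" (a hypothetical file name) would check the recorded events against the property offline; online monitoring differs only in that events are delivered by instrumentation in the running program rather than read from a file.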
*Important Dates*
*Dec. 15, 2013* - Declaration of intent (by email to csrv14.chairs@imag.fr).
*March 1, 2014* - Submission deadline for benchmark programs and the properties to be monitored.
*March 15, 2014* - Tool training by participants starts.
*June 1, 2014* - Monitor submission.
*July 1, 2014* - Notifications and reviews.