From: Yliès Falcone <ylies.falcone@imag.fr>
Date: Tue, 20 Jan 2015 07:46:46 +0100
To: crv15.chairs@imag.fr
Subject: [Caml-list] CFP: CRV15 - 2nd Competition on Runtime Verification

CRV 2015
The 2nd International Competition on Runtime Verification,
held with RV 2015, September 22–25, 2015, Vienna, Austria

CRV-2015 is the 2nd International Competition on Runtime Verification and is part of the 15th International Conference on Runtime Verification. The event will be held in September 2015 in Vienna, Austria. CRV-2015 will draw attention to the invaluable efforts of software developers and researchers who contribute to this field by providing the community with new or updated tools, libraries and frameworks for the instrumentation and runtime verification of software.

Runtime Verification is a verification technique for the analysis of software at execution time, based on extracting information from a running system and checking whether the observed behaviors satisfy or violate the properties of interest. During the last decade, many important tools and techniques have been developed and successfully employed. However, there is a pressing need to compare such tools and techniques, since the community currently lacks a common benchmark suite as well as scientific evaluation methods to validate and test new prototype runtime verification tools.
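To make this concrete, here is a minimal sketch of such a monitor, written in OCaml in keeping with this list. The event type, the "no read after close" property and the verdict type are illustrative assumptions of this sketch, not anything prescribed by the competition.

(* A minimal online monitor: a state machine that consumes events as the
   monitored program emits them and reports a verdict after each one.
   The property checked, "no read after close", is only an example. *)

type event = Open | Read | Close

type verdict = Holds | Violated

(* The monitor's state: whether the file has been closed. *)
type state = { closed : bool }

let initial = { closed = false }

(* One monitoring step: update the state on an observed event and report
   whether the property still holds. *)
let step (st : state) (e : event) : state * verdict =
  match e with
  | Open  -> ({ closed = false }, Holds)
  | Close -> ({ closed = true }, Holds)
  | Read  -> (st, if st.closed then Violated else Holds)

(* Drive the monitor over a sequence of events, stopping at the first
   violation. In online monitoring, [step] would instead be called from
   instrumentation points in the running program. *)
let monitor (events : event list) : verdict =
  let rec go st = function
    | [] -> Holds
    | e :: rest ->
        (match step st e with
         | _, Violated -> Violated
         | st', Holds  -> go st' rest)
  in
  go initial events

let () =
  match monitor [ Open; Read; Close; Read ] with
  | Violated -> print_endline "property violated: read after close"
  | Holds    -> print_endline "property holds on this execution"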
The main aims of CRV-2015 are to:

• Stimulate the development of new, efficient and practical runtime verification tools, as well as the maintenance and improvement of already developed ones.
• Produce a benchmark suite for runtime verification tools, by sharing case studies and programs that researchers and developers can use in the future to test and to validate their prototypes.
• Discuss the metrics employed for comparing the tools.
• Provide a comparison of the tools on different benchmarks and evaluate them using different criteria.
• Enhance the visibility of the presented tools among the different communities (verification, software engineering, cloud computing and security) involved in software monitoring.

Please direct any enquiries to the competition co-organizers (crv15.chairs@imag.fr):

• Yliès Falcone (Université Joseph Fourier, France).
• Dejan Nickovic (AIT Austrian Institute of Technology GmbH, Austria).
• Giles Reger (University of Manchester, UK).
• Daniel Thoma (University of Luebeck, Germany).

CRV-2015 Jury
The CRV Jury will include a representative of each participating team and the competition chairs. The Jury will be consulted at each stage of the competition to ensure that the rules set by the competition chairs are fair and reasonable.

Call for Participation
The main goal of CRV 2015 is to compare tools for runtime verification. We invite and encourage participation with benchmarks and tools for the competition. The competition will consist of three main tracks based on the input language used:

• Track on monitoring Java programs (online monitoring).
• Track on monitoring C programs (online monitoring).
• Track on monitoring of traces (offline monitoring); a sketch of this setting follows the list.
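To complement the online monitor sketched earlier, offline monitoring can be pictured as replaying a recorded trace through the same step function. The sketch below reuses the event type and monitor from the previous example; the one-event-per-line trace format assumed here is hypothetical, and the actual trace formats will be specified by the organizers.

(* Offline monitoring: replay a recorded trace through the monitor defined
   in the previous sketch (reuses [event], [monitor], [Holds], [Violated]).
   The one-event-per-line trace format is a hypothetical assumption. *)

let event_of_line line =
  match String.trim line with
  | "open"  -> Some Open
  | "read"  -> Some Read
  | "close" -> Some Close
  | _       -> None   (* ignore unrecognised lines *)

(* Read a whole trace file into a list of events. *)
let read_trace path =
  let ic = open_in path in
  let rec loop acc =
    match input_line ic with
    | line ->
        let acc =
          match event_of_line line with
          | Some e -> e :: acc
          | None   -> acc
        in
        loop acc
    | exception End_of_file -> close_in ic; List.rev acc
  in
  loop []

let () =
  match monitor (read_trace "trace.log") with
  | Violated -> print_endline "trace violates the property"
  | Holds    -> print_endline "trace satisfies the property"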
The competition will follow three phases:

• Benchmark/specification collection phase: participants are invited to submit their benchmarks (C or Java programs and/or traces). The organizers will collect them in a common, publicly available repository. Participants will then train their tools on the shared benchmarks.
• Monitor collection phase: participants are invited to submit their monitors. Participants whose tools/monitors meet the qualification requirements will be qualified for the evaluation phase.
• Evaluation phase: the qualified tools will be evaluated on the submitted benchmarks and ranked using different criteria (e.g., memory utilization, CPU utilization). The final results will be presented at the RV 2015 conference.

The detailed description of each phase will be available on the RV 2015 website at http://rv2015.conf.tuwien.ac.at.

Expected Important Dates

January 30, 2015: Declaration of intent (email: crv15.chairs@imag.fr)
March 15, 2015: Submission deadline for benchmark programs and the properties to be monitored
March 30, 2015: Tool training by participants starts
May 30, 2015: Monitor submission
June 30, 2015: Notifications
At RV 2015: Presentation of results