From: Yliès Falcone <ylies.falcone@imag.fr>
Date: Mon, 9 May 2016 12:17:51 +0200
To: caml-list@inria.fr, qest-announce@iti.uiuc.edu
Subject: [Caml-list] CRV 2016 - The 3rd International Competition on Runtime Verification

CRV 2016
The 3rd International Competition on Runtime Verification
In Association with COST Action “Runtime Verification beyond Monitoring”
Held with RV 2016, September 23-30, 2016, Madrid, Spain
https://rv2016.imag.fr/?page_id=188

CRV 2016 is the 3rd International Competition on Runtime Verification, held as part of the 16th International Conference on Runtime Verification in September 2016 in Madrid, Spain. CRV 2016 will draw attention to the invaluable effort of software developers and researchers who contribute to this field by providing the community with new or updated tools, libraries and frameworks for the instrumentation and runtime verification of software. The competition is a product of COST Action “Runtime Verification beyond Monitoring”; see https://www.cost-arvi.eu/ for more information.
Runtime Verification is a verification technique for the analysis of software at execution time, based on extracting information from a running system and checking whether the observed behaviors satisfy or violate the properties of interest. During the last decade, many important tools and techniques have been developed and successfully employed. However, there is a pressing need to compare such tools and techniques, since we currently lack a common benchmark suite as well as scientific evaluation methods to validate and test new prototype runtime verification tools.

The main aims of CRV 2016 are to:

• Stimulate the development of new, efficient and practical runtime verification tools, and the maintenance and improvement of those already developed.
• Produce a benchmark suite for runtime verification tools, by sharing case studies and programs that researchers and developers can use in the future to test and validate their prototypes.
• Discuss the metrics employed for comparing the tools.
• Provide a comparison of the tools on different benchmarks and evaluate them using different criteria.
• Enhance the visibility of the presented tools among the different communities (verification, software engineering, cloud computing and security) involved in software monitoring.

Please direct any enquiries to the competition co-organizers (crv2016@crv.liflab.ca):

• Yliès Falcone (Univ. Grenoble Alpes, Inria, France)
• Sylvain Hallé (Université du Québec à Chicoutimi, Canada)
• Giles Reger (University of Manchester, Manchester, UK)

CRV 2016 Jury

The CRV Jury will include a representative for each participating team and the competition chairs. The Jury will be consulted at each stage of the competition to ensure that the rules set by the competition chairs are fair and reasonable.
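To illustrate the monitoring idea described above — checking whether an observed sequence of events satisfies a property — here is a minimal, self-contained sketch in Python. The property, event names, and `monitor` function are invented for illustration only; they are not the API of any competing tool.

```python
# Minimal offline-monitoring sketch: check the property
# "every 'open' event is eventually followed by a matching 'close'"
# over a finite, recorded trace of events. Real runtime verification
# tools use richer specification languages (e.g. LTL) and instrument
# running programs; this only conveys the core idea.

def monitor(trace):
    """Return 'ok' if the trace satisfies the property, else 'violation'."""
    open_count = 0
    for event in trace:
        if event == "open":
            open_count += 1
        elif event == "close":
            if open_count == 0:
                return "violation"  # 'close' without a matching 'open'
            open_count -= 1
    # Any 'open' left unmatched at the end of the trace is a violation.
    return "ok" if open_count == 0 else "violation"
```

For example, `monitor(["open", "close"])` yields `"ok"`, while `monitor(["open"])` yields `"violation"`; an online monitor would apply the same check incrementally as events are produced by the running system.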
Call for Participation

The main goal of CRV 2016 is to compare tools for runtime verification. We invite and encourage participation with benchmarks and tools for the competition. The competition will consist of three main tracks based on what is being monitored:

• Track on monitoring Java programs (online monitoring)
• Track on monitoring C programs (online monitoring)
  • Subtrack on Generic Specifications (e.g. in LTL)
  • Subtrack on Implicit Specifications (e.g. memory safety)
• Track on monitoring of traces (offline monitoring)

The general organisation of the competition is described in the rules document found at http://crv.liflab.ca/CRV2016.pdf.

To register, please fill in the form at http://goo.gl/forms/kWxFFfFCvZ.

Expected Important Dates

May 9th       Registration Opens
May 29th      Benchmark Submission Deadline
June 5th      Registration Closes
June 5-12th   Clarifications Phase
June 19th     Benchmarks Announced
July 10th     Monitor Submission Deadline
August 1st    Notifications
At RV 2016    Presentation of Results