From: Klaus Havelund <klaus.havelund@jpl.nasa.gov>
Date: Mon, 25 Nov 2013 13:23:19 -0800
To: fm-announcements@lists.nasa.gov
Subject: [Caml-list] [fm-announcements] 1st Intl. Competition of Software for Runtime Verification: call for participation

[Apologies for duplicates]

1st Intl. Competition of Software for Runtime Verification (CSRV-2014)
held with RV 2014 in Toronto, Canada
http://rv2014.imag.fr/monitoring-competition

CSRV-2014 is the 1st International Software Runtime Verification Competition, held as part of the 14th International Conference on Runtime Verification. The event will take place in September 2014 in Toronto, Canada. CSRV-2014 will draw attention to the invaluable effort of software developers and researchers who contribute to this field by providing the community with new or updated tools, libraries and frameworks for the instrumentation and runtime verification of software.

Runtime verification is a technique for analyzing software at execution time: information is extracted from a running system and checked to determine whether the observed behaviors satisfy or violate the properties of interest. During the last decade, many important tools and techniques have been developed and successfully employed. However, there is a pressing need to compare such tools and techniques, since we currently lack a common benchmark suite as well as scientific evaluation methods to validate and test new prototype runtime verification tools.
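To make the idea concrete, here is a minimal, purely illustrative sketch of an offline monitor in OCaml: it checks the hypothetical property "every Open is eventually matched by a Close, and no Close occurs without a prior unmatched Open" over a finite event trace. The event type, the property and the code are assumptions made for this sketch only; they are not part of the competition.

(* Scan the trace, counting resources currently open. The property
   holds iff no Close occurs without a matching Open and nothing
   remains open at the end of the trace. *)
type event = Open | Close | Other of string

type verdict = Satisfied | Violated

let monitor (trace : event list) : verdict =
  let rec go opened = function
    | [] -> if opened = 0 then Satisfied else Violated
    | Open :: rest -> go (opened + 1) rest
    | Close :: rest ->
        if opened > 0 then go (opened - 1) rest else Violated
    | Other _ :: rest -> go opened rest
  in
  go 0 trace

let () =
  match monitor [Open; Other "read"; Close] with
  | Satisfied -> print_endline "property satisfied"
  | Violated  -> print_endline "property violated"

An online monitor would consume the same events one at a time from the running program instead of from a pre-recorded list; an offline monitor, as in the third track below, reads them from a trace file.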
The main aims of the CSRV-2014 competition are to:

- Stimulate the development of new efficient and practical runtime verification tools, and the maintenance of those already developed.
- Produce a benchmark suite for runtime verification tools, by sharing case studies and programs that researchers and developers can use in the future to test and validate their prototypes.
- Discuss the metrics employed for comparing the tools.
- Provide a comparison of the tools run on different benchmarks and evaluated using different criteria.
- Enhance the visibility of the presented tools among the different communities (verification, software engineering, cloud computing and security) involved in software monitoring.

Please direct any enquiries to the competition co-organizers (csrv14.chairs@imag.fr):

Ezio Bartocci (Vienna University of Technology, Austria), ezio.bartocci@tuwien.ac.at;
Borzoo Bonakdarpour (University of Waterloo, Canada), borzoo@cs.uwaterloo.ca;
Yliès Falcone (Université Joseph Fourier, France), ylies.falcone@ujf-grenoble.fr.

CSRV-2014 Jury

The CSRV jury will include a representative of each participating team and some representatives of the Demonstration Tools Committee of the Runtime Verification conference.

Call for Participation

The main goal of the CSRV-2014 competition is to compare tools for runtime verification. We invite and encourage participation with benchmarks and tools. The competition will consist of three main tracks based on the input language used:

Track on monitoring Java programs (online monitoring);
Track on monitoring C programs (online monitoring);
Track on monitoring of traces (offline monitoring).

The competition will follow three phases:

- Benchmark/specification collection phase - the participants are invited to submit their benchmarks (C or Java programs and/or traces). The organizers will collect them in a common, publicly available repository. The participants will then train their tools on the shared benchmarks;
- Monitor collection phase - the participants are invited to submit their monitors. The participants whose tools/monitors (see more information in the following section) meet the qualification requirements will be qualified for the evaluation phase;
- Evaluation phase - the qualified tools will be run on the benchmarks and ranked using different criteria (e.g., memory utilization/overhead, CPU utilization/overhead, ...); a rough sketch of one such measurement follows the dates below. The final results will be presented at the RV 2014 conference.

Please refer to the dedicated pages for more details on the three phases.

Important Dates

Dec. 15, 2013 - Declaration of intent (by email to csrv14.chairs@imag.fr).
March 1, 2014 - Submission deadline for benchmark programs and the properties to be monitored.
March 15, 2014 - Tool training by participants starts.
June 1, 2014 - Monitor submission.
July 1, 2014 - Notifications and reviews.
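As promised above, here is a rough, purely illustrative sketch of how a relative CPU-time overhead criterion could be computed. The functions run_bare and run_monitored are hypothetical stand-ins for executing the same benchmark without and with a monitor attached; this is not an interface defined by the competition.

(* Illustrative only: relative CPU-time overhead of a monitored run
   over a bare run, using the OCaml standard library's [Sys.time],
   which reports processor time consumed by the program. *)
let cpu_overhead (run_bare : unit -> unit) (run_monitored : unit -> unit) =
  let time f =
    let t0 = Sys.time () in
    f ();
    Sys.time () -. t0
  in
  let bare = time run_bare in
  let monitored = time run_monitored in
  (monitored -. bare) /. bare

A value of 0.25 would mean the monitored run consumed 25% more CPU time than the bare run; memory overhead could be defined analogously over peak resident set size.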
---
To opt out from this mailing list, send an email to fm-announcements-request@lists.nasa.gov with the word 'unsubscribe' as subject or in the body. You can also make the request by contacting fm-announcements-owner@lists.nasa.gov