From: Klaus Havelund <klaus.havelund@jpl.nasa.gov>
Date: Tue, 23 Dec 2014 07:16:18 -0800
To: fm-announcements@lists.nasa.gov
Subject: [Caml-list] [fm-announcements] First CFP: CRV15 - 2nd Competition on Runtime Verification

CRV 2015
The 2nd International Competition on Runtime Verification,
held with RV 2015, September 22 – 25, 2015, Vienna, Austria

CRV-2015 is the 2nd International Competition on Runtime Verification and is part of the 15th International Conference on Runtime Verification. The event will be held in September 2015 in Vienna, Austria. CRV-2015 will draw attention to the invaluable effort of software developers and researchers who contribute to this field by providing the community with new or updated tools, libraries, and frameworks for the instrumentation and runtime verification of software.

Runtime verification is a verification technique for analyzing software at execution time: information is extracted from a running system and checked to determine whether the observed behaviors satisfy or violate the properties of interest. During the last decade, many important tools and techniques have been developed and successfully employed. However, there is a pressing need to compare such tools and techniques, since we currently lack both a common benchmark suite and scientific evaluation methods to validate and test new prototype runtime verification tools.

The main aims of CRV-2015 are to:

• Stimulate the development of new efficient and practical runtime verification tools, and the maintenance and improvement of already developed ones.
• Produce a benchmark suite for runtime verification tools by sharing case studies and programs that researchers and developers can use in the future to test and validate their prototypes.
• Discuss the metrics employed for comparing the tools.
• Provide a comparison of the tools on different benchmarks and evaluate them using different criteria.
• Enhance the visibility of presented tools among the different communities (verification, software engineering, cloud computing, and security) involved in software monitoring.

Please direct any enquiries to the competition co-organizers (crv15.chairs@imag.fr):

• Yliès Falcone (Université Joseph Fourier, France).
• Dejan Nickovic (AIT Austrian Institute of Technology GmbH, Austria).
• Giles Reger (University of Manchester, UK).
• Daniel Thoma (University of Luebeck, Germany).

CRV-2015 Jury

The CRV-2015 Jury will include a representative of each participating team and the competition chairs. The Jury will be consulted at each stage of the competition to ensure that the rules set by the competition chairs are fair and reasonable.

Call for Participation

The main goal of CRV 2015 is to compare tools for runtime verification. We invite and encourage participation with benchmarks and tools for the competition. The competition will consist of three main tracks based on the input language used:

• Track on monitoring Java programs (online monitoring).
• Track on monitoring C programs (online monitoring).
• Track on monitoring of traces (offline monitoring).

The competition will follow three phases:

• Benchmark/specification collection phase: participants are invited to submit their benchmarks (C or Java programs and/or traces). The organizers will collect them in a common, publicly available repository. The participants will then train their tools using the shared benchmarks.
• Monitor collection phase: participants are invited to submit their monitors. Participants whose tools/monitors meet the qualification requirements will advance to the evaluation phase.
• Evaluation phase: the qualified tools will be evaluated on the submitted benchmarks and ranked using different criteria (e.g., memory utilization, CPU utilization, ...). The final results will be presented at the RV 2015 conference.

The detailed description of each phase will be available on the RV 2015 website at http://rv2015.conf.tuwien.ac.at.

Expected Important Dates

January 15, 2015: Declaration of intent (email: crv15.chairs@imag.fr)
March 1, 2015: Submission deadline for benchmark programs and the properties to be monitored
March 15, 2015: Tool training starts by participants
May 15, 2015: Monitor submission
June 15, 2015: Notifications
At RV 2015: Presentation of results

---
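As a side illustration of the offline (trace-monitoring) setting described in the call: an offline monitor reads a recorded event trace and reports whether the observed behavior satisfies a property. The sketch below is purely hypothetical; the event vocabulary (acquire/release of a resource) and trace format are assumptions for illustration, not the CRV-2015 benchmark format.

```python
# Hypothetical offline monitor for the safety property:
# "no resource is acquired twice without an intervening release,
#  no resource is released unless held, and every acquired
#  resource is released by the end of the trace."
# The trace format (list of (event, resource) pairs) is an
# illustrative assumption, not a CRV-2015 specification.

def monitor(trace):
    """Return (verdict, index): verdict is True iff the trace
    satisfies the property; index is the position of the first
    violation, or len(trace) if none occurs mid-trace."""
    held = set()
    for i, (event, resource) in enumerate(trace):
        if event == "acquire":
            if resource in held:
                return False, i   # double acquire: violation
            held.add(resource)
        elif event == "release":
            if resource not in held:
                return False, i   # release without acquire: violation
            held.discard(resource)
    # At end of trace, every acquired resource must be released.
    return (not held), len(trace)

good = [("acquire", "lock1"), ("release", "lock1")]
bad = [("acquire", "lock1"), ("release", "lock2")]
print(monitor(good))  # (True, 2)
print(monitor(bad))   # (False, 1)
```

An online monitor for the Java or C tracks would observe the same kind of events as they are emitted by an instrumented running program, rather than reading them from a file afterwards.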

To opt out from this mailing list, send an email to fm-announcements-request@lists.nasa.gov with the word 'unsubscribe' as subject or in the body. You can also make the request by contacting fm-announcements-owner@lists.nasa.gov