Benchmark Template - Offline track

From CRV
 
 
'''Name of benchmark''' Two or three sentences that describe the benchmark. These can mention the domain, scope, or source of the benchmark, or might suggest its purpose, e.g. to demonstrate a certain feature. ''List of categories''
 
 
== Benchmark Data ==
 
  
=== The Trace Part ===
 
Please upload the trace files to the server and include the necessary information (see the rules document) here.
  
=== The Property Part ===
See the rules document for descriptions of what should be included in each of the following sections.
  
==== Informal Description ====
  
==== Demonstration Traces ====
  
==== Formal Specification ====
  
==== A FO-LTL Specification ====
  
 
== Clarification Requests ==
 
  
 
This is a space where other teams can ask for clarifications and the submitting team can answer. When a team requests a clarification, the team should be mentioned in the request and the request timestamped. The team submitting the benchmark should ensure they watch this page so that they get notifications of questions.
 

Latest revision as of 18:20, 20 May 2016
