Benchmark Template - Offline track
A two-to-three-line overview of the benchmark.
Benchmark Data
Specification Information
Describe the specification being monitored. The description should include:
- an informal description of the specification (with, whenever possible, explicit references to the events involved);
- a formal description of the property in the submitting team's specification language (an illustrative sketch follows this list).
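As a purely hypothetical illustration (the property, event names, and formalism are assumptions, not part of the template), a Java-track benchmark might monitor the classic HasNext property over iterators:

```latex
% Hypothetical example: the HasNext property over Java iterators.
% Informally: every call to next() on an iterator must be immediately
% preceded by a call to hasNext() on the same iterator returning true.
% Formally, in past-time LTL over the abstract events next and hasNext_true:
\Box \left( \mathit{next} \rightarrow \ominus\, \mathit{hasNext\_true} \right)
```

Here \ominus is the "previously" operator, so the formula requires the event immediately before every next to be hasNext_true.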
Instrumentation Information
Give details of how the events in the specification map to concrete events, e.g., if AspectJ is being used, give the pointcuts that would generate the events.
It is possible to map multiple concrete events to a single abstract event, e.g., add and remove on a list could both map to a single 'use' event. It is also possible to map a single concrete event to multiple abstract events, e.g., the concrete lock(object) event could be mapped to the lock(object1) and lock(object2) events appearing in a specification. A sketch of such a mapping is given below.
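Continuing the hypothetical HasNext example, an AspectJ mapping might look as follows; the aspect name and the emit(..) helper are illustrative assumptions, not a prescribed format:

```aspectj
import java.util.Iterator;

// Hypothetical instrumentation for the HasNext example above.
public aspect HasNextEvents {

    // Concrete event: a call to Iterator.hasNext() that returns true
    // maps to the abstract event 'hasNext_true'.
    after(Iterator i) returning (boolean result) :
            call(boolean Iterator+.hasNext()) && target(i) {
        if (result) {
            emit("hasNext_true", i);
        }
    }

    // Concrete event: any call to Iterator.next()
    // maps to the abstract event 'next'.
    before(Iterator i) : call(* Iterator+.next()) && target(i) {
        emit("next", i);
    }

    // Illustrative sink: a real benchmark would append to the trace file.
    private void emit(String event, Iterator i) {
        System.out.println(event + "," + System.identityHashCode(i));
    }
}
```

Note that call pointcuts are used rather than execution pointcuts, so call sites in the woven application are matched even though java.util itself is not woven.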
Trace Information
Provide a description of the monitored trace. The description should include:
- a link to the trace file(s) on the FTP server, e.g., Java_Track/1_TeamName/Benchmark1;
- for each event, the number of times it should be observed in the trace;
- the verdict, i.e., whether the trace satisfies the property (a hypothetical excerpt follows this list).
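Trace formats differ between teams; purely as an illustration (the CSV-like layout below is an assumption), a short monitored trace for the HasNext example, with its event counts and verdict, might be reported as:

```
# event,iterator-id   (hypothetical format)
hasNext_true,1
next,1
hasNext_true,1
next,1
next,1
```

Event counts: hasNext_true observed 2 times, next observed 3 times. Verdict: violation, since the final next is not preceded by a hasNext_true.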
Example (short) Traces
Give at least three valid and three invalid short traces to demonstrate the behaviour captured by the specification. Whenever possible, illustrate special cases; for instance, if there are multiple kinds of failure, give examples of each (illustrative traces follow).
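Continuing the hypothetical HasNext example, such traces might be:
- valid: hasNext_true . next
- valid: hasNext_true . next . hasNext_true . next
- valid: hasNext_true . hasNext_true . next (a redundant check is harmless)
- invalid: next (no preceding check at all)
- invalid: hasNext_true . next . next (the second next is unguarded)
- invalid: next . hasNext_true (the violation occurs on the very first event)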
Comments
Additional comments on the benchmark.
Clarification Requests
This is a space where other teams can ask for clarifications and the submitting team can answer. A team requesting a clarification should name itself in the request and timestamp it. The submitting team should watch this page so that they are notified of new questions.