Commit 178425cc authored by Martin Jeřábek's avatar Martin Jeřábek

Merge branch 'gtkw-gui' into 'master'

testfw: use waveform layout files for gtkwave

See merge request !233
parents 515e6d64 92f6c4ed
Pipeline #6649 passed in 41 seconds
# Testing Framework
Features:
* based on VUnit
* supports Modelsim and GHDL+gtkwave
* test suite configuration in YAML files
* assigning TCL and/or GHW waveform layout files to testcases
* and probably more
## Configuration options
* `wave`: a path to a TCL file with the waveform layout definition, used for
  Modelsim and gtkwave (if not overridden by `gtkw`). For gtkwave, it
  is internally converted to a gtkw file on each run (so modify the
  TCL file, not the generated gtkw).
* `gtkw`: a path to a GTKW file with the waveform layout definition for gtkwave; if
  set together with `wave`, `gtkw` takes precedence. The specified gtkw
  file is not modified.
* many more
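As an illustration, a configuration using these options might look as follows. The testcase names and paths here are made up; only the `wave` and `gtkw` keys come from the list above:

```yaml
# hypothetical testcases and paths -- only `wave`/`gtkw` are real options
sanity:
  wave: wave_layouts/sanity.tcl    # TCL layout, converted to gtkw on each run
feature:
  gtkw: wave_layouts/feature.gtkw  # ready-made gtkw file, used as-is
```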
## Using waveform layout files
* Specify the file in the YAML config, either as `gtkw` or `wave` (TCL). Later, this
might be extended to native gtkw-generating Python files.
* Run the tests with `--create-ghws`. This generates the signal and type hierarchies.
You should rerun this each time you modify a signal used in the layout (or add a signal to both the code and the layout).
* Run in GUI mode, using the VUnit `-g` flag.
# How it works
## Converting Modelsim TCL layout files to GTKW
In addition to setting GHW file directly, it is possible to specify a Modelsim
TCL layout file (with `add wave` commands), which is automatically converted to
GTKW. This has several layers, as described below.
### Interpreting the TCL file
Python comes with a TCL interpreter in the standard Tkinter module. The layout
files are thus processed by a full-fledged TCL interpreter. The `add wave`
command is then implemented in Python.
This is implemented in `gtkwave.py`.
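The technique can be sketched in a few lines. Everything below is illustrative (it is not the actual `gtkwave.py` code); it only shows how a Python function can be registered as a TCL command and intercept `add wave` calls:

```python
import tkinter

captured = []  # arguments of every intercepted `add wave` call

def add(*args):
    # Tcl hands over every word after the command name as a string, e.g.
    # `add wave -label Foo $sig` arrives as ('wave', '-label', 'Foo', <value>).
    if args and args[0] == 'wave':
        captured.append(args[1:])

tcl = tkinter.Tcl()                # a Tcl interpreter without any GUI window
tcl.tk.createcommand('add', add)   # implement the `add` command in Python
tcl.eval('set TCOMP tb/comp')      # stand-in for a layout-file variable
tcl.eval('add wave -label "Name of test" $TCOMP/test_name')
```

After running this, `captured` holds the fully substituted arguments, ready to be translated into GTKW directives.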
### Implementing `add wave` function
This function parses the arguments to `add wave` and creates a GTKW file using
the `gtkw.GTKWSave` class from the `pyvcd` package. Among its features are:
* setting the display format (dec, hex, bin, signed, ...)
* setting the color (only a limited palette of ~8 colors is supported by gtkwave;
  see `gtkwave.TclFuncs.conv_color` for details)
* grouping
* delimiters
* including all items of a record type in a group (optionally expanded by
  default)
As gtkwave supports displaying just a subset of a vector's bits, it requires the
(vector) signal to specify its full bit range, which is not included in the
Modelsim TCL files. This information therefore has to be obtained automatically.
### Getting signal hierarchy and type hierarchy
To detect the bit width of a vector, we need to be able to find the signal by its
fully qualified name (FQN) in the design. Furthermore, as a vector may occur
inside a record type, even the type hierarchy must be known.
This information is available in GHDL's GHW data dumps (analogue of VCD). We are
now faced with two problems: how to generate the file, and how to parse it.
#### Generating GHW files
The GHW file is generated when GHDL runs the test case. However, we need it
*before* we run the testcase, to be able to produce the GTKW file for gtkwave
(in `test_common.TestsBase.add_modelsim_gui_file`).
The GHW files are thus generated manually, by running the test framework with
the `--create-ghws` argument. This could be automated, but the reasoning is that the
relevant signals do not change very often, while generating takes a non-trivial
amount of time (~2 s *per testcase*).
Now, what does `--create-ghws` do?
1. Append `--elaborate` to VUnit arguments. This causes the test cases to be
elaborated (GHDL runs and generates the GHW file), but not executed.
2. For each testbench, add GHDL simulation option `--wave=xxx.ghw`, which tells
GHDL to output the GHW file.
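The two steps above can be sketched in plain Python. The function and argument names below are made up for illustration; the real code drives VUnit and GHDL directly:

```python
def prepare_ghw_generation(vunit_args, testbenches):
    """Illustrative sketch of the two --create-ghws steps described above."""
    # 1. elaborate only: GHDL builds each testbench but executes no test
    args = list(vunit_args) + ['--elaborate']
    # 2. per testbench, a GHDL simulation option tells it to dump a GHW file
    sim_flags = {tb: ['--wave={}.ghw'.format(tb)] for tb in testbenches}
    return args, sim_flags
```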
#### Parsing GHW files
GHW files are binary. Non-standard. Not documented. With an unstable format. Only
2 known implementations exist -- one in GHDL (write), one in gtkwave (read).
Fortunately, gtkwave comes with a useful program `ghwdump`, which can output
both signal hierarchy (with `-h` flag) and types (`-t` flag). The output of
`ghwdump` is then parsed in Python (package `testfw.ghw_parse`).
##### Signal hierarchy
Parsing signal hierarchy is very simple, because it is structured as a tree,
with levels differentiated by indentation.
```
design
instance tb_sanity:
instance t_sanity:
port-out errors: natural: #18
signal error_ctr: natural: #21
```
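A minimal sketch of such an indentation-based parser (illustrative only; the real implementation lives in `testfw.ghw_parse.hierarchy` and builds richer objects):

```python
def parse_tree(text):
    """Build a nested dict from ghwdump -h style output, one space per level.

    Assumes well-formed input where indentation grows by at most one level
    per line, as ghwdump produces.
    """
    root = {}
    stack = [root]  # stack[depth] is the dict to insert into at that depth
    for line in text.splitlines():
        stripped = line.lstrip()
        if not stripped:
            continue
        depth = len(line) - len(stripped)
        node = {}
        stack[depth][stripped] = node
        del stack[depth + 1:]   # dedent: drop deeper levels
        stack.append(node)      # children of this line go one level deeper
    return root

tree = parse_tree(
    "design\n"
    " instance tb:\n"
    "  signal error_ctr: natural: #21\n"
)
```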
##### Type hierarchy
Types are printed as VHDL code. The parsing is thus a bit harder and uses a
real parser: the `parsy` Python module, modelled after Haskell's `parsec`. A
little preprocessing is needed to fix some bogus `ghwdump` output. The
implementation probably does not understand all valid VHDL type definitions, and
probably accepts some invalid ones. It is, however, not meant to be a
conformant VHDL parser, but to Just Make It Work In Our Case™.
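For a flavor of the input involved, here is a regex-only sketch that pulls the pieces out of one simple `subtype` declaration (the real implementation uses a proper `parsy` grammar and handles far more cases):

```python
import re

# One declaration as ghwdump -t might print it (taken from the comments
# in the parser source).
decl = "subtype runner_sync_t is std_ulogic_vector (0 to 2);"

# Capture: subtype name, base type, left bound, direction, right bound.
m = re.match(
    r"subtype\s+(\w+)\s+is\s+(\w+)\s*\((\d+)\s+(to|downto)\s+(\d+)\);",
    decl)
name, base, left, direction, right = m.groups()
```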
@@ -59,7 +59,7 @@ quietly set PROTOCOL_CONTROL "protocol_control_comp"
 #Add common waves for each test entity
 add_test_status_waves
-add wave error_ctr
+add wave $TCOMP/error_ctr
 add wave -label "Name of test" $TCOMP/test_name
 #Add circuit specific signals
@@ -10,7 +10,7 @@ from os.path import abspath
 from .log import MyLogRecord
 d = Path(abspath(__file__)).parent
-func_cov_dir = os.path.join(str(d.parent), "build/functional_coverage")
+func_cov_dir = d.parent / "build/functional_coverage"
def setup_logging() -> None:
@@ -76,8 +76,10 @@ def create():
 @click.argument('vunit_args', nargs=-1)
 @click.option('--strict/--no-strict', default=True,
               help='Return non-zero if an unconfigured test was found.')
+@click.option('--create-ghws/--no-create-ghws', default=False,
+              help='Only elaborate and create the GHW files necessary for converting TCL layout files to GTKW files for gtkwave.')
 @click.pass_obj
-def test(obj, config, strict, vunit_args):
+def test(obj, config, strict, create_ghws, vunit_args):
     """Run the tests. Configuration is passed in a YAML config file.
     You may pass arguments directly to VUnit by appending them at the end of the command.
@@ -108,6 +110,10 @@ def test(obj, config, strict, vunit_args):
     build.mkdir(exist_ok=True)
     os.chdir(str(build))
+    if create_ghws:
+        # discard the passed vunit_args; they would only interfere here
+        vunit_args = ['--elaborate']
     ui = create_vunit(obj, vunit_args, out_basename)
     lib = ui.add_library("lib")
@@ -124,12 +130,10 @@ def test(obj, config, strict, vunit_args):
     tests = []
     for cfg_key, factory in tests_classes:
         if cfg_key in config:
-            tests.append(factory(ui, lib, config[cfg_key], build, base))
+            tests.append(factory(ui, lib, config[cfg_key], build, base, create_ghws=create_ghws))
-    if not os.path.exists(func_cov_dir):
-        os.makedirs(func_cov_dir);
-        os.makedirs(os.path.join(func_cov_dir, "html"))
-        os.makedirs(os.path.join(func_cov_dir, "coverage_data"))
+    (func_cov_dir / "html").mkdir(parents=True, exist_ok=True)
+    (func_cov_dir / "coverage_data").mkdir(parents=True, exist_ok=True)
     for t in tests:
         t.add_sources()
"""Tools to parse hierarchy from GHW files and find out array ranges.
"""
from . import hierarchy
from . import types
import subprocess as sp
from pathlib import Path
from pprint import pprint
import re
import pickle
from functools import wraps
from typing import Tuple, List
import hashlib
GHWDUMP = 'ghwdump'
def cached(f):
@wraps(f)
def wrapper(path: Path):
cache = path.with_suffix('.{}.pck'.format(f.__name__))
h = hashlib.sha256()
with path.open('rb') as ff:
h.update(ff.read())
curr_hash = h.digest()
old_hash = None
if cache.is_file():
with cache.open('rb') as ff:
old_hash, data = pickle.load(ff)
if old_hash != curr_hash:
data = f(path)
with cache.open('wb') as ff:
pickle.dump((curr_hash, data), ff)
return data
return wrapper
#@cached
def parse_hierarchy(ghw: Path):
res = sp.run([GHWDUMP, '-h', str(ghw)], stdout=sp.PIPE)
return hierarchy.parse(res.stdout.decode('ascii'))
#@cached
def parse_types(ghw: Path):
res = sp.run([GHWDUMP, '-t', str(ghw)], stdout=sp.PIPE)
return types.parse(res.stdout.decode('latin1'))
"""
TODO:
- parse types
- represent the hierarchy as proper tree
- may be dict, keys are id()s, values the whole thing
- find
- separates the name into name and (optional) index
- searches for both name-only and name-with-index
- in case the matching item is a signal
- descend into type
"""
def inject_types(h, ts):
if hasattr(h, 'type'):
parsed = types.parse_type(h.type)
t = types.resolve(parsed, ts)
h.type = t
if hasattr(h, 'children'):
for c in h.children.values():
inject_types(c, ts)
def find(h, fqn):
names = fqn.split('.')
curr = h
def children(o):
return o.items if isinstance(o, types.t_record) else o.children
for name in names:
if name in children(curr):
curr = children(curr)[name]
elif isinstance(curr, hierarchy.signal):
# the name may refer to a signal, which is a record ...
type = curr.type
assert isinstance(type, types.t_record)
curr = type.items[name]
else:
# the name may refer to a signal, which is an array ...
m = re.match(r'^([^(]+)\(([0-9]+)\)$', name)
if not m:
raise KeyError('{} (from {}) not found in {}'
.format(name, fqn, list(curr.children.keys())),
name, fqn, list(curr.children.keys()))
#print(name, m.groups())
name, index = m.groups()
index = int(index)
if name in children(curr):
signal = children(curr)[name]
type = signal.type
#print(type)
range, type = strip_array(type)
# TODO: check range: type.range.contains(index)
while isinstance(type, types.t_subarray):
type = type.type
assert isinstance(type, types.t_array)
type = type.type
curr = type
else:
raise KeyError('{} (from {}) not found in {}'
.format(name, fqn, list(curr.children.keys())))
if not isinstance(curr, types.t_base):
curr = curr.type
return curr
def is_array(type) -> bool:
return isinstance(type, types.t_subarray) or \
isinstance(type, types.t_array)
def is_record(type) -> bool:
return isinstance(type, types.t_record)
def strip_array(type) -> Tuple[List[types.t_range], types.t_base]:
ranges = type.ranges
while isinstance(type, types.t_subarray):
type = type.type
return ranges, type
@cached
def parse(path: Path):
h = parse_hierarchy(path)
t = parse_types(path)
inject_types(h, t)
return h
if __name__ == '__main__':
h = parse(Path('wave.ghw'))
#print(type(h))
#pprint(attr.asdict(h))
x = find(h, 'tb_feature.test_comp.g_inst(1).can_inst.drv_bus')
pprint(x)
pprint(strip_array(x))
x = find(h, 'tb_feature.test_comp.p(1).clk_sys')
pprint(x)
#pprint(strip_array(x))
import subprocess as sp
from pathlib import Path
from pprint import pprint
import attr
from typing import Dict
@attr.s
class hbase:
name = attr.ib() # type: str
children = attr.ib(factory=dict, kw_only=True) # type: Dict[str, hbase]
@attr.s
class design(hbase):
@classmethod
def create(cls, dummy) -> 'design':
return cls(name='')
@attr.s
class signal(hbase):
type = attr.ib(kw_only=True)
extra = attr.ib(kw_only=True)
kind = attr.ib(kw_only=True) # signal / port-in / port-out / port-inout
@classmethod
def create(cls, type, name, extra) -> 'signal':
tp, id = extra.split(':', 1)
return cls(name=name, type=tp, extra=id.strip(), kind=type)
@attr.s
class container(hbase):
kind = attr.ib(kw_only=True) # package / process / instance / generate-if
@classmethod
def create(cls, type, name) -> 'container':
return cls(name=name, kind=type)
@attr.s
class generate_for(hbase):
index = attr.ib(kw_only=True)
@classmethod
def create(cls, type, name, extra) -> 'generate_for':
return cls(name=name+extra, index=extra) # TODO
def factory(line: str):
type, rest = line.split(' ', 1) if ' ' in line else (line, '')
name_rest = rest.split(':', 1)
r = tuple(x.strip() for x in [type]+name_rest if x.strip())
#print(r)
factories = {
'design': design.create,
'signal': signal.create,
'port-in': signal.create,
'port-out': signal.create,
'port-inout': signal.create,
'generate-for': generate_for.create,
'package': container.create,
'process': container.create,
'instance': container.create,
'generate-if': container.create,
'block': container.create,
}
fact = factories[r[0]]
return fact(*r)
def parse(s: str) -> dict:
"""Parse signal hierarchy from `ghwdump -h`."""
lines = s.splitlines()
curr_ws = 0
last = dict()
stack = [last]
for i, line in enumerate(lines[:4096]):
l = line.lstrip()
if not l:
continue
ws = len(line) - len(l)
if ws > curr_ws:
assert curr_ws + 1 == ws
stack.append(last)
elif ws < curr_ws:
for _ in range(curr_ws-ws):
stack.pop()
#print("{:4d}: ws {:2d}; depth {:2d}: |{:<100} -- parent = {}".format(i, ws, len(stack)-1, line, ""))
obj = factory(l)
last = obj.children
stack[-1][obj.name] = obj
curr_ws = ws
#pprint(stack)
return next(stack[0].values().__iter__())
import subprocess as sp
from pathlib import Path
from pprint import pprint
import attr
from typing import Dict, OrderedDict, List, Any, Optional
import re
from enum import Enum
from parsy import string, regex, seq
@attr.s
class t_base:
name: Optional[str] = attr.ib(default=None, kw_only=True)
@attr.s
class t_ref:
"""Unresolved reference to another type."""
name: str = attr.ib(kw_only=True)
class Direction(Enum):
to = 'to'
downto = 'downto'
@attr.s
class t_range:
left: int = attr.ib(kw_only=True)
direction: Direction = attr.ib(kw_only=True)
right: int = attr.ib(kw_only=True)
range_type: t_base = attr.ib(kw_only=True, default=t_ref(name='integer'))
@attr.s
class t_array(t_base):
type: t_base = attr.ib(kw_only=True)
ranges: List[t_range] = attr.ib(kw_only=True)
@attr.s
class t_subarray(t_base):
type: t_array = attr.ib(kw_only=True)
ranges: List[t_range] = attr.ib(kw_only=True)
@attr.s
class t_record(t_base):
items: OrderedDict[str, t_base] = attr.ib(kw_only=True)
@attr.s
class t_enum(t_base):
items: List[Any] = attr.ib(kw_only=True)
@attr.s
class t_number(t_base):
type: t_base = attr.ib(kw_only=True)
range: t_range = attr.ib(kw_only=True)
@attr.s
class t_unconstrained_range(t_base):
pass
@attr.s
class type_binding:
name: str = attr.ib(kw_only=True)
type: t_base = attr.ib(kw_only=True)
@classmethod
def create(cls, factory, *, name, **kwds) -> 'type_binding':
return cls(name=name, type=factory(**kwds))
def tb(factory):
def f(**kwds):
return type_binding.create(factory=factory, **kwds)
return f
def tbs(type_bindings: List[type_binding]) -> Dict[str,t_base]:
res = dict()
for t in type_bindings:
# some array types are defined in 2 steps:
# type mem_bus_arr_t is array (integer range <>) of avalon_mem_type;
# subtype mem_bus_arr_t is mem_bus_arr_t (1 to 2);
# This takes care of that.
if t.name in res:
res[t.name] = resolve(t.type, res)
else:
res[t.name] = t.type
return res
comment = regex('--.*$', re.MULTILINE).desc('comment')
ws = (regex(r'\s+') >> comment.optional()).at_least(1).desc('whitespace')
ows = ws.optional().desc('ows')
_type = string('type')
_subtype = string('subtype')
_is = string('is')
_end = string('end')
_record = string('record')
_to = string('to')
_downto = string('downto')
_range = string('range')
_array = string('array')
_of = string('of')
_units = string('units')
_rng_unconstrained = string('<>').result(t_unconstrained_range())
LP = ows >> string('(') << ows
RP = ows >> string(')') << ows
COMMA = ows >> string(',') << ows
INT = regex(r'[+-]?\d+').map(int).desc('integer')
identifier = regex(r'[a-zA-Z_][a-zA-Z_0-9]*').desc('identifier')
t_name = identifier.tag('name')
T_REF = seq(t_name).combine_dict(t_ref)
list_item = INT | identifier | regex(r"'.'")
comma_sep_list = LP >> list_item.sep_by(COMMA) << RP
enum = seq(_type >> ws >> t_name,
ws >> _is >> ows >> comma_sep_list.tag('items')
).combine_dict(tb(t_enum))
# --- Enum
# 1 to 2, N downto M, <>
range = seq(list_item.tag('left'),
ws >> (_to | _downto).map(Direction).tag('direction') << ws,
list_item.tag('right')
).combine_dict(t_range).desc('Range') | _rng_unconstrained
# std_logic, std_logic_vector(1 to 2)
Type = T_REF.desc('type')
type_int_range = seq(Type.tag('type'),
ws >> _range >> ws >> range.tag('range')
).combine_dict(t_number)
type_or_array = seq(Type.tag('type'),
(LP >> range.sep_by(COMMA, min=1).tag('ranges') << RP)
).combine_dict(t_subarray).desc('array') | type_int_range | Type
# --- Record
record_item = seq(ows >> t_name,
regex(r'\s*:\s*') >> type_or_array.tag('type') << string(';')
).combine_dict(type_binding)
record = seq(_type >> ws >> t_name,
ws >> _is >> ws >> _record >> ws >> record_item.many().map(tbs).tag('items') <<
(ows >> _end >> ws >> _record)).combine_dict(tb(t_record))
# type integer is range <>
td_rng_1 = seq(_type >> ws >> t_name <<
ws << _is << ws << _range << ws << _rng_unconstrained).combine_dict(tb(t_base))
# type natural is integer range 0 to 123
td_rng = seq((_type | _subtype) >> ws >> t_name,
ws >> _is >> ws >> T_REF.tag('type'),
ws >> _range >> ws >> range.tag('range')).combine_dict(tb(t_number)) \
| td_rng_1
# type std_ulogic_vector is array (natural range <>) of std_ulogic;
td_arr = seq(_type >> ws >> t_name,
ws >> _is >> ws >> _array >> LP >> (T_REF.tag('range_type') >>
ws >> _range >> ws >> range).sep_by(COMMA, min=1).tag('ranges'),
RP >> _of >> ws >> type_or_array.tag('type')).combine_dict(tb(t_array)).desc('arr')
# type A is B
td_alias = seq((_type | _subtype) >> ws >> t_name,
ws >> _is >> ws >> T_REF.tag('type')
).combine_dict(type_binding)
# subtype runner_sync_t is std_ulogic_vector (0 to 2);
td_subarr = seq(_subtype >> ws >> t_name,
ws >> _is >> ws >> T_REF.tag('type'),
LP >> range.sep_by(COMMA, min=1).tag('ranges') << RP).combine_dict(tb(t_subarray)).desc('subarr')
# type time is range <> units ... end units;
td_units = seq(_type >> ws >> t_name <<
ws << _is << ws << _range << ws << range << ws << _units <<
regex('.*?end units', re.DOTALL)).combine_dict(tb(t_base)).desc('units')
type_def = td_units | enum | record | td_rng | td_arr | td_subarr | td_alias
top = (ows >> type_def << ows << regex(r';')).many().map(tbs) << ows
def resolve(o, top: Dict[str, t_base]):
"""Resolve the `t_ref` type references."""
if isinstance(o, t_ref):
res = resolve(top[o.name], top)
        res.name = o.name  # set the name, so that the information is kept
return res
elif isinstance(o, t_base):
return type(o)(**resolve(o.__dict__, top))
elif isinstance(o, dict):
return dict((k, resolve(v, top)) for k, v in o.items())
else:
return o