Abstract:
We introduce Vis-Trec, an open-source, cross-platform system for in-depth analysis of the results of TREC-style evaluation campaigns. Vis-Trec allows researchers to dig deeper into their evaluations by visualizing results by performance percentile and query difficulty, and by comparing different methods at the query level through help-hurt diagrams. It also automatically organizes the results into LaTeX tables that can be used for reporting evaluation findings. An added benefit of Vis-Trec is that it is written in Python and is extensible by other developers. The source code, along with a functional version of the program, is released to the public.