There are lies, damn lies and benchmarks.
One of our competitors has published benchmark results for its XPS viewing software.
To absolutely no one's surprise, the study reveals that their solution is superior to every other solution on the market today, both in features and in rendering performance and quality.
The document presents a feature comparison. But it leaves out the numerous features (like PDF conversion, editing, unlimited zoom, etc.) that our competitor's product lacks but some of the other tested products do have. (Of course, we have to speculate on this, as the competitor's product is not generally available: it can neither be downloaded for trial nor purchased from their site.)
The document also contains benchmark tests.
It's not very difficult to make sure that your application performs well on a set of test files.
What matters is whether your application performs well on the files that actual users are generating and sharing. And I seriously doubt that the benchmark in question is a good test of that.
I've been following the XPS market for quite a while now, and one thing this market does seem to have in abundance is benchmark suites.
But the thing is, benchmarks really don't tell you much.
They don't tell you that the tested software runs great on your files.
They tell you it runs great on the test set.
And if a vendor publishes these results, you can bet they have made sure their software performs well on those few files.
It's a questionable tactic for discrediting competitors, and its relevance to you as a customer is, I think, not that great.
I wouldn't buy the snake oil.