stirmark benchmark 4.0
In November 1997, the first version of StirMark was published as a generic tool for simple robustness testing of image watermarking algorithms. It introduced random bilinear geometric distortions to de-synchronise watermarking algorithms. Several versions followed, improving the original attack but also introducing longer lists of tests. In January 1999 we discussed the urgent need for fair evaluation procedures for watermarking systems, and a first benchmark was made possible with the release of StirMark 3.1.
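The random bilinear distortion at the heart of the original attack can be sketched as follows. This is a minimal illustrative implementation only, not StirMark's actual code; the function name, parameters, and the nearest-neighbour resampling are assumptions made for the example:

```python
import random

def stir(image, strength=1.0, seed=0):
    """Illustrative random bilinear geometric distortion (not StirMark's code):
    each image corner is displaced by a small random offset, and every pixel
    is resampled through the resulting bilinear warp."""
    rng = random.Random(seed)
    h, w = len(image), len(image[0])
    # Randomly displaced corner coordinates (row, col): TL, TR, BL, BR.
    tl, tr, bl, br = [(r + rng.uniform(-strength, strength),
                       c + rng.uniform(-strength, strength))
                      for r, c in ((0, 0), (0, w - 1),
                                   (h - 1, 0), (h - 1, w - 1))]
    out = []
    for y in range(h):
        v = y / (h - 1) if h > 1 else 0.0
        row = []
        for x in range(w):
            u = x / (w - 1) if w > 1 else 0.0
            # Bilinearly interpolate the four perturbed corners to find
            # the source coordinate for this output pixel.
            sr = (1-u)*(1-v)*tl[0] + u*(1-v)*tr[0] + (1-u)*v*bl[0] + u*v*br[0]
            sc = (1-u)*(1-v)*tl[1] + u*(1-v)*tr[1] + (1-u)*v*bl[1] + u*v*br[1]
            # Nearest-neighbour sampling, clamped at the image borders.
            r = min(max(int(round(sr)), 0), h - 1)
            c = min(max(int(round(sc)), 0), w - 1)
            row.append(image[r][c])
        out.append(row)
    return out
```

With `strength=0` the warp reduces to the identity; small positive values produce the slight, almost imperceptible geometric shift that de-synchronises a detector which expects pixels at their original positions.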
The natural extension of this work was an automated, independent public service with extended evaluation profiles to evaluate watermarking libraries quickly. This was the goal of the StirMark Benchmark Service project.
A first building block for this service was the new StirMark Benchmark evaluation engine that you can download from this page. The previous version of StirMark has been completely re-written so that you can easily plug in your watermarking library and have it evaluated using an evaluation profile (basically a list of tests and images) you specify. During the re-writing we have also separated the different components of the engine more clearly, so you can code your own attacks. If you do so, you can send the code to us (provided it is compliant with the StirMark Benchmark licence) and we will happily include it in the main distribution.
usage & download
If you use StirMark for your research, please cite:
Using StirMark for any other purpose than research or evaluation of copyright marking systems is prohibited, at least in Europe: ‘Member States shall provide adequate legal protection against the circumvention without authority of any effective technological measures designed to protect any copyrights or any rights related to copyright as provided by law or the sui generis right provided for in chapter III of European Parliament and Council Directive 96/9/EC’.
Last update: Tuesday, 10 January 2012 22:02:19 -0000
Copyright © 1997–2012 by Fabien Petitcolas