Watermarking schemes evaluation
Attachment: F. A. P. Petitcolas, Watermarking schemes.doc (102.5 KB)
Abstract—Digital watermarking has been presented as a solution to copy protection of multimedia objects, and dozens of schemes and algorithms have been proposed. However, two main problems cast serious doubt on the future of this technology.
Firstly, the large number of attacks and weaknesses, which appear as fast as new algorithms are proposed, emphasizes the limits of this technology and, in particular, the fact that it may not match users' expectations.
Secondly, the requirements, tools and methodologies to assess the current technologies are almost non-existent. The lack of benchmarking of current algorithms is blatant. This confuses rights holders as well as software and hardware manufacturers and prevents them from choosing the solution appropriate to their needs. Indeed, basing long-lived protection schemes on badly tested watermarking technology does not make sense.
In this paper we will discuss how one could solve the second problem by having a public benchmarking service. We will examine the challenges behind such a service.
Index Terms—watermarking, robustness, evaluation, benchmark
Digital watermarking remains a largely untested field, and only very few large industrial consortia have published requirements against which watermarking algorithms should be tested [ , ]. For instance, the International Federation of the Phonographic Industry led one of the first large-scale comparative tests of watermarking algorithms for audio. In general, a number of broad claims have been made about the ‘robustness’ of various digital watermarking or fingerprinting methods, but very few researchers or companies have published extensive tests of their systems.
The growing number of attacks against watermarking systems (e.g., [ , , ]) has shown that far more research is required to improve the quality of existing watermarking methods so that, for instance, the forthcoming JPEG 2000 (and other new multimedia standards) can be more widely used within electronic-commerce applications.
We already pointed out in [ ] that most papers have used their own limited series of tests, their own pictures and their own methodology, and that consequently comparison was impossible without re-implementing the methods and testing them separately. But then the implementation might be very different from, and probably weaker than, that of the original authors. This led us to suggest that methodologies for evaluating existing watermarking algorithms were urgently required, and we proposed a simple benchmark for still-image marking algorithms.
With a common benchmark, authors and watermarking software providers would just need to provide a more or less detailed table of results, which would give a good and reliable summary of the performance of the proposed scheme. End users could then check whether their basic requirements are satisfied. Researchers could compare different algorithms and see how a method can be improved, or whether a newly added feature actually improves the reliability of the whole method. The industry could properly evaluate the risks associated with the use of a particular solution by knowing which level of reliability each contender can achieve. Watermarking system designers can also use such evaluations to identify possible weak points during the early development phase of a system.
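To make the idea of such a table of results concrete, here is a minimal sketch of how benchmark entries could be recorded and summarized. All names, attacks and values are hypothetical illustrations, not part of any actual benchmark:

```python
# Hypothetical sketch of a benchmark result table for one watermarking
# scheme: each row records whether the mark was still detected after a
# given attack at a given strength. Illustrative only.
from dataclasses import dataclass

@dataclass
class BenchmarkResult:
    attack: str        # name of the attack applied to the marked image
    parameter: float   # attack strength (e.g., JPEG quality, noise level)
    detected: bool     # was the mark still detected afterwards?

def robustness_score(results):
    """Fraction of attack trials the mark survived."""
    return sum(r.detected for r in results) / len(results)

results = [
    BenchmarkResult("jpeg_compression", 75.0, True),
    BenchmarkResult("jpeg_compression", 25.0, False),
    BenchmarkResult("gaussian_noise", 0.01, True),
    BenchmarkResult("rotation_degrees", 5.0, False),
]
print(f"robustness score: {robustness_score(results):.2f}")  # 0.50
```

A real benchmark would of course report far more detail (per-attack curves, payload size, perceptual quality), but even a flat table like this already allows the comparisons described above.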
Evaluation per se is not a new problem and significant work has been done to evaluate, for instance, image compression algorithms or security of information systems [ ] and we believe that some of it may be re-used for watermarking.
Section II explains the scope of the evaluation we envisage. Section III reviews the types of watermarking schemes that an automated evaluation service could deal with. Section IV reviews the basic functionalities that need to be evaluated. Section V examines how each functionality can be tested. Finally, section VI argues the need for a third-party evaluation service and briefly sketches its architecture.
Scope of the evaluation
Watermarking algorithms are often used in larger systems designed to achieve certain goals (e.g., prevention of illegal copying). For instance, Herrigel et al. [ ] presented a system for trading images; this system uses watermarking technologies but relies heavily on cryptographic protocols.
Such systems may be flawed for reasons other than the watermarking itself; for instance, the protocol that uses the watermark may be wrong, or the random number generator used by the watermark embedder may be weak. In this paper we are only concerned with the evaluation of the watermarking itself (that is, the signal-processing aspects) within the larger system, not the effectiveness of the full system in achieving its goals.
Target of evaluation
The first step in the evaluation process is to clearly identify the target of evaluation, that is, the watermarking scheme (the set of algorithms required for embedding and extraction) subject to evaluation, and its purpose. The purpose of a scheme is defined by one or more objectives and an operational environment. For instance, we may wish to evaluate a watermarking scheme that allows automatic monitoring of audio tracks broadcast over radio.
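As a sketch, a target of evaluation could be recorded as a small data structure pairing the scheme with its stated purpose. The field names and example values below are purely illustrative assumptions:

```python
# Hypothetical description of a target of evaluation: the scheme under
# test plus its purpose (objectives and operational environment).
from dataclasses import dataclass, field

@dataclass
class TargetOfEvaluation:
    scheme: str                          # watermarking scheme under test
    media_type: str                      # e.g., "audio", "still image"
    objectives: list = field(default_factory=list)
    environment: str = ""                # operational environment

# The broadcast-monitoring example from the text, expressed as data.
toe = TargetOfEvaluation(
    scheme="ExampleMark-1",
    media_type="audio",
    objectives=["broadcast monitoring"],
    environment="radio broadcast chain",
)
print(toe.scheme, toe.objectives[0])
```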
Typical objectives found across the watermarking and copy protection literature include:
• Persistent identification of audio-visual signals: the mark carries a unique identification number (similar to an I.S.B.N.), which can be used as a pointer into a database. This gives the ability to manage the association of digital content with its related descriptive data, current rights holders, license conditions and enforcement mechanisms. This objective is quite general, as it may encompass many of the other objectives described below. However, one may wish to have the data related to the work stored in the work itself rather than in a central database, in order to avoid connecting to a remote server.
• Proof of creatorship, proof of ownership: the embedded mark is used to prove to a court who the creator or rights holder of the work is;
• Auditing: the mark carries information used to identify parties present in a transaction involving the work (the distributors and the end users). This audit trail shows the transfer of work between parties. Marks for identifying users are usually referred to as fingerprints;
• Copy-control marking: the mark carries information regarding the number of copies allowed. Such marks are used in the digital versatile disk copy-protection mechanism. In this system a work can be copied freely, copied once only, or never copied [ ].
• Monitoring of multimedia object usage: monitoring copyright liability can be achieved by embedding a license number into the work and having, for instance, an automated service constantly crawling the web or listening to the radio, checking the licensing and reporting infringement.
• Tamper evidence: special marks can be used in a way that allows detection of modifications introduced after the mark has been added.
• Labelling for user awareness: these marks are typically used by rights holders to warn end users that the work they ‘have in hand’ is copyrighted. For instance, whenever an end user tries to save a copyrighted image opened in a web browser or an image editor, he may get a warning encouraging him to purchase a license for the work.
• Data augmentation: this is not really in the scope of ‘digital watermarking’ but a similar evaluation methodology can be applied to it.
• Labelling to speed up search in databases.
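To illustrate one of the objectives above, the tamper-evidence idea can be sketched with a toy fragile mark: each pixel's least-significant bit is set to the parity of its remaining content bits, so changing any single content bit breaks the check. This is a teaching toy, not a real scheme (practical fragile watermarks use cryptographic hashes or signatures over larger blocks), and the eight-value "image" is an assumption for the example:

```python
# Toy fragile watermark for tamper evidence: the LSB of each pixel
# stores the parity of its content bits (bits 1-7). Any later change
# to a content bit makes the stored parity disagree with the content.

def embed(pixels):
    """Set each pixel's LSB to the parity of its content bits."""
    return [(p & 0xFE) | (bin(p >> 1).count("1") & 1) for p in pixels]

def is_authentic(pixels):
    """Check that every pixel's LSB still matches its content parity."""
    return all((p & 1) == (bin(p >> 1).count("1") & 1) for p in pixels)

image = [120, 43, 200, 17, 99, 250, 64, 5]
marked = embed(image)
print(is_authentic(marked))    # True

tampered = marked[:]
tampered[3] ^= 0x10            # flip one content bit of one pixel
print(is_authentic(tampered))  # False
```

Note the weakness of this toy: flipping an even number of content bits within one pixel would go unnoticed, which is exactly why real schemes replace the per-pixel parity with a cryptographic hash.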