
Wednesday, September 06, 2006

Open Season For Antimalware Testing?

Not exactly following on from Consumer Reports' (CR) testing of antispyware and antivirus products, but certainly drawn into closer media attention by the CR aftermath, a number of other antimalware testing reports have been published recently.

These reports, published or co-published by "security experts", cloud the picture in a security space that is inherently too complex and diverse to test fully. The consequence of these reports is a disservice to antimalware product users, as they bring confusion and panic.

This space is highly specialized and requires years of dedicated discipline and knowledge before one is able to fully understand the challenges, let alone communicate these challenges to end users and the mass media.

With the major antimalware vendors labelling these reports as flawed and bogus, a wrong impression is created: that the vendors are bullying newcomers to the antimalware space, and/or that the vendors are attacking the results because they expose weaknesses in the products. There's also an air of "we know more than you do" that some bloggers, forum posters, and especially trolls are using to create churn in a discussion that needs to happen.

However, I believe that this is not the case. Experienced testing organizations like Virus Bulletin, ICSA/Checkmark, AV-Test, and AV-Comparatives have been testing antimalware products in this market for years, and there are still times when the software companies dispute the testing results. The communication that comes out of those disputes helps create a better test methodology, one that the software companies will accept.

I've also seen antimalware vendors' attempts to communicate with the publishers of the testing reports go unanswered. If one won't even respond to queries about one's own report, it really weakens the value of the report's output.

One action that I have not seen (it might be due to my lack of visibility) is the participation of the experienced testing labs in this issue, other than AV-Comparatives. If these other organizations got involved in the open discussions, it might help reduce the finger-pointing among the parties involved.

Tests done by these various organizations need to be reproducible, so to speak. Well-documented methods and papers on testing antivirus products are available, especially from the established testing labs and experienced independent consultants, but I do not see such documentation for some of the recent reports. Perhaps the use of simulated malware is one of the reasons for such non-disclosure, and that flaws any testing of antimalware products from the start.

If the testing methodologies are sound and correct, there shouldn't be as much argument about the reports as there is right now.

Thus, the challenges coming out of this open season of antimalware testing would be:

  • creating and defining an open, accepted testing methodology whose results can be reproduced by any other testing lab or organization
  • the testing organizations need to be open to suggestions to improve their testing
  • finding a way to match the results from these testing organizations with mass-media outlets like CR, magazines, and other news media, perhaps by defining a list of criteria for the "end-user perceived value" of an antimalware product (a rough sketch follows this list), such as
    • speed of updates
    • usability
    • supportability
    • coverage
    • false positives
    • compatibility with other products
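
As a rough, hypothetical Python sketch only (the criterion weights, ratings, and product below are invented for illustration and not taken from any actual test report), here is how such "end-user perceived value" criteria might be rolled up into a single comparable score:

# Hypothetical sketch: rolling up "end-user perceived value" criteria
# into one comparable score. Criteria names come from the list above;
# the weights and example ratings are invented for illustration only.
CRITERIA_WEIGHTS = {
    "speed_of_updates": 0.25,
    "usability": 0.15,
    "supportability": 0.10,
    "coverage": 0.30,
    "false_positives": 0.15,  # higher rating = fewer false positives
    "compatibility": 0.05,
}

def perceived_value(ratings):
    """Weighted average of per-criterion ratings, each on a 0-10 scale."""
    return sum(CRITERIA_WEIGHTS[name] * ratings[name] for name in CRITERIA_WEIGHTS)

# Example with made-up numbers for a fictional product:
example = {"speed_of_updates": 8, "usability": 7, "supportability": 6,
           "coverage": 9, "false_positives": 8, "compatibility": 7}
print("end-user perceived value: %.1f / 10" % perceived_value(example))

Different readers would of course weight these criteria differently; the point is only that the criteria can be made explicit and the roll-up reproducible.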

With the Virus Bulletin and AVAR conferences in the coming months, there lie great opportunities for these issues to be discussed and addressed, if the publishers of the recent reports can attend and talk with the AV researchers face-to-face, most probably over an endless flow of alcohol. 8)

 

