Open Season For Antimalware Testing?
Not following directly from the CR testing of antispyware and antivirus products, but brought into closer media attention by the CR aftermath, a number of other antimalware testing reports have been published recently.
These reports, published or co-published by "security experts", have muddied the testing landscape of a security space that is inherently too complex and diverse to test fully. The result is a disservice to antimalware product users, as these reports bring confusion and panic.
This space is highly specialized and requires years of dedicated discipline and accumulated knowledge before one can fully understand the challenges, let alone communicate those challenges to end users and the mass media.
With the major antimalware vendors labelling these reports as flawed and bogus, a wrong impression is created: that the vendors are bullying newcomers into the antimalware space, and/or that they are attacking the results because these expose weaknesses in their products. There is also a "we know more than you do" impression that some bloggers, forum posters, and especially trolls are using to create churn in a discussion that needs to happen.
However, I believe this is not the case. Experienced testing organizations like Virus Bulletin, ICSA Labs/Checkmark, AV-Test, and AV-Comparatives have been testing antimalware products in this market for years, and there are still times when the software companies dispute the testing results. The communication that comes out of these disputes helps create a better test methodology, one that the software companies will accept.
I've also seen non-responses to communication attempts by antimalware vendors to the publishers of these testing reports. If one doesn't even want to respond to queries about one's report, it really weakens the value of the report's output.
One action I have not seen (it might be due to my lack of visibility) is the participation of the experienced testing labs on this issue, other than AV-Comparatives. If these other organizations got involved in the open discussions, it might help reduce the finger-pointing among the parties involved.
Tests done by these various organizations need a way to be reproduced, so to speak. Well-documented methods and papers for testing antivirus products are available, especially from the established testing labs and experienced independent consultants, but I do not see such documentation in some of the recent reports. Perhaps the use of simulated malware is one reason for such non-disclosure, which flaws any testing of antimalware products from the start.
If the testing methodologies are sound and correct, there shouldn't be as much argument about the reports as there are right now.
Thus, the challenges coming out of this open season of antimalware testing are:
- creating and defining an open, accepted testing methodology whose results can be reproduced by other testing labs/organizations
- testing organizations need to be open to suggestions to improve their testing
- finding a way to align the results from these testing organizations with mass media organizations like CR, magazines, and other news media, perhaps by defining a list of criteria for the "end-user perceived value" of an antimalware product, such as:
- speed of updates
- false positives
- compatibility with other products
With the Virus Bulletin and AVAR conferences coming up in the next few months, there lie great opportunities for these issues to be discussed and addressed, if the publishers of the recent reports can attend and talk with the AV researchers face-to-face, and most probably over an endless flow of alcohol. 8)
technorati: antispyware, AntiVirus, testing, Antimalware