AV-Comparatives Rates Anti-Malware Performance

Wednesday, December 23, 2009

Last week AV-Comparatives.org released the results of their recent "Whole Product Dynamic Test," which challenges anti-malware products to protect test systems as if in the real world.

This week they've got a new one for us - a performance test on the same collection of products. The virus experts timed a variety of commonplace actions with and without each product installed and compared the results to determine which had the lowest impact on system performance.

AV-Comparatives used just-defragmented disks for testing and worked to eliminate any external factors that might influence performance, even considering the temperature in the lab. They repeated each test several times and averaged the results. In several cases they ran the test and then immediately ran it again, to account for products that optimize repeated operations and therefore run more quickly after the first pass.
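AV-Comparatives hasn't published its test harness, but the repeat-and-average approach, including the cold-run-then-warm-run trick, can be sketched in a few lines of Python (the function names here are my own, purely for illustration):

```python
import statistics
import time


def time_action(action, runs=5):
    """Time a repeatable action several times and return the average,
    smoothing out run-to-run noise the way the testers averaged results."""
    timings = []
    for _ in range(runs):
        start = time.perf_counter()
        action()
        timings.append(time.perf_counter() - start)
    return statistics.mean(timings)


def cold_then_warm(action):
    """Run the action once untimed-into-warm state ('cold' pass), then
    time the repeats -- mirroring the run-it-twice approach for products
    that speed up after the first time."""
    cold = time_action(action, runs=1)   # first pass: caches/learning kick in
    warm = time_action(action, runs=5)   # later passes: steady-state speed
    return cold, warm
```

In practice you'd run such a harness twice, once on a clean system and once with the security product installed, and compare the two averages to estimate the product's overhead.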

Overall, their tests were very similar to the tests I use for evaluating performance in my reviews of security suites, except that they didn't attempt to measure impact on the system's boot time. The team also used results from the WorldBench test suite, noting how much the overall score went down with each product installed.

Of course low performance impact is just one element that defines a good security product. I could write a product with absolutely no impact on performance simply by programming it to do nothing at all. But for the most part the high scorers were also high-powered in their protection.

Norton AntiVirus 2010, Microsoft Security Essentials 1.0, avast! Free 5.0, and seven others received three stars (the highest rating, ADVANCED+); AVIRA AntiVir Premium 9.0 eked out the highest score. Even the low scorers, eScan AntiVirus 10.0 and Trustport Antivirus 2010, achieved a STANDARD (one-star) rating.

Get the rest of this story on PCMag's Security Watch blog.


