Tuesday, January 22, 2013

Working in the Microsoft Malware Protection Center must be tough. Here your company is among the biggest in the world, yet a little (but well-regarded) antivirus testing lab in Germany says your antivirus software doesn't cut the mustard. Not just once but twice in the last few months, Microsoft failed to receive certification from AV-Test.

Joe Blackbird, Program Manager at the Center, explained in a blog post that failing this test does not mean Microsoft's users aren't protected.

Transparency in Testing
In accordance with the principles of the Anti-Malware Testing Standards Organization (AMTSO), AV-Test makes no secret of the methodology used in conducting its ongoing certification tests. That means Blackbird could check and verify exactly why Microsoft Security Essentials didn't pass.

The test gives equal weight to three elements of security: protection (keeping new malware from infecting a clean system), repair (clearing out malware that's already present), and usability (doing the job without slowing the system or falsely accusing valid programs). Microsoft did OK in the repair and usability areas but got just 1.5 of 6 possible points for protection.
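To make the scoring concrete, here's a minimal sketch of how the three-part score adds up. Only the 1.5-point protection score comes from the test in question; the repair and usability figures are placeholders standing in for "did OK," and the 11-of-18-point certification threshold is my assumption about AV-Test's criteria at the time:

```python
# Minimal sketch of AV-Test's three-part certification scoring.
# Only the 1.5/6 protection score is from the test discussed here; the
# repair and usability numbers are illustrative placeholders, and the
# 11-point threshold is an assumption -- see AV-Test's published report.

CERTIFICATION_THRESHOLD = 11.0  # assumed minimum total, of 18 possible

scores = {
    "protection": 1.5,  # from the article: 1.5 of 6 possible points
    "repair": 4.0,      # placeholder for "did OK"
    "usability": 4.5,   # placeholder for "did OK"
}

total = sum(scores.values())
print(f"Total: {total} of 18")
print("Certified" if total >= CERTIFICATION_THRESHOLD else "Not certified")
```

With equal weighting, a dismal protection score drags the total down no matter how well repair and usability go.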

Not Real-World?
Blackbird's main contention is that this test doesn't mirror the real-world experience of Microsoft's customers. According to AV-Test, Microsoft missed 28 of 100 zero-day threats. However, Microsoft's telemetry shows that "99.997 percent of our customers hit with any 0-day did not encounter the malware samples tested in this test." Note that he's not talking about 99.997 percent of all customers. He's saying that, of those who encountered a zero-day threat of some kind, only 0.003 percent encountered one that AV-Test actually used.

But is this necessarily an indictment of the testing method? Let's look at it the other way around. Zero-day attacks occur constantly, in huge numbers. If the 0.003 percent represents just one customer, then roughly 33,000 customers have encountered such an attack. Of course, the number could be larger. AV-Test picked a random sample of 100 and just happened to find 28 that Microsoft couldn't detect. That sounds kind of bad, doesn't it?
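The back-of-the-envelope math behind that figure is simple; the one-customer floor is this article's hypothetical, not a Microsoft number:

```python
# Extrapolating from Blackbird's 99.997 percent figure.
# If 0.003 percent of zero-day-affected customers saw an AV-Test sample,
# and that slice were just one customer, how many customers encountered
# some zero-day attack overall?

overlap_fraction = 0.003 / 100          # 0.003 percent as a fraction
customers_seeing_tested_samples = 1     # hypothetical floor

affected_customers = customers_seeing_tested_samples / overlap_fraction
print(f"{affected_customers:,.0f} customers hit by some zero-day")
# -> 33,333 -- and more if that 0.003 percent is more than one customer
```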

Microsoft also missed 9 percent of the 216,000 files in the "recent malware" collection (roughly 19,000 samples), but, said Blackbird, the missed samples "don't represent what our customers encounter. When we explicitly looked for these files, we could not find them on our customers' machines."

Customer-Focused Prioritization
"In December 2012,", Blackbird continued, "we processed 20 million new potentially malicious files, and, using telemetry and customer impact to prioritize those files, added protection that blocked 4 million different malicious files on nearly 3 million computers. Those 4 million files could have been customer-impacting if we had not prioritized them appropriately." In other words, Microsoft fared poorly in this test due to their emphasis on prioritizing files that would actively impact their customers.

It's an interesting point, but other vendors manage to protect their users and also earn top scores from AV-Test. Bitdefender, F-Secure, and Trend Micro all received 6 of 6 possible points in the protection test.

I grant the point that AV-Test couldn't use all of the malware actually encountered by Microsoft's users in its test; that would be impossible. Antivirus testers must do their best to choose representative samples. However, my own hands-on testing of Microsoft Security Essentials, using samples that are far from zero-day, suggests it's just not as effective as the best antivirus products.

For more from Neil, follow him on Twitter @neiljrubenking.

Source: http://feedproxy.google.com/~r/ziffdavis/pcmag/~3/sqoIF8DgWIU/0,2817,2414448,00.asp

