
ESET missing from AV-Test, Feb 2018



I'm pretty positive the exact same thing was posted by someone on the forum back in February, when these results were first spotted.

AV-Test seems to be a great source of information. So good, in fact, that it hasn't run a test on Windows 10 since December 2017 :lol:, so I would personally question their overall methodology and reliability.

https://www.av-test.org/en/antivirus/home-windows/windows-10/

MSE is never consistent in either its detection rate or its false-positive rate; if you scroll through the AV-Comparatives charts for the whole of 2017 you will see that. Paid solutions are more consistent, regardless of vendor.


MSE is indeed impressive with its new cloud system: it blacklists threats very quickly via recent updates, but it also generates far more false positives.

I personally tend to interpret the FP portion of such 3rd-party tests as follows: if a product performs poorly in the FP test, it really is bad (and there is a noticeable real-life gap between 1-2 FPs and 4-5 FPs in such a test). On the other hand, if a product performs well in the FP test, that doesn't necessarily mean it produces few FPs in real life.

For some "low FP" product in such test, one can easily make it generate an FP using simple tools and innocent code (like hello world). And for some static ML engines with low FPs in such test, one can easily trigger an FP by randomly padding zeroes and ones at the end of a benign file. Such product is susceptible to "pool pollution" attack and can be bypassed with clever social engineering.

So there is a trade-off here: you can do conservative detection in favor of fewer FPs and leave aggressive detection to IT administrators, or you can be more aggressive in detection and optimize 3rd-party FP test scores using certain clean samples in the training set (e.g. a majority of AVC's FP test samples are software from the top-download category of software distribution sites). Whether the optimization in the latter case is good enough to suppress FPs in real life is questionable :rolleyes:

There is no free lunch here, and my personal experience indicates that ESET tends to favor fewer FPs over more aggressive detection in its consumer products.

Edited by 0xDEADBEEF

5 hours ago, 0xDEADBEEF said:

ESET tends to favor fewer FPs

ESET had 0 FPs in Aug 2017, while MSE had 2 FPs in Jan 2018 and 1 FP in Feb 2018 out of roughly 1,500,000 samples, which works out to about 0.0001% of samples.

That is beyond reasonable.
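(For reference, the percentage arithmetic, assuming the ~1,500,000-sample figure quoted above; a quick sketch, not AV-Test's own tooling:)

```python
# Convert FP counts into a percentage of the clean-sample set (figures rounded).
SAMPLES = 1_500_000

for month, fps in [("Jan 2018", 2), ("Feb 2018", 1)]:
    print(f"{month}: {fps} FP -> {fps / SAMPLES * 100:.5f}% of samples")

# Jan 2018: 2 FP -> 0.00013% of samples
# Feb 2018: 1 FP -> 0.00007% of samples
```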

 

Edited by MSE

6 hours ago, cyberhash said:

hasn't run a test on Windows 10 since December 2017 :lol:

Dec 2017 is only three months ago, so it's not a big deal.


There is no ESET product in the latest AV-Test results for Android mobile devices (March 2018), Windows 7 home user (February 2018), Windows 10 business client (February 2018), or the 16 client-server solutions for Windows 10 (April 2018).

AV-Test and AV-Comparatives are two of my favorite institutes for testing AV security products. If a security solution is missing from these tests, I wouldn't buy it in the future.


5 hours ago, MSE said:

ESET had 0 FPs in Aug 2017, while MSE had 2 FPs in Jan 2018 and 1 FP in Feb 2018 out of roughly 1,500,000 samples, which works out to about 0.0001% of samples.

That is beyond reasonable.

 

As I said, a bad FP score indicates a bad product, but a good FP score doesn't necessarily mean the product's actual FP rate is low. I can easily write an innocent hello-world program (without obfuscation techniques) that several products with a low number of FPs in the AV test (say, 1-4) will raise a false alarm on. It is not easy to make ESET do so. The real world is much more complex than this (note that their FP test only executed ~50 samples to test behavior blockers, which are more prone to FPs). I have even seen one product with a good-looking FP score flag PCMark as malicious and auto-quarantine it, and another delete multiple benign applications during a disinfection process due to inappropriate OS tagging. These are things you might not know about if you don't really try the product out. So far I haven't seen an FP test that reflects the real-world FP behavior of these products well.

Anyway, I am not here to defend ESET's absence from recent tests; I am actually wondering about the same thing.

I just feel it is necessary to state my observations for a fairer comparison. There is always a trade-off, and whether a higher detection rate or a lower FP rate matters more is left to customers to decide. I personally feel that a security product starts to become meaningless once its FPs exceed a certain threshold, because at that point users will be busy excluding files, including malicious ones :rolleyes:

Edited by 0xDEADBEEF

16 hours ago, MSE said:

however MSE free scores 100% in both tests!

To set the record straight with regard to the recent AV-Comparatives real-time test:

1. MSE was not tested; it was in fact Windows Defender ver. 4.12.

2. Windows Defender in fact had a positive detection rate of 95%, with the remaining 5% requiring a user decision to block or allow. That positive detection rate was the second-lowest among the products tested.

3. Windows Defender had 6 false-positive detections.

Really, this is getting a bit redundant, is it not?


To set the record straight with regard to the recent AV-Test test:

1. MSE 4.10 was tested on Windows 7 SP1, 64-bit.

2. The detection rate was 100% for Jan 2018 and 100% for Feb 2018.

3. MSE 4.10 had 2 FPs in Jan 2018 and 1 FP in Feb 2018; the industry average was 8 FPs.

Indeed, this is getting a bit redundant.


In the last SE Labs test (Oct.-Dec. 2017), MSE scored 75% in the protection category: https://selabs.uk/en/reports/consumers . It was second from last in that category.

Starting to "get the picture" why Eset might no longer be participating in AV-Test comparatives? This Eset security blog article might explain some of what is happening on the AV lab test scene: https://www.welivesecurity.com/2018/04/13/anti-malware-testing-needs-standards/ .

For those that believe MSE offers equal or better protection than Eset, just go ahead and use it. And please stop posting about it in this forum. 

 


ESET scored 93% in the same SE Labs test; hard to believe.

"For those that believe MSE offers equal or better protection than Eset, just go ahead and use it"

Doing so, as we speak!


47 minutes ago, itman said:

This Eset security blog article might explain some of what is happening on the AV lab test scene

This reminds me of another interesting observation: a year ago I kept wondering why ESET scored so badly in some 3rd-party performance-impact evaluations, because that is very counter-intuitive given my own experience. After some leisure-time benchmarking and analysis across different products, I started to see some of the reasons behind the numbers.

Some tests "successfully" avoided many scenario a caching mechanism may help. Installing new apps, starting an application, etc., all fall outside this range, and they may take up a large portion of the performance score. As performance optimizations are always for optimizing common cases and let uncommon cases finish gracefully, such performance benchmark break-down is in question.

Some cloud-based products incur over a 35% slowdown in my own app start-up tests across multiple platforms, yet perform really well in certain 3rd-party app start-up tests. This makes me believe some targeted optimization may have been applied to boost the test score.
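For what it's worth, a minimal sketch of the kind of start-up timing I mean (assuming Python is available; the target command and repetition count are arbitrary placeholders):

```python
# Launch a short-lived process repeatedly and compare the first ("cold") run
# against later ("warm") runs, where an AV product's scan cache may already
# cover the files being touched. Run it with and without the product installed
# to estimate the added start-up overhead.
import statistics
import subprocess
import time

COMMAND = ["python", "-c", "pass"]   # placeholder short-lived target process
RUNS = 20                            # illustrative repetition count

timings = []
for _ in range(RUNS):
    start = time.perf_counter()
    subprocess.run(COMMAND, check=True,
                   stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)
    timings.append(time.perf_counter() - start)

print(f"first (cold) run: {timings[0] * 1000:.1f} ms")
print(f"median warm run : {statistics.median(timings[1:]) * 1000:.1f} ms")
```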


A couple of other points to note about the recent SE Labs test referenced:

  1. The test differed from standard AV lab comparative tests in that it used malware samples drawn from targeted attacks.
  2. Targeted attacks are predominantly performed against enterprise IT installations.
  3. No product tested achieved a score higher than 96% effectiveness against these attacks.
  4. Free AV solutions other than MSE did have respectable scores on this test.
  5. A paid AV solution, Bitdefender, which consistently scores at or near the top in conventional AV lab tests, did not fare well in this test.

 

EXECUTIVE SUMMARY

Products Tested (Protection Accuracy % / Legitimate Accuracy % / Total Accuracy %):

Norton Security                  94 /  98 / 96
Kaspersky Internet Security      88 / 100 / 96
ESET Smart Security              79 / 100 / 93
Avira Free Security Suite        77 / 100 / 92
Avast Free Antivirus             75 / 100 / 91
Trend Micro Internet Security    78 /  96 / 89
AVG Antivirus Free Edition       70 / 100 / 89
ZoneAlarm Free Antivirus         61 / 100 / 86
F-Secure Safe Internet Security  76 /  89 / 84
Bitdefender Internet Security    72 /  90 / 83
Cisco Immunet                    52 / 100 / 83
Microsoft Security Essentials    41 /  99 / 78
Qihoo 360 Total Security         27 /  95 / 70

Edited by itman

In regards to the Eset blog article link I posted previously, a comment was made about "commissioned" tests by AV labs. For those not familiar with what that is, the AV vendor pays the lab a fee to run a test on its product. The testing agreement usually entails that the vendor can perform remedial activities for any deficiencies found, with subsequent retesting. Additionally, the agreement may contain provisions on whether the final test results are made public or remain confidential. The problem with this type of testing arises when it is done comparatively against competing AV vendors' products; I believe the issues there are rather obvious. Additionally, I have never reviewed a publicly released commissioned test in which the sponsoring AV vendor's product was not the top scorer.

Recently, AV-Comparatives performed a commissioned test for Bitdefender that can be reviewed here: https://weblog.av-comparatives.org/advanced-endpoint-protection-test/ . This test has striking similarities to the recent SE Labs test referenced previously; there are, however, some striking differences. The AV-Comparatives test for Bitdefender employed what appear to me to be "synthetic" targeted-attack payload techniques. I say this because it is almost impossible to duplicate the actual payload delivery employed in these attacks. For example, if the payload was delivered via a downloaded Word document, the targeted organization will as a rule not provide that document for examination, due to corporate privacy or data-disclosure policies. In most cases, however, the actual payload delivery method is unknown.

What is of interest is the glaring difference in protection between the two AV lab tests of Bitdefender's capability against the same class of malware attacks. I believe the reason is rather obvious.

Edited by itman

2 hours ago, itman said:

usually entails

 

2 hours ago, itman said:

may contain

 

2 hours ago, itman said:

I believe

Way too many unknown factors...


This topic is now closed to further replies.