============================================================
File 0XECSUM.TXT: "EXECUTIVE SUMMARY"
AntiVirus/AntiMalware Product Test "2004-07"
Virus Test Center (VTC), University of Hamburg
============================================================
[Formatted with non-proportional font (Courier), 72 columns]

**********************************************************************
Content of this file:
**********************************************************************
 0. Editors Foreword
 1. Background of this test; Malware Threats
       Table ES0: Development of viral/malware threats
 2. VTC Testbeds used in VTC test 2004-07
       Table ES1: Content of VTC test databases in test 2004-07
 3. Products participating in test 2004-07
       Table ES2: List of AV products in test 2004-07
 4. Serious problems: Many products crash, some don't find all files.
       Flaw in Microsoft FindFirst/FindNext routine?
 5. Results of on-demand detection under Windows-2000:
       Table W2k-A: Development of W-2k scanners for file, macro and
                    script viruses/malware from 2000-08 to 2004-07
       Findings W2k.1 - W2k.7
       Grading W2k products according to their detection performance
 6. Results of on-demand detection under Windows-XP:
       Table WXP-A: Development of WXP scanners for file, macro and
                    script viruses/malware from 2003-04 to 2004-07
       Findings WXP.1 - WXP.7
       Grading WXP products according to their detection performance
 7. Comparison of detection behaviour for W32 platforms
       Grading AV products concerning W32-harmonical behaviour
 8. Results of on-demand detection under LINUX (SUSE):
       Table LIN-A: Development of LINUX scanners for file, macro and
                    script viruses/malware from 2001-07 to 2004-07
       Findings LIN.1 - LIN.7
       Grading LINUX products according to their detection performance
 9. Comparison of on-demand detection under ALL system platforms
10. Conclusion: In Search of the "Perfect AV/AM product"
12. Availability of full test results
13. Copyright, License, and Disclaimer

***********************************************************************

0. Editors Foreword:
====================

VTC test "2004-07" was started in June 2003, with testbeds frozen as
known on December 31, 2003 and products submitted in February 2003
(the test start was delayed as the HEUREKA-3 test was completed only
in May 2003). In test "2004-07", testbeds have grown significantly.
With the growth of our testbeds, we experienced many more problems,
as products behaved "abnormally" (see 8problms.txt). Moreover, some
products (and our test crew) suffered from a known but uncorrected
flaw in Microsoft's FindFirst/FindNext routines, which required
postscans for almost ALL products on almost ALL testbeds (see 4).

As in the last test, AV products were tested under two Windows
platforms (Windows 2000 and Windows XP) as well as under one LINUX
(SUSE) platform. Concerning Windows platforms, customers assume equal
behaviour of their AV/AM products when going from one W32 platform to
another. We tested this assumption (which we call "W32-harmonical
behaviour", see 7) and found that this assumption is justified for
many (though not all) products for macro/script viruses but NOT for
W32 virus detection.

One serious concern from our results is that AV producers concentrate
more on detection of In-The-Wild (ITW) viruses than on zoo viruses.
Indeed, one AV company - TrendMicro - informed us that they don't
wish to participate in our test as they concentrate on ITW detection
and are aware that their products will produce "unfavourable results"
for VTC's zoo testbeds (see 3).
Indeed, VTC tests differ from other tests in that large zoo testbeds
are used for testing reliable and consistent detection. For many AV
products, detection rates of ITW viruses are perfect (100%) or
excellent (>99%), but detection of zoo viruses is often significantly
lower. Evidently, AV producers focusing on ITW detection forget that
any ITW virus has been a zoo virus before going In-The-Wild. It can
hardly surprise that customers of such products experience badly how
neglecting zoo virus detection affects their IT services when a
hitherto unknown zoo virus is deployed broadly (the author of this
test report had to advise several victims of such ill-advised
"ITW-mindedness" aka "zoo-blindness"). And the first victims - often
large companies - cannot be saved by even the fastest exchange of
newly "wildering" code: it always comes too late for some!

This test - as all previous ones - has been performed by students of
Hamburg University's Faculty for Informatics with a special interest
in IT Security (see our 4-semester curriculum, started in 1988, on
our homepage). Unlike other tests, where submitters of products have
to pay a fee to be admitted, VTC tests are "FREE OF FEE". This
implies that students, who have to complete their examinations and
usually also work to earn their income, are only "partially
available" for tests. Most recently, the availability of the test
supervisor and report author was also affected by new priorities.
Finally, our hardware, which is essentially funded by Faculty support
(sometimes also by donations of new machines, usually more powerful
than those which we can buy with university money), canNOT compete
with the technical equipment of other test labs. We regret that these
circumstances cause delays in performing and publishing our regular
test reports, but instead of hurrying to meet dates and expectations,
we insist that the assessed QUALITY of our test results shall have -
also in the future - highest priority.

Most of the work in VTC tests rests on the shoulders of our test
crew, and the editor wishes to thank them all for their devotion and
hard work. (See the VTC test team at the end of this report.)

1. Background of this test: Malware Threats:
============================================

Malicious software (malware), including viruses (=self-replicating
malware), Trojan horses (=pure payload without self-replication),
virus droppers and network malware (e.g. worms and hostile applets),
is regarded as a serious threat to PC users, esp. when connected to
intranets and the Internet. The development of malicious software can
well be studied in view of the growth of the VTC (zoo and
In-The-Wild) testbeds.
The following table summarizes, for previous and current VTC tests
(indicated by their year and month of publication), the size of the
virus and malware (full = "zoo") databases, giving in each case the
number of different viruses and the number of instantiations of a
virus or malware, and bearing in mind that some revisions of testbeds
were made:

Table ES0: Development of threats as present in VTC test databases:
===================================================================

Table ES0-A: Development of File, Boot and Macro Virus/Malware Testbeds:
========================================================================
--------+-----------------------+---------------+------------------------
         ==FileViruses/Malware==I =Boot Viruses=I =Macro Viruses/Malware=
Test#     Number Infected NumberI Number InfectedI Number Infected Number
          viruses objects malwareI viruses objectsI viruses objects malware
--------+-----------------------+---------------+------------------------
1997-07   12,826   83,910    213 I   938   3,387 I   617   2,036     72
1998-03   14,596  106,470    323 I 1,071   4,464 I 1,548   4,436    459
1998-10   13,993  112,038  3,300 I   881   4,804 I 2,159   9,033    191
1999-03   17,148  128,534  3,853 I 1,197   4,746 I 2,875   7,765    200
 VKIT/Poly:   +5  146,640        I              I
1999-09   17,561  132,576  6,217 I 1,237   5,286 I 3,546   9,731    329
 VKIT/Poly:   +7  166,640        I              I
2000-04   18,359  135,907  6,639 I 1,237   5,379 I 4,525  12,918    260
 VKIT/Poly:   +7  166,640        I              I
2000-08      ---      ---    --- I   ---     --- I 5,418  15,720    500
2001-04   20,564  140,703 12,160 I 1,311   5,723 I 6,233  19,387    627
 VKIT/Poly:   +7  166,640        I              I
2001-07H1    ---      ---    --- I   ---     --- I + 544 + 2,035   +102
2001-10      ---      ---        I   ---     --- I 6,762  21,677    683
2002-02H2    ---      ---    --- I   ---     --- I + 675 + 3,245   +720
--------+-----------------------+---------------+------------------------
THIS TEST:
2004-07   21,790  158,747 18,277 I ITW:11    149 I 7,306  25,231    747
--------+-----------------------+---------------+------------------------

Table ES0-B: Development of Script Virus/Malware Testbeds:
==========================================================
         =ScriptViruses/Malware=
Test#     Number Infected Number
          viruses objects malware
--------+-----------------------
1997-07      ---     ---    ---
1998-03      ---     ---    ---
1998-10      ---     ---    ---
1999-03      ---     ---    ---
1999-09      ---     ---    ---
2000-04      ---     ---    ---
2000-08      306     527    ---
2001-04      477     904    ---
2001-07H1   +206   + 386    ---
2001-10      481   1,079     30
2002-02H2   +854  +1,645   +270
--------+-----------------------
THIS TEST:
2004-07      823   1,574    202
--------+-----------------------

Remark #1: Before test 1998-10, an ad-hoc cleaning operation was
           applied to remove samples where virality could not be
           proved easily. Since test 1999-03, separate tests are
           performed to evaluate detection rates of VKIT-generated
           and selected polymorphic file viruses.
Remark #2: Heureka tests (marked "H#") use only those samples
           (viruses and malware) which were newly found (marked "+")
           in a given period (for details, see the related test
           reports); Heureka tests include macro and script
           viruses/malware.
Remark #3: Special tests were performed between 1999 and 2001 to
           determine the ability of AV products to reliably detect
           the (then new) phenomenon of polymorphic viruses.

With an annual deployment of more than 5,000 viruses and several
thousand Trojan horses, many of which are available from the
Internet, and in the absence of inherent system protection against
dysfunctional software, users must rely on AntiMalware and esp.
AntiVirus software to detect and eradicate - where possible - such
malicious software. Hence, the detection quality of AntiMalware and
esp.
AntiVirus products becomes an essential prerequisite for protecting
customer productivity and data.

The Virus Test Center (VTC) at Hamburg University's Faculty for
Informatics performs regular tests of AntiMalware and esp. AntiVirus
software. VTC recently tested current versions of on-demand scanners
for their ability to identify PC viruses. Tests were performed on
VTC's malware databases, which were frozen as of
*** December 31, 2002 *** to give AV/AM producers a fair chance to
provide updates within the 6-week submission period (product
submission date: February 16, 2003).

The main test goal was to determine detection rates, reliability
(=consistency) of malware identification and reliability of detection
rates for submitted or publicly available scanners; this test
determined detection rates for file, macro and script viruses and
non-viral malware.

For transfer of both static and executable objects via the Internet,
such objects are usually packed with some "packer". From the almost
100 packers in use, 6 popular compression methods (PKZIP, ARJ, LHA,
2 versions of RAR, and CAB) were selected to test whether (and to
what degree) packed viruses would be detected by scanners. It was
also tested whether the assumption that AV/AM products detect ALL ITW
viruses EQUALLY in both unpacked and packed form is valid (see the
sketch at the end of this section). Regrettably, only a few products
validate this assumption. Moreover, the avoidance of False Positive
alarms on "clean" (=non-viral and non-malicious) objects was also
determined.

Finally, a set of selected non-viral file, macro and script malware
(droppers, Trojan horses, intended viruses etc.) was used to
determine whether and to what degree AntiVirus products may be used
for protecting customers against Trojan horses and other forms of
malware.

VTC maintains, in close and secure connection with AV experts
worldwide, collections of file, macro and script viruses as well as
related malware ("zoo") which have been reported to VTC or AV labs.
Here, we wish to thank our colleagues in the AV industry (especially
those cooperating in the Computer Antivirus Research Organisation,
CARO) for their cooperation and critical support. Moreover, following
the list of "In-The-Wild Viruses" (published on a regular basis by
Wildlist.org), a collection of viruses reported to be broadly visible
is maintained to allow for comparison with other tests; presently,
this list does not report ITW Malware.
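The pack/unpack criterion described above can be checked mechanically.
The following minimal sketch (Python) illustrates the comparison of
unpacked versus packed ITW detection; the directory layout, log file
names and log format are hypothetical and merely stand in for the
product-specific report parsing actually used (see A3SCNLS.TXT and
8problms.txt). A product counts as having "no loss" for a packer when
every ITW sample it detects unpacked is also detected inside the
packed object.

------------------------------------------------------------------------
# Minimal sketch of the packed-vs-unpacked ITW comparison (Python).
# Log format, file names and directory layout are hypothetical.

from pathlib import Path

PACKERS = ["zip", "arj", "lha", "rar", "winrar", "cab"]  # 6 popular packers

def detected_samples(logfile: Path) -> set[str]:
    """Samples a scanner reported as infected (assumed log format:
    '<sample-name> ... Infected: <virus name>')."""
    found = set()
    for line in logfile.read_text(errors="ignore").splitlines():
        if "Infected:" in line:
            found.add(line.split()[0])
    return found

def no_loss_per_packer(logdir: Path, itw_samples: set[str]) -> dict[str, bool]:
    """For each packer: does the product detect every ITW sample that it
    detects unpacked also inside the packed object?"""
    unpacked = detected_samples(logdir / "itw_unpacked.log") & itw_samples
    return {p: unpacked <= detected_samples(logdir / f"itw_{p}.log")
            for p in PACKERS}
------------------------------------------------------------------------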
2. VTC Testbeds used in VTC test "2004-07":
===========================================

The current sizes of the different VTC testbeds (developed from
previous testbeds through inclusion of new viruses and malware and
some revision) are given in the following table (for detailed indices
of VTC testbeds, see file "a3testbed.zip"):

Table ES1: Content of VTC test databases in test "2004-07":
===========================================================
"Full Zoo":  21,790 File Viruses in 158,747 infected files
              8,001 different File Malware in 18,277 files
                664 Clean file objects for False Positive test
              7,306 Macro Viruses in 25,231 infected documents
                450 different Macro Malware in 747 macro objects
                329 Clean macro objects for False Positive test
                823 different Script Viruses in 1,574 infected objects
                117 different Script Malware in 202 script objects
---------------------------------------------------------------------
"ITW Zoo":       11 Boot Viruses in 149 infected images/sectors
                 50 File Viruses in 443 infected files
                124 Macro Viruses in 1,337 infected documents
                 20 Script Viruses in 122 infected objects
=====================================================================

For a survey of platforms, see A4tstdir.txt, and for the content of
the resp. testbeds see A3TSTBED.zip (available for download).

Concerning the quality of viral testbeds, it is sometimes difficult
to assess the "virality" (=ability of a given sample to replicate at
least twice under given constraints) of large "viral zoo" databases,
esp. as some viruses work only under very specific conditions. We are
glad to report that Dr. Vesselin Bontchev, Eugene Kaspersky,
Dr. Igor Muttik and other experts, esp. from the organisation of
cooperating AV experts (Computer Antivirus Research Organisation,
CARO), helped us significantly with critical and constructive
comments to establish viral testbeds, the residual non-viral part of
which should be very small.

Post-analysis and Quality Measure of VTC testbeds:
--------------------------------------------------
Those samples which some product did not properly detect are usually
sent to a trustworthy expert from that company for post-analysis. In
almost all cases, newer versions of that product were able to detect
previously missed samples. We were esp. happy to receive comments
from Vesselin Bontchev, Eugene Kaspersky, Fridrik Skulason and Igor
Muttik about some specimens which should not have been included in
one testbed (though possibly belonging to another category). Indeed,
some samples contained remainders of an improperly cleaned virus.
After an analysis of those comments, we can report that a VERY SMALL
number of entries in a few testbeds does NOT belong there:

                      Number of        Number of objects  Inaccuracy
                      improper samples in testbed         ratio
 --------------------+-----------------+-----------------+----------
 Zoo File testbed:          25         I     158,747     I  0.0006%
 Zoo Macro testbed:          5         I      25,231     I  0.02%
 Zoo Script testbed:         3         I       1,337     I  0.2%
 --------------------+-----------------+-----------------+----------

We also wish to thank the "WildList Organisation" for supporting us
with their set of In-The-Wild viruses; the related results may
support users in comparing VTC tests with other ITW-only tests.
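For clarity, the "inaccuracy ratio" in the table above is simply the
share of improper samples among all objects of a testbed. A minimal
sketch (Python), with illustrative numbers taken from the macro row:

------------------------------------------------------------------------
# Minimal sketch: the testbed "inaccuracy ratio" used above (Python).

def inaccuracy_ratio(improper_samples: int, objects_in_testbed: int) -> float:
    """Percentage of testbed objects that post-analysis flagged as improper."""
    return 100.0 * improper_samples / objects_in_testbed

# e.g. the zoo macro testbed: 5 improper samples among 25,231 objects
print(f"{inaccuracy_ratio(5, 25_231):.2f}%")   # -> 0.02%
------------------------------------------------------------------------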
3. Products participating in Test "2004-07":
============================================

For test "2004-07", the following *** 27 *** AntiVirus products
(addressed in subsequent tables by a 3-letter abbreviation) were
tested under Windows-2000, Windows-XP and LINUX, in 58 different
versions:

Table ES2: List of AV products in test "2004-07"
================================================
 Abbreviation/Product/Version                      Tested under Platform
 -----------------------------------------------------------------------
 ANT = Antivir: H+B EDV Datentechnik, Germany              W2K WXP LIN
 AVA = Avast!: ALWIL Software, Czech Republic              W2K WXP
 AVG = AVG Antivirus System: GriSoft, Czech Republic       W2K WXP
 AVK = AntiVirenKit: GData Software, Germany               W2K WXP
 AVP = Kaspersky Anti-Virus: Kaspersky Lab, Russia         W2K WXP LIN
 BDF = BitDefender Professional: SOFTWIN, Romania          W2K WXP
 CLA = CLAM AntiVirus: Open Antivirus Project                      LIN
 CMD = Command Antivirus: Command Software Systems, USA    W2K WXP LIN
 DRW = Dr. Web: DialogueScience, Russia                    W2K WXP LIN
 FIR = Fire Anti-virus: Prognet Technologies, India        W2K WXP
 FPR = F-PROT: Frisk Software Intnl., Iceland              W2K WXP LIN
 FSE = F-SECURE: F-Secure Corporation, Finland             W2K WXP LIN
 GLA = Gladiator AV: Author="Gladiator"                    W2K WXP
 IKA = Ikarus Virus Utilities: IKARUS Software, Austria    W2K WXP LIN
 INO = eTrust AV: Computer Associates Intnl., USA          W2K WXP
 NAV = Norton Antivirus: Symantec, USA                     W2K WXP
 NVC = Norman Virus Control: Norman Data Defense, Norway   W2K WXP
 OAV = Open AntiVirus: Open Antivirus Project                      LIN
 PAV = Power AV: GData Software, Germany                   W2K WXP
 PER = Peruvian AntiVirus: PER Systems, Peru               W2K WXP
 PRO = Protector: Proland Software, India                  W2K WXP
 QHL = Quickheal: Cat Computer Services, India             W2K WXP
 RAV = RAV Antivirus: GeCAD Software, Romania              W2K WXP
 SCN = McAfee ViruScan: Network Associates, USA            W2K WXP LIN
 SWP = Sophos AV: Sophos, UK                               W2K WXP LIN
 VBR = VirusBuster: Leprechaun, Australia                  W2K WXP
 VSP = VirScanPlus: Ralph Roth, Germany                    W2K WXP
 -----------------------------------------------------------------------
 Products tested:                                     25 + 25 + 11 = 61
 -----------------------------------------------------------------------

For details of AV products, including the options used to determine
optimum detection rates, see A3SCNLS.TXT. For scanners where results
are missing, see 8problms.txt.

In general, AV products were either submitted or, when test versions
were available on the Internet, downloaded from the respective
ftp/http sites. A few scanners were not available, either in general
(e.g. TNT) or for this test; some of these were announced to
participate in a future test. Finally, a very few AV producers
answered VTC's requests for submitting scanners with electronic
silence.

Concerning frequently asked questions: some AV producers deliberately
do NOT submit their products and even FORBID VTC to test their
product:

TrendMicro Germany has again recently informed the author of this
report that they are NOT interested in VTC test participation, as
their scanner is deliberately trimmed to on-access scanning and
detection of In-The-Wild viruses. As VTC also emphasizes the
detection of zoo viruses, where their products would produce
"unfavourable results", there is still no intent to submit their
products. When 2 experts from TrendMicro asked to have their product
tested but the results NOT PUBLISHED, only reported to the TM lab,
VTC - as a university lab devoted to public information - had to
refuse that "offer". Consequently, VTC refrains from inviting
TrendMicro for future test participation.
Panda has permitted tests by any institution and university *except
VTC*. Background: after the first participation of a Panda product
(with less positive results), Panda's CEO requested that VTC transfer
the whole testbed to Panda labs as a condition for future test
participation. VTC's policy is to send missed samples to
participating companies, but sending the whole testbed is
inconsistent with VTC's security policy (see the VTC Code of
Conduct).

The following paragraphs survey essential findings in comparison with
previous VTC tests (performance over time), as well as some relative
"grading" of scanners for detection of file, macro and script
viruses, both in full "zoo" and "In-The-Wild" testbeds, and of macro
and script malware, as well as detection of ITW file and macro
viruses in objects packed with ARJ, LHA, ZIP, RAR, WinRAR and CAB.
Finally, the ability of AV products to avoid False Positive alarms is
also analysed.

Detailed results, including precision and reliability of virus and
malware identification (and the grids used to assign a performance
level to each product), are presented in the following
platform-specific files:
     for W32:                      6iw2k.txt, 6jwxp.txt
     comparison of W32 results:    6mcmp32.txt
     for Linux:                    6llin.txt
     comparison of ALL platforms:  6ncmpos.txt

In a rather detailed analysis, detection rates are presented for each
platform (operating system), and product behaviour is graded in
comparison with all products tested on the resp. platform:
     Evaluation/Grading for W-2k products:   7ievaw2k.txt
                        for WXP products:    7jevawxp.txt
                        for LINUX products:  7levalin.txt

Under the scope of VTC's grading system, a "Perfect AV/AM product"
would have the following characteristics:

Definition (1): A "Perfect AntiVirus (AV) product"
--------------------------------------------------
 1) Will detect ALL viral samples "In-The-Wild" AND at least 99.9%
    of zoo samples, in ALL categories (file, boot, macro and
    script-based viruses), always with the same high precision of
    identification and in every infected sample,
 2) Will detect ALL ITW viral samples in compressed objects for
    all (6) popular packers, and
+2A) Will detect ALL ITW samples both in unpacked instantiations AND
    packed with ALL (6) popular packers, and
 3) Will NEVER issue a False Positive alarm on any sample which is
    not viral.
 Remark: detection of "exotic viruses" is presently NOT rated.

Definition (2): A "Perfect AntiMalware (AM) product"
----------------------------------------------------
 1) Will be a "Perfect AntiVirus product", that is:
    100% ITW detection AND >99% zoo detection
    AND high precision of identification
    AND high precision of detection
    AND 100% detection of ITW viruses in compressed objects
    AND 0% False-Positive rate,
 2) AND it will also detect essential forms of malicious software,
    at least in unpacked forms, reliably at high rates (>90%).
 Remark: detection of "exotic malware" is presently NOT rated.

4. A serious problem: Flaw in Microsoft FindFirst/FindNext routine?
===================================================================

Since VTC tests started (including the work of Vesselin Bontchev in
the early 1990s), we have experienced many problems. Products were
often difficult to install and manage (see 8problms.txt). With the
growing size and diversity of the testbeds, it was expected that
problems would again grow. But there is one problem which does not
only affect testers with large viral databases (which is hardly a
situation that customers experience); it is described in the
following paragraphs.
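As an illustration of the "postscan" procedure described below, the
following minimal sketch (Python) shows the kind of coverage check
run after each product run to list testbed files that a product's
report does not mention at all. The log format and paths are
hypothetical, since every product's report had to be parsed
individually in the real tests.

------------------------------------------------------------------------
# Minimal sketch of the post-run coverage check behind our "postscans"
# (Python). Log format and paths are hypothetical.

from pathlib import Path

def reported_files(logfile: Path) -> set[Path]:
    """Files the scanner claims to have looked at
    (assumed format: one path at the start of each log line)."""
    return {Path(line.split()[0])
            for line in logfile.read_text(errors="ignore").splitlines()
            if line.strip()}

def untouched_files(testbed: Path, logfile: Path) -> list[Path]:
    """All testbed files that do not appear in the scanner's report."""
    seen = reported_files(logfile)
    return [f for f in sorted(testbed.rglob("*"))
            if f.is_file() and f not in seen]

missed = untouched_files(Path("file_zoo"), Path("product_run1.log"))
for directory in sorted({f.parent for f in missed}):
    print("postscan needed for:", directory)   # rescan these parts only
------------------------------------------------------------------------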
More than ever before, we found after completion of some test runs
that AV products had NOT TOUCHED all parts of the directory tree, for
no obvious reason and always without any diagnosis (no exception
etc.). In such cases, we determined those parts of the testbed which
had not been processed and restarted the product on them
("postscan"). When, after completion, some remainder was still
untouched, we started a 2nd postscan for that remainder.

The most probable reason for such behaviour of SEVERAL products which
otherwise behave "smoothly" is: the methods offered by Microsoft to
traverse a directory, esp. the routines FindFirst and FindNext, DO
NOT WORK RELIABLY on large directories. This effect was first
reported by Eugene Kaspersky, but we have seen NO IMPROVEMENT OR
CORRECTION. Evidently, this problem seems to be related to the
invocation of those routines (FF/FN), and this may be affected by
some compiler or assembler.

Only through extensive inspection of the resulting test logs ("test
quality assurance") could we reduce the impact of this MS flaw on our
test results: but it is "NOT NATURAL" that anyone must start an AV
product more than once to be sure that ALL potentially malicious
objects have been checked!

This problem may not only show its dirty face for large virus/malware
testbeds. With growing sizes of customer directories, the likelihood
grows that NOT ALL OBJECTS are touched by any method using FF/FN. As
this is a problem also for many companies with large directories, WE
STRONGLY REQUEST THAT MICROSOFT CURE THIS FLAW.

5. Results of on-demand detection under Windows-2000 (W2k):
===========================================================

This is a summary of the essential findings for AV/AM products under
W2k. For details see 7ievaw2k.txt.

Meant as a perspective on product results, the following table
(W2k-A) lists all results of W2k scanners for zoo detection of file,
macro and script viruses in the last 5 VTC tests. Moreover,
differences ("delta") in the resp. detection rates are given for
those products which participated in the last tests, and mean values
are calculated.
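For readers who wish to recompute the derived entries: the following
minimal sketch (Python) shows how the "Delta" column and the mean
rows of the tables below are obtained. Reading the "M.>10%" (resp.
">10%") row as the mean over products scoring above 10% is our
interpretation of the table layout; the numbers in the usage example
are illustrative only.

------------------------------------------------------------------------
# Minimal sketch: "Delta" and mean rows of the result tables (Python).
# Rates are percentages; None marks a product absent from a given test.

def delta(current, previous):
    """Change against the previous test, if the product took part in both."""
    if current is None or previous is None:
        return None
    return round(current - previous, 1)

def mean(rates, cutoff=0.0):
    """Mean over products with a result above the cutoff
    (cutoff=10.0 gives the "M.>10%" variant)."""
    values = [r for r in rates if r is not None and r > cutoff]
    return round(sum(values) / len(values), 1)

print(delta(99.4, 94.7))                      # -> 4.7 (printed as +4.7)
print(mean([99.4, 9.5, 82.7]))                # plain mean
print(mean([99.4, 9.5, 82.7], cutoff=10.0))   # "M.>10%" variant
------------------------------------------------------------------------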
Table W2k-A: Comparison: File/Macro/Script Virus Detection Rate:
================================================================

Table W2k-A1: Detection performance for file ZOO viruses:
=========================================================
Scan I ==== File Virus ===
ner  I     Detection
-----+----------------------
Test I 0104  0212  0407 Delta
-----+----------------------
ANT  I   -     -   90.9    -
AVA  I 95.0  96.2  95.7  -0.5
AVG  I 81.9  80.6  79.8  -0.8
AVK  I 99.8  99.9  100~  +0.1
AVP  I 99.9  100~  100~   0.0
BDF  I   -   82.9  84.4  +1.5
CLE  I   -     -     -     -
CMD  I 97.8  98.5  98.6  +0.1
DRW  I   -   98.3  79.0 -19.3
FIR  I   -     -   75.4    -
FPR  I 97.8  98.8  99.5  +0.7
FPW  I 97.8  98.8    -     -
FSE  I   -   100~  100~   0.0
GLA  I   -     -   40.7    -
IKA  I   -   89.2  90.8  +1.6
INO  I 97.9  98.7  95.8  -2.9
MCV  I   -     -     -     -
MR2  I   -    9.5    -     -
NAV  I 93.9  98.3  99.3  +1.0
NVC  I 98.1  97.8  95.0  -2.8
PAV  I 97.5    -   100~    -
PER  I   -     -   35.9    -
PRO  I 70.6  70.4  67.2  -3.2
QHL  I   -     -   59.0    -
RAV  I 93.5  94.7  99.4  +4.7
SCN  I 89.0  99.8  100~  +0.2
SWP  I   -     -   98.2    -
VBR  I   -     -   68.5    -
VSP  I   -   14.0  14.6  +0.6
-----+----------------------
Mean  : 93.6  85.6  82.7 -1.5%
M.>10%:       89.8  82.7
-----+----------------------

Table W2k-A2: Detection performance for macro/script ZOO viruses:
=================================================================
Scan I ======== Macro Virus ========= + ======== Script Virus ========
ner  I          Detection             I           Detection
-----+--------------------------------+--------------------------------
Test I 0008 0104 0110 0212 0407 Delta I 0008 0104 0110 0212 0407 Delta
-----+--------------------------------+--------------------------------
ANT  I 93.3   -    -    -  97.9    -  I 53.9   -    -    -  87.5    -
AVA  I 94.1 95.7 97.7 97.8 97.9  +0.1 I 15.0 29.1 29.6 31.5 88.7 +57.2
AVG  I 97.9 98.3 98.4 98.1 98.0  -0.1 I 45.8 57.9 62.9 63.9 68.2  +4.3
AVK  I 100~ 100~ 100% 100~ 100~   0.0 I 91.5 99.8 100% 99.0 99.8  +0.8
AVP  I 100~ 100~ 100~ 100~ 100~   0.0 I 88.2 99.8 100% 98.9 99.7  +0.8
BDF  I 99.0   -    -  99.0 98.1  -0.9 I 61.4   -    -  72.4 94.3 +21.9
CLE  I   -    -    -    -    -     -  I  4.2   -    -    -    -     -
CMD  I 100% 100% 100~ 99.9 99.9   0.0 I 93.5 96.9 93.2 89.1 98.5  +9.4
DRW  I 97.5   -  99.5 99.4 99.4   0.0 I 59.8   -  95.4 94.7 95.4  +0.7
FIR  I   -    -    -    -  85.9    -  I   -    -    -    -  75.9    -
FPR  I   -  100% 100~ 100~ 99.9  -0.1 I   -  96.9 94.6 88.7 99.3 +10.6
FPW  I 100% 100% 100~ 100~   -     .  I 90.8 96.9 94.6 88.7   -     .
FSE  I 100% 100% 100% 100~ 100~   0.0 I 96.7 100% 100% 99.5 100%  +0.5
GLA  I   -    -    -    -   1.5    -  I   -    -    -    -  49.5    -
IKA  I   -    -    -  96.2 96.5  +0.3 I   -    -    -  81.2 91.7 +10.5
INO  I 99.8 99.7 99.9 99.9 99.9   0.0 I 78.1 93.1 93.9 94.7 97.5  +2.8
MCV  I   -    -  88.5   -    -     -  I   -    -  27.7   -    -     -
MR2  I   -    -   0.7 10.4   -     -  I   -    -  83.3 81.0   -     -
NAV  I 97.7 97.0 99.5 99.6 99.9  +0.3 I 36.6 54.5 94.2 96.8 98.3  +1.5
NVC  I 99.9 99.8 99.8 99.8 99.2  -0.6 I 83.7 88.5 91.3 87.6 99.7 +12.1
PAV  I 100~ 99.4 100%   -  100~    -  I 90.2 98.5 100%   -    -
PER  I 85.0 68.2   -    -  69.8    -  I  0.0 22.0   -    -  22.9    -
PRO  I 69.1 67.1   -  72.7 73.1  +0.4 I 12.1 40.7   -  59.8 70.4 +10.6
QHL  I  0.0   -    -    -    -     -  I  6.9   -    -    -  29.1    -
RAV  I 96.9 99.6 99.5 99.9 99.8  -0.1 I 47.1 84.9 82.5 96.1 99.7  +3.6
SCN  I 100% 100% 100% 100% 100~  -0.0 I 95.8 100% 99.8 99.6 100%  +0.4
SWP  I   -    -    -    -  99.7    -  I   -    -    -    -  96.8    -
VBR  I   -    -    -    -  98.4    -  I   -    -    -    -  46.4    -
VSP  I   -   0.0  0.~  0.~  0.1  +0.1 I   -  85.3 84.0 81.2 83.5  +2.3
-----+--------------------------------+--------------------------------
Mean  : 99.9 89.7 88.0 88.0 88.1 -0.1%I 57.6 79.4 84.8 84.4 83.2 +8.8%
M.>10%: 98.9      92.9      96.1      I      91.9      84.4 83.2
-----+--------------------------------+--------------------------------
Remark: for abbreviations of products (code names), see appendix
        A5CodNam.txt.
Concerning ZOO file virus detection under W2k, the "mean" detection
rate is further reduced, to an unacceptably low level (82.7%). Even
those products which participated in the last test did not
significantly improve their (low) detection rates.

Concerning zoo macro virus detection under W2k, the "mean" detection
rate is almost stable, on a still unacceptably low level (<90%). When
those products with extremely low detection rates are not counted,
the mean result is more promising.

Concerning zoo script virus detection under W2k, rates are stable on
too low a level. The mean rate of those products which participated
in the last test developed very well, but the high gain (+8.8%)
resulted essentially from large improvements of 4 products.

Findings W2k.1: General development of zoo virus detection rates
                of AV products under Windows 2000:
----------------------------------------------------------------
For W-2000, mean detection rates of zoo viruses are still
insufficient, with file virus detection rates going down, macro
virus detection rates stable, and script virus detection rates
improved. Significant work is still needed. Mean detection rates
remain unacceptably low:
     mean file zoo virus detection rate:    82.7%
     mean macro virus detection rate:       88.1%
     mean script virus detection rate:      83.2%
 ------------------------------------------------
 Concerning file zoo viruses:
    NO product detects ALL viruses ("perfect")
    6 products detect more than 99% and are rated "excellent":
       AVK,AVP,FSE,PAV,SCN;RAV.
 -------------------------------------------------
 Concerning macro zoo viruses:
    NO product detects ALL macro zoo viruses
    13 products detect almost all macro viruses in almost all files
       and are rated "excellent":
       AVK,AVP,FSE,PAV,SCN; CMD,FPR,INO,NAV;RAV,SWP,DRW,NVC.
 -------------------------------------------------
 Concerning script zoo viruses:
    2 products detect ALL viruses and are rated "perfect": FSE,SCN.
    5 products detect almost all script viruses in almost all files
      and are rated "excellent": AVK,AVP,NVC,RAV;FPR
 -------------------------------------------------

Findings W2k.2: Development of ITW file/macro/script virus detection
                rates of AV products under Windows 2000:
---------------------------------------------------------------------
 1 AV product (out of 25) detects ALL In-The-Wild file, macro and
   script viruses in ALL instantiations (files) and is "perfect": SCN
 8 scanners are "excellent": AVK,AVP,FSE,NAV,PAV,FPR,INO,RAV
 --------------------------------------------------
 Concerning detection of ITW file viruses:
    1 "perfect" scanner:    SCN
   11 "excellent" scanners: AVK,AVP,DRW,FPR,FSE,NAV,PAV,RAV,INO,SWP,AVA
 --------------------------------------------------
 Concerning detection of ITW macro viruses:
   10 "perfect" scanners:   ANT,AVK,AVP,BDF,DRW,FSE,NAV,PAV,SCN,SWP
    8 "excellent" scanners: AVG,CMD,FPR,INO,RAV,AVA,IKA,PRO
 --------------------------------------------------
 Concerning detection of ITW script viruses:
    6 "perfect" scanners:   AVK,AVP,FSE,NAV,PAV,SCN
    3 "excellent" scanners: FPR,INO,RAV
 --------------------------------------------------

Findings W2k.3: Assessment of overall (ITW/zoo) detection rates
                of AV products under Windows 2000:
-----------------------------------------------------------------
 NO W2k product is overall rated "perfect":  ---
  1 "excellent" overall scanner:             SCN
  0 "very good" overall scanners:            ---
 -------------------------------------------------

Findings W2k.4: Performance of W2k scanners by virus classes
                of AV products under Windows 2000:
------------------------------------------------------------
 Perfect scanners for file zoo:     ---
 Excellent scanners for file zoo:   AVK,AVP,FSE,PAV,SCN,FPR,RAV,NAV
 Very Good scanners for file zoo:   CMD,SWP,INO,AVA,NVC
 ------------------------------------------------------------
 Perfect scanners for macro zoo:    ---
 Excellent scanners for macro zoo:  AVK,AVP,FSE,PAV,SCN,CMD,INO,FPR,
                                    NAV,RAV,SWP,DRW,NVC
 Very Good scanners for macro zoo:  VBR,BDF,AVG,AVA,ANT,AVA,IKA
 ------------------------------------------------------------
 Perfect scanners for script zoo:   FSE,SCN
 Excellent scanners for script zoo: AVK,AVP,PAV,RAV,FPR
 Very Good scanners for script zoo: NAV,CMD,INO,DRW,SWP
 ------------------------------------------------------------

Findings W2k.5: Detection of packed viral (ITW) objects
                of AV products under Windows 2000:
--------------------------------------------------------
 Concerning detection of packed file AND macro viruses:
    NO product is "perfect":     ---
     1 product is "excellent":   SCN
    NO product is "very good":   ---
 -------------------------------------------------------
 Concerning detection of packed FILE ITW viruses:
     0 products are "perfect":   ---
     1 product is "excellent":   SCN
     0 products are "very good": ---
 -------------------------------------------------------
 Concerning detection of packed MACRO viruses:
     5 products are "perfect":   AVK,AVP,BDF,FSE,PAV
     5 products are "excellent": AVA,DRW,QHL,RAV,SCN
     5 products are "very good": CMD,FPR,INO,NAV,SWP
 -------------------------------------------------------
 Concerning EQUAL detection of UNPACKED AND PACKED ITW file AND
 macro viruses:
     2 "perfect" products have NO LOSS for 6 packers:   AVP,FSE
     3 "excellent" products have NO LOSS for 5 packers: PAV,RAV,SCN
     1 "very good" product has NO LOSS for 4 packers:   SWP
 -------------------------------------------------------

Findings W2k.6: Avoidance of False Alarms
                of AV products under Windows 2000:
--------------------------------------------------
Findings W2k.06: Avoidance of False-Positive Alarms is rather well
developed, at least for file-FP avoidance.
 11 Overall FP-avoiding "perfect" W2k scanners:
      ANT,AVA,AVG,BDF,GLA,INO,NAV,PRO,SCN,SWP,VSP
  7 more products are "excellent":
      AVK,AVP,CMD,FPR,FSE,PAV,RAV
 ---------------------------------------------------
 Concerning file-FP avoidance, ALL 25 products are "perfect":
      ANT,AVA,AVG,AVK,AVP,BDF,CMD,DRW,FIR,FPR,FSE,GLA,IKA,INO,NAV,
      NVC,PAV,PER,PRO,QHL,RAV,SCN,SWP,VBR,VSP
 ---------------------------------------------------
 Concerning macro-FP avoidance, 11 products are "perfect":
      ANT,AVA,AVG,BDF,GLA,INO,NAV,PRO,SCN,SWP,VSP
   7 products are "excellent":
      AVK,AVP,CMD,FPR,FSE,PAV,RAV
 ---------------------------------------------------

Findings W2k.7: Detection rates for file/macro/script malware
                of AV products under Windows 2000:
-------------------------------------------------------------
Findings W2k.07: File and macro malware detection under W2k is less
developed compared to the last test, whereas script malware
detection is improving.
 Concerning overall malware detection:
     0 products are "perfect":   ---
     5 products are "excellent": FSE,AVK,PAV,AVP,SCN
     2 products are "very good": NAV,FPR
 --------------------------------------------------
 Concerning only file malware detection:
     0 products are "perfect":         ---
     7 products are "excellent":       FSE,PAV,AVK,FPR,AVP,SCN,NAV
     3 products are rated "very good": CMD,SWP,INO
 --------------------------------------------------
 Concerning only macro malware detection:
     5 products are "perfect":         AVK,AVP,CMD,FSE,PAV
    11 products are "excellent":       SCN,INO,RAV,DRW,NAV,SWP,NVC,
                                       BDF,AVA,IKA,VBR
     1 product is rated "very good":   ANT
 --------------------------------------------------
 Concerning only script malware detection:
     0 products are "perfect":         ---
     6 products are "excellent":       AVK,FSE,AVP,SCN,PAV,RAV
     3 products are rated "very good": NAV,FPR,AVA
 --------------------------------------------------

Grading W2k products according to their detection performance:
==============================================================

Under the scope of VTC's grading system (see 4), we summarize our
results for W2k-related scanners:

 *****************************************************************
 In VTC test "2004-07", we found  *** NO perfect W2k AV product ***
               and we found       *** NO perfect W2k AM product ***
 *****************************************************************

But several products seem to approach our definition on a rather
high level (taking into account that "perfect" is defined at the
100% level, and "excellent" at 99% for virus detection and 90% for
malware detection):

Test category:           "Perfect"              "Excellent"
-----------------------------------------------------------------
W2k file ITW test:       SCN                    AVK,AVP,DRW,FPR,FSE,NAV,
                                                PAV,RAV,INO,SWP,AVA
W2k macro ITW test:      ANT,AVK,AVP,BDF,DRW,   AVG,CMD,FPR,INO,RAV,
                         FSE,NAV,PAV,SCN,SWP    AVA,IKA,PRO
W2k script ITW test:     AVK,AVP,FSE,NAV,       FPR,INO,RAV
                         PAV,SCN
------------------------------------------------------------------
W2k file zoo test:       ---                    AVK,AVP,FSE,PAV,SCN,
                                                FPR,RAV,NAV
W2k macro zoo test:      ---                    AVK,AVP,FSE,PAV,SCN,CMD,
                                                INO,FPR,NAV,RAV,SWP,DRW,NVC
W2k script zoo test:     FSE,SCN                AVK,AVP,PAV,RAV,FPR
------------------------------------------------------------------
W2k file pack test:      ---                    SCN
W2k macro pack test:     AVK,AVP,BDF,FSE,PAV    AVA,DRW,QHL,RAV,SCN
+ W2k pack/unpack test:  AVP,FSE                PAV,RAV,SCN
------------------------------------------------------------------
W2k file FP avoidance:   ANT,AVA,AVG,AVK,       ---
                         AVP,BDF,CMD,DRW,FIR,FPR,FSE,
                         GLA,IKA,INO,NAV,NVC,PAV,PER,
                         PRO,QHL,RAV,SCN,SWP,VBR,VSP
W2k macro FP avoidance:  ANT,AVA,AVG,BDF,       AVK,AVP,CMD,FPR,
                         GLA,INO,NAV,PRO,       FSE,PAV,RAV
                         SCN,SWP,VSP
------------------------------------------------------------------
W2k file malware test:   ---                    FSE,PAV,AVK,FPR,
                                                AVP,SCN,NAV
W2k macro malware test:  AVK,AVP,CMD,FSE,PAV    SCN,INO,RAV,DRW,NAV,
                                                SWP,NVC,BDF,AVA,IKA,VBR
W2k script malware test: ---                    AVK,FSE,AVP,SCN,PAV,RAV
------------------------------------------------------------------

In order to support the race for more customer protection, we
evaluate the order of performance in this W2k test with a simple
algorithm, by counting placements (weighting "perfect" twice and
"excellent" once), for the first places:

 ************************************************************
 "Perfect" W-2000 AntiVirus product:     =NONE=   (22 points)
 "Excellent" W-2000 AV products:
        1st place:  SCN            (17 points)
        2nd place:  FSE            (16 points)
        3rd place:  AVP            (15 points)
        4th place:  AVK,PAV        (13 points)
        6th place:  NAV,RAV        (11 points)
        8th place:  FPR            ( 9 points)
        9th place:  BDF,INO,SWP    ( 8 points)
       12th place:  AVA,DRW        ( 7 points)
       14th place:  ANT            ( 6 points)
       15th place:  AVG,CMD,PRO    ( 5 points)
       18th place:  GLA,VSP        ( 4 points)
       20th place:  IKA,NVC,QHL    ( 3 points)
       22nd place:  FIR,PER,VBR    ( 2 points)
 ************************************************************
 "Perfect" W-2000 AntiMalware product:   =NONE=   (28 points)
 "Excellent" W-2000 AntiMalware products:
        1st place:  SCN,FSE        (20 points)
        3rd place:  AVP            (19 points)
        4th place:  AVK,PAV        (17 points)
        6th place:  NAV,RAV        (13 points)
        8th place:  FPR            (10 points)
        9th place:  BDF,INO,SWP    ( 9 points)
       12th place:  AVA,DRW        ( 8 points)
       14th place:  CMD            ( 7 points)
       15th place:  IKA,NVC        ( 4 points)
       17th place:  VBR            ( 3 points)
 ************************************************************

6. Results of on-demand detection under Windows-XP (WXP):
=========================================================

This is a summary of the essential findings for AV/AM products under
WXP. For details see 7jevawxp.txt.

Meant as a perspective on product results, the following table
(WXP-A) lists all results of WXP scanners for zoo detection of file,
macro and script viruses in the last 2 VTC tests. Moreover,
differences ("delta") in the resp. detection rates are given for
those products which participated in both tests, and mean values are
calculated.
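Before turning to the WXP figures, here is a minimal sketch (Python)
of the place-counting scheme used for the rankings above and again at
the end of this section; the category lists in the example are a
small, hypothetical excerpt, not the full grading grid.

------------------------------------------------------------------------
# Minimal sketch of the place-counting scheme (Python): every "perfect"
# placement of a product counts 2 points, every "excellent" placement
# 1 point; products are then ranked by their point totals.

from collections import Counter

def rank(categories):
    """categories: list of {'perfect': [...], 'excellent': [...]} dicts."""
    points = Counter()
    for cat in categories:
        for product in cat.get("perfect", []):
            points[product] += 2
        for product in cat.get("excellent", []):
            points[product] += 1
    return points.most_common()

example = [
    {"perfect": ["SCN"], "excellent": ["AVK", "AVP", "FSE"]},  # e.g. file ITW
    {"perfect": ["FSE", "SCN"], "excellent": ["AVK", "AVP"]},  # e.g. script zoo
]
print(rank(example))   # -> [('SCN', 4), ('FSE', 3), ('AVK', 2), ('AVP', 2)]
------------------------------------------------------------------------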
Table WXP-A: Comparison: File/Macro/Script Virus Detection Rate:
================================================================
Scan I === File Virus ===  I === Macro Virus ==  I == Script Virus ==
ner  I     Detection       I      Detection      I     Detection
-----+---------------------+---------------------+---------------------
Test I 0304  0407  Delta   I 0304  0407  Delta   I 0304  0407  Delta
-----+---------------------+---------------------+---------------------
ANT  I   -   90.9    -     I   -   97.9    -     I   -   87.5    -
AVA  I   -   95.7    -     I   -   97.9    -     I   -   88.8    -
AVG  I   -   79.8    -     I   -   98.0    -     I   -   68.2    -
AVK  I   -   100~    -     I   -   100~    -     I   -   99.8    -
AVP  I 100~  100~   0.0    I 100~  100~   0.0    I 98.9  99.7  +0.8
BDF  I 82.9  84.4  +1.5    I 99.0  98.1  -0.9    I 72.4  94.3 +21.9
CMD  I 98.5  98.6  +0.1    I 99.9  99.9   0.0    I 89.1  98.5  +9.4
DRW  I 98.3  77.8 -20.5    I 99.4  99.4   0.0    I 94.7  95.4  +0.7
FIR  I   -   75.4    -     I   -   85.9    -     I   -   75.9    -
FPR  I   -   99.5    -     I   -   99.9    -     I   -   99.3    -
FSE  I 100~  100~   0.0    I 100~  100~   0.0    I 99.5  100%  +0.5
GLA  I   -   40.7    -     I   -    1.5    -     I   -   49.5    -
IKA  I   -   90.5    -     I   -   76.5    -     I   -   91.7    -
INO  I 98.7  95.8  -2.9    I 99.9  99.9   0.0    I 94.7  97.5  +2.8
NAV  I 98.3  99.3  +1.0    I 99.6  99.9  +0.3    I 96.8  98.7  +1.9
NVC  I 97.8  95.0  -2.8    I 99.8  99.2  -0.6    I 87.6  86.3  -1.3
PAV  I   -   100~    -     I   -   100~    -     I   -   99.7    -
PER  I   -   35.9    -     I   -   69.8    -     I   -   22.9    -
PRO  I   -   67.2    -     I   -   73.1    -     I   -   70.4    -
QHL  I   -   56.0    -     I   -     -     -     I   -   29.1    -
RAV  I 96.7  99.4  +2.7    I 99.9  99.8  -0.1    I 96.1  99.7  +3.6
SCN  I 99.8  100~  +0.2    I 100%  100~  -0.0    I 99.6  100%  +0.4
SWP  I   -   98.2    -     I   -   99.7    -     I   -   96.8    -
VBR  I   -   68.5    -     I   -   98.4    -     I   -   46.4    -
VSP  I   -   14.6    -     I   -    0.1    -     I   -   83.5    -
-----+---------------------+---------------------+---------------------
Mean : 97.1  82.6% -2.1%   I 99.8  87.3%  -0.1   I 92.9  83.2%  +3.9%
>10% : 97.1  82.6% -2.1%   I 99.8  95.2%  -0.1   I 92.9  83.2%  +3.9%
-----+---------------------+---------------------+---------------------
Remark: for abbreviations of products (code names), see appendix
        A5CodNam.txt.

Concerning ZOO file virus detection under WXP, the "mean" detection
rate is reduced to an unacceptably low level (82.6%). Even those
products which participated in the last test did not significantly
improve their (low) detection rates (the mean loss is essentially due
to a large loss of one product).

Concerning zoo macro virus detection under WXP, the "mean" detection
rate is significantly reduced, to an unacceptably low level (<90%).
When those products with extremely low detection rates are not
counted, the mean result is slightly more promising.

Concerning zoo script virus detection under WXP, rates are
significantly reduced. The mean rate of those products which
participated in the last test developed well, but the gain (+3.9%)
resulted essentially from the large improvement of 1 product.
Findings WXP.1: General development of zoo virus detection rates
                of AV products under Windows XP:
----------------------------------------------------------------
In comparison with the last test (where 10 good products were
selected), the mean detection rate is significantly reduced (due to
some products with very bad detection rates):
     mean file zoo virus detection rate:    82.6%
     mean macro virus detection rate:       87.3%
     mean script virus detection rate:      83.2%
 --------------------------------------------------
 Concerning detection of zoo viruses in all categories (file, macro
 and script), no product is "perfect", but 3 are "excellent" as they
 detect at least 99% of all zoo viruses: FSE,SCN,AVP
 --------------------------------------------------
 Concerning file zoo viruses:
    NO product detects ALL viruses ("perfect")
    3 products detect more than 99% and are rated "excellent":
       AVP,FSE,SCN
 --------------------------------------------------
 Concerning macro zoo viruses:
    NO product detects ALL macro zoo viruses in all files and is
       rated "perfect": ---
    13 products detect at least 99% of macro viruses and are rated
       "excellent": AVK,AVP,FSE,PAV,SCN,CMD,
                    FPR,INO,NAV,RAV,SWP,DRW,NVC
 --------------------------------------------------
 Concerning script zoo viruses:
    2 products detect ALL viruses and are rated "perfect": FSE,SCN
    5 products detect at least 99% of script viruses and are rated
      "excellent": AVK,AVP,PAV,RAV,FPR
 --------------------------------------------------

Findings WXP.2: Development of ITW file/macro/script virus detection
                rates of AV products under Windows XP:
---------------------------------------------------------------------
The mean detection rate of ITW file/macro/script viruses has
significantly decreased in comparison with the last test:
   Mean detection rate of
      - ITW file viruses:    91.5%  (last test:  99.8%)
      - ITW macro viruses:   91.5%  (last test: 100%)
      - ITW script viruses:  95.1%  (last test: 100%)
 -------------------------------------------------
 1 AV product (out of 25) detects ALL In-The-Wild file, macro and
   script viruses in ALL instantiations (files) and is rated
   "perfect": SCN
 5 more scanners miss one ITW file virus but detect all ITW macro
   and script viruses: AVK,AVP,FSE,NAV,PAV
 -------------------------------------------------
 Concerning detection of ITW file viruses only:
    1 "perfect" scanner (last test: 5): SCN
   11 next best scanners (unrated):
      AVK,AVP,DRW,FPR,FSE,NAV,PAV,RAV,INO,SWP,AVA
 -------------------------------------------------
 Concerning detection of ITW macro viruses only:
   10 "perfect" scanners (last test: 6):
      ANT,AVK,AVP,BDF,DRW,FSE,NAV,PAV,SCN,SWP
    8 "excellent" scanners: AVG,CMD,FPR,INO,RAV,AVA,IKA,PRO
 -------------------------------------------------
 Concerning detection of ITW script viruses:
    6 "perfect" scanners (last test: 8): AVK,AVP,FSE,NAV,PAV,SCN
    3 more "excellent" scanners: FPR,INO,RAV
 -------------------------------------------------

Findings WXP.3: Assessment of overall (ITW/zoo) detection rates
                of AV products under Windows XP:
------------------------------------------------------------------
 NO WXP product is overall rated "perfect":  ---
  1 "excellent" overall scanner:             SCN
 NO "very good" overall scanners:            ---
 --------------------------------------------------

Findings WXP.4: Performance of WXP scanners by virus classes
                of AV products under Windows XP:
------------------------------------------------------------------
 0 Perfect scanners for file zoo:               ---
 8 Excellent scanners for file zoo (last time: 5):
      AVK,AVP,FSE,PAV,SCN,FPR,RAV,NAV
 5 Very Good scanners for file zoo:
      CMD,SWP,INO,AVA,NVC
 -------------------------------------------------------------
 0 Perfect scanners for macro zoo (last time: 1):    ---
 13 Excellent scanners for macro zoo (last time: 5):
      AVK,AVP,FSE,PAV,SCN,CMD,INO,FPR,NAV,RAV,SWP,DRW,NVC
 7 Very Good scanners for macro zoo (last time: 0):
      VBR,BDF,AVG,AVA,ANT,AVA,IKA
 -------------------------------------------------------------
 2 Perfect scanners for script zoo (last time: 0):   FSE,SCN
 5 Excellent scanners for script zoo (last time: 2):
      AVK,AVP,PAV,RAV,FPR
 5 Very Good scanners for script zoo (last time: 3):
      NAV,CMD,INO,DRW,SWP
 -------------------------------------------------------------

Findings WXP.5: Detection of packed viral (ITW) objects
                of AV products under Windows XP:
--------------------------------------------------------
 Concerning OVERALL detection of packed file AND macro viruses:
    NO product is "perfect":              ---
     1 product is "excellent":            SCN
    NO product is "very good" or "good":  ---
 -------------------------------------------------------
 Concerning detection of packed FILE viruses:
    NO product is "perfect":    ---
     1 product is "excellent":  SCN
 -------------------------------------------------------
 Concerning detection of packed MACRO viruses:
     5 products are "perfect":   AVK,AVP,BDF,FSE,PAV
     5 products are "excellent": AVA,DRW,QHL,RAV,SCN
     2 products are "very good": FPR,INO
 -------------------------------------------------------
 Concerning EQUAL detection of UNPACKED AND PACKED ITW file AND
 macro viruses:
     2 "perfect" products have NO LOSS for 6 packers:   AVP,FSE
     3 "excellent" products have NO LOSS for 5 packers: PAV,RAV,SCN
     1 "very good" product has NO LOSS for 4 packers:   SWP
     2 "good" products have NO LOSS for 3 packers:      CMD,FPR
 -------------------------------------------------------

Findings WXP.6: Avoidance of False Alarms
                of AV products under Windows XP:
------------------------------------------------
Avoidance of False-Positive Alarms is rather well developed, at
least for file-FP avoidance.
 11 Overall FP-avoiding "perfect" WXP scanners:
      ANT,AVA,AVG,BDF,GLA,INO,NAV,PRO,SCN,SWP,VSP
  7 more products are "excellent":
      AVK,AVP,CMD,FPR,FSE,PAV,RAV
 ---------------------------------------------------
 Concerning file-FP avoidance, ALL 25 products are "perfect":
      ANT,AVA,AVG,AVK,AVP,BDF,CMD,DRW,FIR,FPR,FSE,GLA,IKA,INO,NAV,
      NVC,PAV,PER,PRO,QHL,RAV,SCN,SWP,VBR,VSP
 ---------------------------------------------------
 Concerning macro-FP avoidance, 11 products are "perfect":
      ANT,AVA,AVG,BDF,GLA,INO,NAV,PRO,SCN,SWP,VSP
   7 products are "excellent":
      AVK,AVP,CMD,FPR,FSE,PAV,RAV
 ---------------------------------------------------

Findings WXP.7: Detection rates for file/macro/script malware
                of AV products under Windows XP:
--------------------------------------------------------------
Mean file/macro/script malware detection rates have further
deteriorated since the last test, to an unacceptably low level.
 Mean detection rates:
     for file malware:    68.2%  (73.6% for scanners >10%)
     for macro malware:   84.2%  (91.4% for scanners >10%)
     for script malware:  62.0%  (64.4% for scanners >10%)
 ----------------------------------------------------
 Concerning ALL products, file/macro/script malware detection under
 WXP needs significant improvement:
     0 products are "perfect":    ---
     6 products are "excellent":  FSE,AVK,AVP,PAV,SCN,RAV
     3 products are "very good":  NAV,FPR,AVA
 ----------------------------------------------------
 Concerning only file malware detection:
     0 products are "perfect":         ---
     8 products are "excellent":       FSE,AVK,AVP,PAV,FPR,SCN,NAV,RAV
     3 products are rated "very good": CMD,SWP,INO
 ----------------------------------------------------
 Concerning only macro malware detection:
     6 products are "perfect":         AVK,AVP,CMD,FPR,FSE,PAV
    11 products are "excellent":       SCN,INO,RAV,DRW,NAV,SWP,NVC,
                                       BDF,AVA,VBR,IKA
     1 product is rated "very good":   ANT
 ----------------------------------------------------
 Concerning only script malware detection:
     0 products are "perfect":         ---
     6 products are "excellent":       AVK,FSE,AVP,SCN,PAV,RAV
     3 products are rated "very good": NAV,FPR,AVA
 ----------------------------------------------------

Grading WXP products according to their detection performance:
==============================================================

Under the scope of VTC's grading system (see 4), we summarize our
results for WXP-related scanners:

 *****************************************************************
 In VTC test "2004-07", we found  *** NO perfect WXP AV product ***
               and we found       *** NO perfect WXP AM product ***
 *****************************************************************

But several products seem to approach our definition on a rather
high level (taking into account that "perfect" is defined at the
100% level, and "excellent" at 99% for virus detection and 90% for
malware detection):

Test category:           "Perfect"              "Excellent"
------------------------------------------------------------------
WXP file ITW test:       SCN                    AVK,AVP,DRW,FPR,FSE,NAV,
                                                PAV,RAV,INO,SWP,AVA
WXP macro ITW test:      ANT,AVK,AVP,BDF,DRW,   AVG,CMD,FPR,INO,RAV,
                         FSE,NAV,PAV,SCN,SWP    AVA,IKA,PRO
WXP script ITW test:     AVK,AVP,FSE,NAV,       FPR,INO,RAV
                         PAV,SCN
------------------------------------------------------------------
WXP file zoo test:       ---                    AVK,AVP,FSE,PAV,SCN,
                                                FPR,RAV,NAV
WXP macro zoo test:      ---                    AVK,AVP,FSE,PAV,SCN,CMD,
                                                INO,FPR,NAV,RAV,SWP,DRW,NVC
WXP script zoo test:     FSE,SCN                AVK,AVP,PAV,RAV,FPR
------------------------------------------------------------------
WXP file pack test:      ---                    SCN
WXP macro pack test:     AVK,AVP,BDF,FSE,PAV    AVA,DRW,QHL,RAV,SCN
+ WXP pack/unpack test:  AVP,FSE                PAV,RAV,SCN
------------------------------------------------------------------
WXP file FP avoidance:   ANT,AVA,AVG,AVK,       ---
                         AVP,BDF,CMD,DRW,FIR,FPR,FSE,
                         GLA,IKA,INO,NAV,NVC,PAV,PER,
                         PRO,QHL,RAV,SCN,SWP,VBR,VSP
WXP macro FP avoidance:  ANT,AVA,AVG,BDF,       AVK,AVP,CMD,FPR,
                         GLA,INO,NAV,PRO,       FSE,PAV,RAV
                         SCN,SWP,VSP
------------------------------------------------------------------
WXP file malware test:   ---                    FSE,PAV,AVK,FPR,
                                                AVP,SCN,NAV,RAV
WXP macro malware test:  AVK,AVP,CMD,FSE,PAV    SCN,INO,RAV,DRW,NAV,
                                                SWP,NVC,BDF,AVA,IKA,VBR
WXP script malware test: ---                    AVK,FSE,AVP,SCN,PAV,RAV
------------------------------------------------------------------

In order to support the race for more customer protection, we
evaluate the order of performance in this WXP test with a simple
algorithm, by counting placements (weighting "perfect" twice and
"excellent" once),
for the first places:

 ************************************************************
 "Perfect" Windows-XP AntiVirus product:   =NONE=  (22 points)
 "Excellent" Windows-XP AV products:
        1st place:  SCN            (17 points)
        2nd place:  FSE            (16 points)
        3rd place:  AVP            (15 points)
        4th place:  AVK,PAV        (13 points)
        6th place:  NAV,RAV        (11 points)
        8th place:  FPR            ( 9 points)
        9th place:  BDF,INO,SWP    ( 8 points)
       12th place:  AVA,DRW        ( 7 points)
       14th place:  ANT            ( 6 points)
       15th place:  AVG,CMD,PRO    ( 5 points)
       18th place:  GLA,VSP        ( 4 points)
       20th place:  IKA,NVC,QHL    ( 3 points)
       22nd place:  FIR,PER,VBR    ( 2 points)
 ************************************************************
 "Perfect" Windows-XP AntiMalware product: =NONE=  (28 points)
 "Excellent" Windows-XP AntiMalware products:
        1st place:  SCN,FSE        (20 points)
        3rd place:  AVP            (19 points)
        4th place:  AVK,PAV        (17 points)
        6th place:  RAV            (14 points)
        7th place:  NAV            (13 points)
        8th place:  FPR            (10 points)
        9th place:  BDF,INO,SWP    ( 9 points)
       12th place:  AVA,DRW        ( 8 points)
       14th place:  CMD            ( 7 points)
       15th place:  IKA,NVC        ( 4 points)
       17th place:  VBR            ( 3 points)
 ************************************************************

7. Comparison of detection results under Windows-32 platforms:
==============================================================

This is a summary of the comparison of AV/AM products under different
W32 platforms (W2k and WXP). For details see 7mevaw32.txt.

With the fast deployment of new versions of Microsoft Windows-32 (in
the past 5 years from W-NT to W-95, W-98, W-2000 and W-XP, and
beyond), customers needing protection and producers of
security-enhancing software (AntiVirus and AntiMalware) can only cope
with the pace if they essentially reuse engines prepared for previous
W32 platforms and simply "adapt" them to the intrinsics of the new
platforms. Otherwise, "rewriting" the resp. software would consume
too much time and effort, and customers would receive "adapted"
products only with some delay.

AV/AM testers cannot determine the characteristics of the algorithms
in scanning engines, either for legal reasons (most copyright laws
prohibit reverse-engineering of proprietary code, except for specific
purposes such as collecting evidence for a court case or teaching
related techniques, as in the Hamburg university IT Security
curriculum), or because of the complexity of the related code (and,
in many cases, insufficient professional knowledge of testers). It is
therefore worthwhile to analyse whether those AV/AM products,
versions of which are available for all W32 platforms, behave EQUALLY
concerning detection and identification of viral and malicious code.

Test Hypothesis: "W32-harmonical" behaviour of W32 products:
============================================================
We assume that those products which participate for all W32 platforms
in this test (W2k and WXP) shall yield IDENTICAL results in ALL
categories (argument for this assumption: the likelihood of reuse of
engines running on the same platform). We call product behaviour
following this hypothesis "W32-harmonical".

Eval W32-Harmonicity: Equality of results for all W32 platforms:
================================================================
In comparison with the last VTC test, not much progress can be
reported.
 Equal detection                 this test     last test
 ------------------------------+-------------+------------
 of zoo file viruses:            21 (of 25)     9 (of 18)
 of zoo infected files:          20 (of 25)     7 (of 18)
 of ITW file viruses:           ALL (of 25)    17 (of 18)
 of ITW infected files:         ALL (of 25)    17 (of 18)
 of zoo file malware:            20 (of 25)    13 (of 18)

 Equal detection                 this test     last test
 ------------------------------+-------------+------------
 of zoo macro viruses:           22 (of 25)    16 (of 18)
 of zoo infected macro objects:  22 (of 25)    16 (of 18)
 of ITW macro viruses:          ALL (of 25)   ALL (of 18)
 of ITW infected macro files:   ALL (of 25)   ALL (of 18)
 of zoo macro malware:          ALL (of 25)    15 (of 18)

 Equal detection                 this test     last test
 ------------------------------+-------------+------------
 of zoo script viruses:          24 (of 25)    16 (of 18)
 of zoo script viral objects:    24 (of 25)    15 (of 18)
 of ITW script viruses:         ALL (of 25)   ALL (of 18)
 of ITW script viral objects:   ALL (of 25)   ALL (of 18)
 of ITW script malware:         ALL (of 25)    16 (of 18)

Findings W32-Harmonicity:
--------------------------------------------------------------
 Concerning detection of FILE viruses: many though not all
 (20 of 25) products behave "W32-harmonically" in all categories:
      ANT,AVA,AVG,AVK,BDF,CMD,FIR,FPR,FSE,GLA,
      INO,NAV,PAV,PER,PRO,RAV,SCN,SWP,VBR,VSP
 Concerning file malware detection, also many though not all
 (20 of 25) products behave "W32-harmonically" in all categories:
      ANT,AVG,AVK,BDF,CMD,DRW,FIR,FPR,FSE,GLA,
      IKA,INO,NAV,PAV,PER,PRO,SCN,SWP,VBR,VSP
 --------------------------------------------------------------
 Concerning detection of MACRO viruses: many though not all
 (22 of 25) products behave "W32-harmonically" in all categories:
      ANT,AVA,AVG,AVK,AVP,BDF,CMD,DRW,FIR,FPR,FSE,
      GLA,INO,NAV,NVC,PAV,PER,PRO,RAV,SCN,SWP,VSP
 Concerning macro malware detection, ALL 25 products behave in
 W32-harmonical form:
      ANT,AVA,AVG,AVK,AVP,BDF,CMD,DRW,FIR,FPR,FSE,GLA,
      IKA,INO,NAV,NVC,PAV,PER,PRO,QHL,RAV,SCN,SWP,VBR,VSP
 --------------------------------------------------------------
 Concerning detection of SCRIPT viruses: ALMOST ALL (24 of 25)
 products behave "W32-harmonically" in all categories:
      ANT,AVG,AVK,AVP,BDF,CMD,DRW,FIR,FPR,FSE,GLA,IKA,
      INO,NAV,NVC,PAV,PER,PRO,QHL,RAV,SCN,SWP,VBR,VSP
 Concerning script malware detection: ALL 25 products behave in
 W32-harmonical form:
      ANT,AVA,AVG,AVK,AVP,BDF,CMD,DRW,FIR,FPR,FSE,GLA,
      IKA,INO,NAV,NVC,PAV,PER,PRO,QHL,RAV,SCN,SWP,VBR,VSP
 --------------------------------------------------------------

Conclusion: regarding the economy of AV/AM testing, it seems
sufficient to include only AV/AM products at the upper end of the W32
development chain (presently Windows XP).

Grading W32-Harmonicity: Grading of W32-harmonical products:
============================================================
The following grid is used to grade W32 products concerning their
ability to deliver IDENTICAL detection for ALL categories on ALL W32
platforms (a minimal sketch of this check follows below):

A "perfect" W32-harmonical AV product will yield IDENTICAL results
for all categories (file, macro and script viruses).
(Assigned value: 5)

A "perfect" W32-harmonical AM product will be a perfect AV product
and also yield IDENTICAL results for all malware categories (file,
macro and script malware).
(Assigned value: 2)
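The check and the assigned values can be written down compactly. The
following minimal sketch (Python) uses hypothetical result
dictionaries (category name -> detection rate) for the two W32
platforms; it is an illustration of the grid above, not the
evaluation code actually used.

------------------------------------------------------------------------
# Minimal sketch of the W32-harmonicity check (Python); result
# dictionaries and numbers are hypothetical examples.

VIRUS_CATS   = ["file_zoo", "macro_zoo", "script_zoo"]   # AV part (value 5)
MALWARE_CATS = ["file_mal", "macro_mal", "script_mal"]   # AM part (value 2)

def harmonical(w2k, wxp, categories):
    """True if both W32 platforms report identical rates in all categories."""
    return all(w2k.get(c) == wxp.get(c) for c in categories)

def harmonicity_points(w2k, wxp):
    points = 0
    if harmonical(w2k, wxp, VIRUS_CATS):
        points += 5                  # "perfect" W32-harmonical AV product
        if harmonical(w2k, wxp, MALWARE_CATS):
            points += 2              # ... and W32-harmonical AM product too
    return points

# a hypothetical product whose engine is reused unchanged on both platforms:
same = {"file_zoo": 99.5, "macro_zoo": 99.9, "script_zoo": 99.3,
        "file_mal": 95.0, "macro_mal": 99.0, "script_mal": 90.0}
print(harmonicity_points(same, dict(same)))   # -> 7 (= 5 + 2)
------------------------------------------------------------------------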
Grading W32-harmonical AntiVirus products:
===========================================================
Grade: "Perfect" W32-harmonical detection:
       ANT,AVG,AVK,BDF,CMD,FIR,FPR,FSE,
       GLA,IKA,INO,NAV,PAV,PER,PRO,SCN,SWP,VSP
===========================================================

Grading W32-harmonical AntiMalware products:
===========================================================
Grade: "Perfect" W32-harmonical detection:
       ANT,AVG,AVK,BDF,CMD,FIR,FPR,FSE,
       GLA,IKA,INO,NAV,PAV,PER,PRO,SCN,SWP,VSP
===========================================================

*************************************************************
"Perfect" W32-harmonical AntiVirus products:
   1st place: ANT,AVG,AVK,BDF,CMD,FIR,FPR,FSE,
              GLA,IKA,INO,NAV,PAV,PER,PRO,SCN,SWP,VSP  (5 points)
*************************************************************
"Perfect" W32-harmonical AntiMalware products:
   1st place: ANT,AVG,AVK,BDF,CMD,FIR,FPR,FSE,
              GLA,IKA,INO,NAV,PAV,PER,PRO,SCN,SWP,VSP  (7 points)
*************************************************************

8. Results of on-demand detection under LINUX (SuSE):
=====================================================

This is a summary of the essential findings for AV/AM products under
LINUX. For details see 7evalin.txt.

To give a perspective on the development of product results, the
following table (LIN-A) lists all results of LINUX scanners for zoo
detection of (file), macro and script viruses in the last 3 VTC
tests. Moreover, differences ("delta") in the resp. detection rates
are given for those products which participated in the last 2 tests,
and mean values are calculated.

Table Lin.A: Performance of LINUX scanners in tests 2001-04 until 2004-07:
==========================================================================

Table Lin.A1: Detection performance for file viruses:
=====================================================
Scan I ==== File Virus ====
ner  I      Detection
-----+-------------------------
Test I 0104  0212  0407  Delta
-----+-------------------------
ANT  I   -     -   90.9    -
AVK  I   -   99.9    -     -
AVP  I 99.9    -   100~    -
CLA  I   -     -   34.8    -
CMD  I 97.8  99.1  99.1   0.0
DRW  I   -   98.3  79.0 -19.3
FPR  I   -   98.9  99.6  +0.7
FSE  I 97.1  98.0  100~  +2.0
INO  I   -     -   95.4    -
MCV  I   -     -     -     -
OAV  I   -    9.1  34.1 +25.0
RAV  I 93.5  96.7    -     -
SCN  I 99.7  99.8  100~  +0.2
SWP  I   -     -   98.2    -
-----+-------------------------
Mean :  97.6  87.5  84.7  -2.8
>10% :  97.6  98.7  84.7 -14.0
-----+-------------------------

Table Lin.A2: Detection performance for macro and script viruses:
==================================================================
Scan I ===== Macro Virus ======      I ===== Script Virus ======
ner  I       Detection               I       Detection
-----+-------------------------------+------------------------------
Test I 0104  0110  0212  0407  Delta I 0104  0110  0212  0407  Delta
-----+-------------------------------+------------------------------
ANT  I   -   97.1    -   97.9    -   I   -   81.8    -   87.6    -
AVK  I   -   100%  100~    -     -   I   -   100%  99.1    -     -
AVP  I 100~  100%    -   100~    -   I 99.8  100%    -   99.7    -
CLA  I   -     -     -    0.5    -   I   -     -     -   27.1    -
CMD  I 100%  100%  100~  99.9  -0.1  I 96.9  94.2  89.4  99.4 +10.0
DRW  I   -   99.5  99.4  99.4   0.0  I   -   95.4  94.7  95.4  +0.7
FPR  I   -     -   100~  99.9  -0.1  I   -     -   88.7  99.4 +10.7
FSE  I 100~  100%  100~  100%  +0.0  I 96.9  92.3  88.1  99.9 +11.8
INO  I   -     -     -   99.3    -   I   -     -     -   96.6    -
MCV  I   -    9.1    -     -     -   I   -   27.6    -     -     -
OAV  I   -     -    0.1   0.1   0.0  I   -     -   13.9  27.1 +13.2
RAV  I 99.6  99.5  99.9    -     -   I 84.9  82.5  96.1    -     -
SCN  I 100%  100%  100%  100~  -0.0  I 99.8  99.8  99.5  99.8  +0.3
SWP  I   -     -     -   99.8    -   I   -     -     -   97.2    -
-----+-------------------------------+------------------------------
Mean :  99.9  89.5  74.9  81.6  +6.7 I 95.7  86.0  83.7  84.5  +0.8
>10% :  99.9  99.5  99.8  99.6  -0.2 I 95.7  86.0  83.7  84.5  +0.8
-----+-------------------------------+------------------------------
Remark: for abbreviations of products (code names), see appendix
        A5CodNam.txt.

Evaluation:
Zoo file virus detection is stable for most scanners, but the mean
  value is reduced by 3 unacceptable results (84.7%).
Zoo macro virus detection is stable on a very high level (99.6%)
  when a new product with an unacceptably bad rate is not counted.
Zoo script virus detection has significantly improved (84.5%) but is
  still on an insufficient level.
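The "Mean" and ">10%" rows of tables Lin.A1/A2 can be read as follows
(this is our reading of the table, sketched below in Python, not an
official VTC definition): "Mean" averages all available results in a
column, ">10%" averages only those results above 10%, so that extreme
outliers do not distort the picture. The values used in the sketch
are the 0212 file-virus column of table Lin.A1, which reproduces the
published 87.5 and 98.7 after rounding.

    # a minimal sketch of our reading of the "Mean" and ">10%" rows
    rates_0212 = [99.9, 99.1, 98.3, 98.9, 98.0, 9.1, 96.7, 99.8]

    def mean(values):
        return sum(values) / len(values)

    print(mean(rates_0212))                           # ~87.5 ("Mean" row)
    print(mean([r for r in rates_0212 if r > 10.0]))  # ~98.7 (">10%" row)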
Findings LIN.1: General development of zoo virus detection rates:
------------------------------------------------------------------
Findings LIN.01: Concerning zoo virus detection, LINUX products are
     less developed than products for other platforms.
     Detection rates for file viruses are significantly
                        reduced      (in zoo: 84.7%),
     for macro viruses  still high   (in zoo: 99.6%),
     for script viruses improved     (in zoo: 84.5%).
     ---------------------------------------------------
     NO product detects ALL zoo file viruses,
     3 products detect almost 100% (grade: excellent):
          AVP,FSE,SCN (100~)
     2 products detect >99% (grade: excellent):
          FPR (99.6%), CMD (99.1%)
     2 more products detect >95% (grade: very good):
          SWP (98.2%), INO (95.4%)
     ---------------------------------------------------
     1 product detects ALL zoo macro viruses (perfect):
          FSE (100.0%)
     2 products detect almost all zoo macro viruses
          (100~, grade: excellent): AVP,SCN
     5 products detect >99% (grade: excellent):
          FPR,CMD (99.9%), SWP (99.8%), DRW (99.4%), INO (99.3%)
     1 product detects >95% (grade: very good):
          ANT (97.9%)
     ---------------------------------------------------
     NO product detects ALL zoo script viruses,
     5 products detect >99% (grade: excellent):
          FSE (99.9%), SCN (99.8%), AVP (99.7%), CMD,FPR (99.4%)
     3 products detect >95% (grade: very good):
          SWP (97.2%), INO (96.6%), DRW (95.4%)
     ---------------------------------------------------
     As "worst products in test" qualify (for the 2nd time):
          CLA: file virus detection rate:   34.8%
               macro virus detection rate:   0.5%
               script virus detection rate: 27.1%
          OAV: file virus detection rate:   34.1%
               macro virus detection rate:   0.1%
               script virus detection rate: 27.1%

Findings LIN.2: Development of ITW virus detection rates:
---------------------------------------------------------
Findings LIN.02: ONE AV product detects "perfectly" all ITW file,
     macro and script viruses in all files (last test: 3): SCN
     ***************************************************
     Concerning detection of ITW file viruses:
        1 "perfect" scanner:    SCN
        7 "excellent" scanners: CMD,AVP,DRW,FPR,FSE,INO,SWP
     Concerning detection of ITW macro viruses:
        5 "perfect" scanners:   ANT,AVP,FSE,SCN,SWP
        3 "excellent" scanners: CMD,FPR,INO
     Concerning detection of ITW script viruses:
        2 "perfect" scanners:   AVP,SCN
        3 "excellent" scanners: CMD,FPR,FSE

Findings LIN.3: Assessment of overall (ITW/zoo) detection rates:
----------------------------------------------------------------
Findings LIN.03: 3 LINUX products are overall rated "excellent":
          SCN,AVP,FSE
     -------------------------------------------------------
     2 overall "useless" LINUX scanners: CLA,OAV

Findings LIN.4: Performance of LINUX scanners by virus classes:
---------------------------------------------------------------
Findings LIN.04: Performance of LINUX scanners by virus classes:
     ----------------------------------------------------
     Perfect   scanners for file zoo:   ---
     Excellent scanners for file zoo:   AVP,FSE,SCN,FPR,CMD
     Very Good scanners for file zoo:   SWP,INO
     Perfect   scanners for macro zoo:  FSE
     Excellent scanners for macro zoo:  AVP,SCN,CMD,FPR,SWP,DRW,INO
     Very Good scanners for macro zoo:  ANT
     Perfect   scanners for script zoo: ---
     Excellent scanners for script zoo: FSE,SCN,AVP,CMD,FPR
     Very Good scanners for script zoo: SWP,INO,DRW
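The verbal grades used in findings LIN.1 - LIN.4 follow fixed
thresholds: 100% is "perfect", above 99% (including the "almost 100%"
results marked 100~) is "excellent", and above 95% is "very good". A
minimal Python sketch of such a grading function follows; the function
name and the handling of rates below the "very good" threshold are our
own assumptions, only the thresholds are taken from the findings.

    def zoo_virus_grade(rate_percent):
        """Map a zoo virus detection rate (in %) to the verbal grade."""
        if rate_percent >= 100.0:
            return "perfect"
        if rate_percent > 99.0:     # includes "almost 100%" (100~) results
            return "excellent"
        if rate_percent > 95.0:
            return "very good"
        return "not graded"         # below the "very good" threshold

    for rate in (100.0, 99.95, 99.6, 98.2, 84.7):
        print(rate, "->", zoo_virus_grade(rate))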
Findings LIN.5: Detection of packed viral (ITW) objects:
--------------------------------------------------------
Findings LIN.05: Detection of packed viral objects needs improvement:
     Perfect   packed ITW file/macro virus LINUX detector: ---
     Excellent packed ITW file/macro virus LINUX detector: SCN
     -----------------------------------------------------------------
     Concerning detection of packed FILE viruses:
        NO product is "perfect":   ---
        1 product is "excellent":  SCN
     *******************************************************
     Concerning detection of packed MACRO viruses:
        2 products are "perfect":   AVP,FSE
        3 products are "excellent": DRW,SCN,SWP
     *******************************************************
     Concerning EQUAL detection of UNPACKED AND PACKED ITW file and
     macro viruses:
        1 "perfect" product has NO LOSS for 6 packers:     AVP
        2 "excellent" products have NO LOSS for 5 packers: SCN,SWP

Findings LIN.6: Avoidance of False Alarms:
------------------------------------------
Findings LIN.06: Avoidance of False-Positive alarms is insufficient
     and needs improvement.
     FP-avoiding perfect LINUX scanners: ANT,INO,SCN,SWP
     ***************************************************
     Concerning file-FP avoidance, these 8 products are "perfect":
          ANT,AVP,DRW,FPR,FSE,INO,SCN,SWP
     The following product is "excellent": CMD
     ***************************************************
     Concerning macro-FP avoidance, 4 products are "perfect":
          ANT,INO,SCN,SWP
     4 products are "excellent": AVP,CMD,FPR,FSE

Findings LIN.7: Detection rates for file/macro/script malware:
--------------------------------------------------------------
Findings LIN.07: NO LINUX product can be rated "perfect" in detecting
     ALL file, macro & script malware specimens:
     NO product  is rated "Overall Perfect":   ---
     3 products are rated "Overall Excellent": FSE,AVP,SCN
     2 products are rated "Overall Very Good": FPR,CMD
     ******************************************************
     Concerning single classes of malware:
     A) "perfect"   file malware detectors:   ---
        "excellent" file malware detectors:   FSE,AVP,SCN,FPR,CMD,SWP
        "very good" file malware detectors:   ---
     B) "perfect"   macro malware detectors:  AVP,FPR,FSE
        "excellent" macro malware detectors:  SCN,CMD,DRW,SWP
        "very good" macro malware detectors:  INO,ANT
     C) "perfect"   script malware detectors: ---
        "excellent" script malware detectors: FSE,AVP,SCN
        "very good" script malware detectors: FPR,CMD

Grading LINUX products according to their detection performance:
=================================================================

Under the scope of VTC's grading system (see 4), we summarize our
results for LINUX-related scanners:
*******************************************************************
  In VTC test "2004-07", we found  *** NO perfect LINUX AV product ***
                and we found       *** NO perfect LINUX AM product ***
*******************************************************************

But several products seem to approach our definition on a rather high
level (taking into account the highest value of "perfect" defined on
the 100% level, and "excellent" defined by 99% for virus detection
and 90% for malware detection):

Test category:              "Perfect"            "Excellent"
------------------------------------------------------------------
LINUX file ITW test:        SCN                  CMD,AVP,DRW,FPR,
                                                 FSE,INO,SWP
LINUX macro ITW test:       ANT,AVP,FSE,SCN,SWP  CMD,FPR,INO
LINUX script ITW test:      AVP,SCN              CMD,FPR,FSE
------------------------------------------------------------------
LINUX file zoo test:        ---                  AVP,FSE,SCN,FPR,CMD
LINUX macro zoo test:       FSE                  AVP,SCN,CMD,FPR,
                                                 SWP,DRW,INO
LINUX script zoo test:      ---                  FSE,SCN,AVP,CMD,FPR
------------------------------------------------------------------
LINUX file pack test:       ---                  SCN
LINUX macro pack test:      AVP,FSE              DRW,SCN,SWP
+ LINUX pack/unpack test:   AVP                  SCN,SWP
------------------------------------------------------------------
LINUX file FP avoidance:    ANT,AVP,DRW,FPR,     CMD
                            FSE,INO,SCN,SWP
LINUX macro FP avoidance:   ANT,INO,SCN,SWP      AVP,CMD,FPR,FSE
------------------------------------------------------------------
LINUX file malware test:    ---                  FSE,AVP,SCN,FPR,
                                                 CMD,SWP
LINUX macro malware test:   AVP,FPR,FSE          SCN,CMD,DRW,SWP
LINUX script malware test:  ---                  FSE,AVP,SCN
------------------------------------------------------------------
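The point totals used in the ranking that follows are obtained from
the grid above by a simple counting rule: every test category
contributes 2 points for a "perfect" entry and 1 point for an
"excellent" entry (hence the maximum of 22 points for the 11 AV
categories, and 28 points when the 3 malware categories are added).
A minimal Python sketch of this counting, using only a two-category
excerpt of the grid, so the printed totals are illustrative and NOT
the published ones:

    from collections import defaultdict

    grid = {
        # test category: ("perfect" products, "excellent" products)
        "LINUX macro zoo test":
            (["FSE"], ["AVP", "SCN", "CMD", "FPR", "SWP", "DRW", "INO"]),
        "LINUX file ITW test":
            (["SCN"], ["CMD", "AVP", "DRW", "FPR", "FSE", "INO", "SWP"]),
    }

    points = defaultdict(int)
    for perfect, excellent in grid.values():
        for prod in perfect:
            points[prod] += 2      # "perfect" counts twice
        for prod in excellent:
            points[prod] += 1      # "excellent" counts once

    for prod, pts in sorted(points.items(), key=lambda kv: -kv[1]):
        print(prod, pts, "points")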
In order to support the race for more customer protection, we
evaluate the order of performance in this LINUX test with a simple
algorithm, by counting the majority of places (weighing "perfect"
twice and "excellent" once), for the first places:

************************************************************
"Perfect" LINUX AntiVirus product:    ===NONE===  (22 points)
"Excellent" LINUX AV products:
   1st place:  SCN      (16 points)
   2nd place:  AVP      (15 points)
   3rd place:  FSE      (13 points)
   4th place:  SWP      (10 points)
   5th place:  FPR      ( 9 points)
   6th place:  CMD      ( 8 points)
   7th place:  INO      ( 7 points)
   8th place:  ANT      ( 6 points)
   9th place:  DRW      ( 5 points)
   LAST PLACE: CLA,OAV  ( 0 points)
************************************************************
"Perfect" LINUX AntiMalware product:  ===NONE===  (28 points)
"Excellent" LINUX AM products:
   1st place:  AVP,SCN  (19 points)
   3rd place:  FSE      (17 points)
   4th place:  FPR,SWP  (12 points)
   6th place:  CMD      (10 points)
   7th place:  DRW      ( 6 points)
************************************************************

*********************************************************************
In addition, we regret to have to grade CLA and OAV into the class of
"useless" products!
=====================================================================

9. Comparison of on-demand detection under ALL system platforms:
=================================================================

Background of this test:
========================

While W32 products may be designed, implemented and maintained in
forms reusable for all different "variations" of W32 platforms (such
as W-NT, W-2000 and W-XP), such "harmonical" behaviour across
platforms as diverse as W32 and LINUX is, while not theoretically
impossible (if based on a suitable meta-platform), unlikely with
contemporary products. Consequently, design and implementation of
engines for such diverse platforms will be performed by different
teams and hence differ in quality of implementation and maintenance.
If properly planned and done, at least databases (signatures etc.)
may be shared between different platforms.

On the other side, it will be an indication of good planning, work
and quality assurance at an AV/AM producer's site if its products,
acting under ALL different platforms, produce similar if not
identical detection results. This leads us to the following

Test Hypothesis: "Trans-platform harmonical" behaviour:
=======================================================

Equal detection on ALL platforms is regarded as an indication of
excellent design, implementation and maintenance of engines and
malware databases. We call a product behaving according to this
hypothesis "trans-platform harmonical".

Eval Trans-Platform Harmonicity: Equality of results for ALL platforms:
=======================================================================

A detailed analysis produces an interesting result: trans-platform
harmonicity holds for several products for ITW (both virus and
object) detection under ALL platforms, and it also holds for several
products for file virus detection under ALL platforms. BUT trans-
platform harmonicity does NOT hold for any product for file, macro
and script ZOO virus and malware detection. In most cases, detection
rates of LINUX products are lower than those of W32 products, but in
some cases detection rates are significantly higher.
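The comparison summarised in the tables below can be read as follows:
for every category, a product counts as "trans-platform harmonical"
if it reports identical results on all platforms in the test (W2k,
WXP and LINUX); the tables then give, per category, how many of the 9
products achieve this. A minimal Python sketch with purely
hypothetical product data and names:

    PLATFORMS = ("W2k", "WXP", "LINUX")

    def is_harmonical(per_platform, category):
        """True if the product reports the same rate on every platform."""
        return len({per_platform[p][category] for p in PLATFORMS}) == 1

    def count_harmonical(all_results, category):
        """Number of products that are trans-platform harmonical."""
        return sum(1 for r in all_results.values()
                   if is_harmonical(r, category))

    example = {
        "PR1": {"W2k": {"zoo_macro": 99.9}, "WXP": {"zoo_macro": 99.9},
                "LINUX": {"zoo_macro": 99.9}},
        "PR2": {"W2k": {"zoo_macro": 99.9}, "WXP": {"zoo_macro": 99.9},
                "LINUX": {"zoo_macro": 95.4}},
    }
    print(count_harmonical(example, "zoo_macro"))   # -> 1 (of 2)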
    Equal detection                  trans-platform harmonical
    -----------------------------+------------------------------
    of zoo file viruses:                   5 (of 9)
    of zoo infected files:                 4 (of 9)
    of ITW file viruses:                   8 (of 9)
    of ITW infected files:                 8 (of 9)
    of zoo file malware:                   0 (of 9)

    Equal detection                  trans-platform harmonical
    -----------------------------+------------------------------
    of zoo macro viruses:                  7 (of 9)
    of zoo infected macro objects:         7 (of 9)
    of ITW macro viruses:                  8 (of 9)
    of ITW infected macro files:           8 (of 9)
    of zoo macro malware:                  6 (of 9)

    Equal detection                  trans-platform harmonical
    -----------------------------+------------------------------
    of zoo script viruses:                 2 (of 9)
    of zoo script viral objects:           2 (of 9)
    of ITW script viruses:                 9 (of 9)
    of ITW script viral objects:           9 (of 9)
    of ITW script malware:                 2 (of 9)

Findings Trans-Platform Harmonicity:
---------------------------------------------------------------
Concerning detection of FILE viruses: several products (5 of 9)
  behave "trans-platform harmonically":
      ANT,AVP,FSE,SCN,SWP
Concerning file malware detection, NO product behaves
  "trans-platform harmonically":  ---
---------------------------------------------------------------
Concerning detection of MACRO viruses: several products (7 of 9)
  behave "trans-platform harmonically":
      ANT,AVP,CMD,DRW,FPR,FSE,SCN
Concerning macro malware detection, 6 (of 9) products behave
  "trans-platform harmonically":
      AVP,DRW,FPR,FSE,SCN,SWP
---------------------------------------------------------------
Concerning detection of SCRIPT viruses: only ONE product (1 of 9)
  behaves "trans-platform harmonically" in all categories:
      AVP
Concerning script malware detection: few products (2 of 9) behave
  "trans-platform harmonically" in all categories:
      AVP,SCN
---------------------------------------------------------------

********************************************************
Conclusion concerning trans-platform harmonicity:
-------------------------------------------------
Much work and esp. much more Quality Assurance must be invested into
AV products (suites) to achieve comparable (if not equal) detection
rates on all different platforms, aka "trans-platform harmonicity".
At the present stage, users are ill-advised to move from one platform
to a different platform, as AV/AM products will behave very
differently.
*********************************************************

Grading ALL-Harmonicity: Grading of trans-platform harmonical products:
=======================================================================

The following grid is used to grade products concerning their ability
for IDENTICAL detection in ALL categories on ALL platforms:

A "perfect trans-platform harmonical" AV product will yield IDENTICAL
results for all categories (file, macro and script viruses).
(Assigned value: 5).

A "perfect trans-platform harmonical" AM product will be a perfect AV
product and yield IDENTICAL results for all categories (file, macro
and script malware). (Assigned value: 2).

Grading trans-platform harmonical AntiVirus products:
===========================================================
Grade: "Perfect" trans-platform harmonical detection:
       = NO PRODUCT =
===========================================================

Grading trans-platform harmonical AntiMalware products:
===========================================================
Grade: "Perfect" trans-platform harmonical detection:
       = NO PRODUCT =
===========================================================
11. Final remark: In search of the "Perfect AV/AM product":
===========================================================

This test includes 3 platforms: Windows-2000 and Windows-XP (with
largely comparable 32-bit engines) and LINUX (whose engines are
hardly comparable to the W32 ones). Moreover, several manufacturers
submitted products only for specific platforms.

**********************************************************
In general, there is NO AntiVirus and NO AntiMalware product which
can be rated "PERFECT" for ALL categories (esp. file, macro AND
script).
-----------------------------------------------------------
But for SINGLE categories (file, macro OR script viruses), there are
SEVERAL products which can be rated "perfect" or "excellent".
************************************************************

Instead of calculating an overall value (e.g. the sum of points
divided by the number of products in test for a given platform), the
following TABLES list product suites by their places, sorted by
assigned points (maximum number: 22 points).

Table SUM-AV: Survey of Results for AntiVirus Products:
-------------------------------------------------------
===================== AntiVirus Products ====================
         Windows-2000 (25)  Windows-XP (25)    LINUX(SUSE) (11)
-------------------------------------------------------------
Place 1: SCN         (17)   SCN         (17)   SCN      (16)
      2: FSE         (16)   FSE         (16)   AVP      (15)
      3: AVP         (15)   AVP         (15)   FSE      (13)
  More:  AVK,PAV     (13)   AVK,PAV     (13)   SWP      (10)
         NAV,RAV     (11)   NAV,RAV     (11)   FPR      ( 9)
         FPR         ( 9)   FPR         ( 9)   CMD      ( 8)
         BDF,INO,SWP ( 8)   BDF,INO,SWP ( 8)   INO      ( 7)
         AVA,DRW     ( 7)   AVA,DRW     ( 7)   ANT      ( 6)
         ANT         ( 6)   ANT         ( 6)   DRW      ( 5)
         AVG,CMD,PRO ( 5)   AVG,CMD,PRO ( 5)
         GLA,VSP     ( 4)   GLA,VSP     ( 4)
         IKA,NVC,QHL ( 3)   IKA,NVC,QHL ( 3)
         FIR,PER,VBR ( 2)   FIR,PER,VBR ( 2)
-------------------------------------------------------------
Useless LINUX AV products:                     CLA,OAV  ( 0)
-------------------------------------------------------------
Remark: Numbers for platforms indicate the number of products in
        test. Numbers for products indicate points assigned for that
        platform. "=" indicates that a product's place equals that of
        the previous product.

Table SUM-AM: Survey of Results for AntiMalware Products:
---------------------------------------------------------
=================== AntiMalware Products ====================
         Windows-2000 (25)  Windows-XP (25)    LINUX(SUSE) (11)
-------------------------------------------------------------
Place 1: SCN,FSE     (20)   SCN,FSE     (20)   AVP,SCN  (19)
      3: AVP         (19)   AVP         (19)   FSE      (17)
         AVK,PAV     (17)   AVK,PAV     (17)   FPR,SWP  (12)
         NAV,RAV     (13)   RAV         (14)   CMD      (10)
         FPR         (10)   NAV         (13)   DRW      ( 6)
         BDF,INO,SWP ( 9)   FPR         (10)
         AVA,DRW     ( 8)   BDF,INO,SWP ( 9)
         CMD         ( 7)   CMD         ( 7)
         IKA,NVC     ( 4)   IKA,NVC     ( 4)
         VBR         ( 3)   VBR         ( 3)
-------------------------------------------------------------
Useless LINUX AM products:                     CLA,OAV  ( 0)
-------------------------------------------------------------
Remark: see remark for table SUM-AV
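The "places" in the tables above follow directly from the point
totals: products with equal points share a place, and subsequent
place numbers are skipped accordingly (which is why, e.g., a tie at
place 4 is followed by place 6). A minimal Python sketch of this
place assignment, using a hypothetical excerpt of point values:

    points = {"SCN": 17, "FSE": 16, "AVP": 15,
              "AVK": 13, "PAV": 13, "NAV": 11}

    ranked = sorted(points.items(), key=lambda kv: -kv[1])
    place, prev_pts = 0, None
    for idx, (prod, pts) in enumerate(ranked, start=1):
        if pts != prev_pts:          # a new point value starts a new place
            place, prev_pts = idx, pts
        print(place, prod, pts)      # e.g. 4 AVK 13 and 4 PAV 13, then 6 NAV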
Generally, we hope that these rather detailed results help AV
producers to adapt their products to growing threats and thus to
protect their customers.

14. Availability of full test results:
======================================

Much more information about this test, its methods and viral
databases, as well as detailed test results, is available for
anonymous FTP download from VTC's homepage (VTC is part of Working
Group AGN):

     ftp://agn-www.informatik.uni-hamburg.de/vtc

Any comment and critical remark which helps VTC to learn and to
improve our test methods will be warmly welcomed.

The next comparative test will evaluate file, macro (VBA/VBA5) and
script virus and malware detection. This test (started in July 2004,
with testbeds frozen on April 30, 2004 and products submitted by
mid-June 2004) will be published about February 2005 (results for
macro and script viruses/malware) and in summer 2005 (file
viruses/malware).

15. Copyright, License, and Disclaimer:
=======================================

This publication is (C) Copyright 2004 by Klaus Brunnstein and the
Virus Test Center (VTC) at the University of Hamburg, Germany.

Permission (Copy-Left) is granted to everybody to distribute copies
of this information in electronic form, provided that this is done
for free, that the contents of the information are not changed in any
way, and that the origin of this information is explicitly mentioned.
It is esp. permitted to store and distribute this set of text files
at university or other public mirror sites where security/safety
related information is stored for unrestricted public access for
free.

Any other use, esp. including distribution of these text files on
CD-ROMs or any publication as a whole or in parts, is ONLY permitted
after contact with the supervisor, Prof. Dr. Klaus Brunnstein, or
authorized members of the Virus Test Center at Hamburg University,
and this agreement must be made explicitly in writing prior to any
publication.

No responsibility is assumed by the author(s) for any injury and/or
damage to persons or property as a matter of products liability,
negligence or otherwise, or from any use or operation of any methods,
products, instructions or ideas contained in the material herein.

Prof. Dr. Klaus Brunnstein
Faculty for Informatics
University of Hamburg, Germany
(December 31, 2004)