Benchmark Study of Desktop Search Tools
There’s More to Search than Google & Yahoo!
An Evaluation of 12 Leading Desktop Search Tools
Tom Noda
Shawn Helwig
www.uwebc.org/decisiontools
Decision Tools | Desktop Search
Executive Summary
A new generation of desktop search tools is emerging that allows users to quickly find relevant documents in computers
across the enterprise the same way search engines help locate information on the Internet. Companies expect that this
technology will boost employee productivity and creativity and allow them to compete successfully in today's knowledge-driven economy.
Desktop search technology itself is nothing new; in fact, it has been around for years. However, some well-known names (e.g., Google and Yahoo!) have recently entered the space, giving this technology a well-deserved boost in visibility. To help understand the differences between the latest desktop search tools on the market, the UW E-Business Consortium recently conducted a benchmark study of 12 popular desktop search tools. The benchmark criteria used for the evaluation were usability, versatility, accuracy, efficiency, security, and enterprise readiness.
When all the results were reviewed, it was determined that most of the desktop search tools were still too immature for significant business use, due primarily to a lack of mature security and overall manageability. However, considering the evolution of Instant Messaging from a pure consumer tool into a valuable enterprise application, desktop search may have similar potential.
Key Findings
[Sidebar: TOP 3 Desktop Search, rated on Usability, Versatility, and Enterprise Readiness]
Based on our evaluation, the best overall desktop search tool is Copernic 1.5 Beta with Coveo. Yahoo! Desktop Search 1.1 Beta was rated the second best tool in our evaluation. See other notes.
Overall Ratings
These are the overall benchmark evaluation ratings. Some tools are very good in specific areas such as usability, versatility, or search accuracy (explained later), but to be the best desktop search tool, a balance of all criteria is critical.
[Table: Desktop Search Tool | Version | Score (Min = 1.00, Max = 5.00)]
Benchmark Criteria
Our benchmark evaluation was performed across six main criteria. Each criterion was quantified and given a rating ranging from 1 (worst) to 5 (best). The rating is based on sub-criteria that align with the main criterion's objective. For the sub-criteria and their rating details, please refer to Appendix A - Comparison Table.
1. Usability
Good desktop search tools must be easy to use, have a lower learning curve, have professional aesthetics, and require fewer steps to reach desired output.
2. Versatility
Versatility describes how wide and deep the tool allows you to search. This includes factors such as supported document types, web/e-mail integration, and multi-language support.
3. Accuracy
"Can you find what you are looking for?" This criterion addresses accuracy of search results as well as other factors that help users find the desired information.
4. Efficiency
This criterion assesses the tool's technical efficiency, including memory usage, indexing time, and indexed file sizes. The best tool should not jeopardize overall PC performance.
Criteria Ratings
The following charts summarize the best tools’ ratings for each criterion. Blinkx and ISYS are versatile tools but struggle to
deliver their powerful features in a user-friendly fashion. On the other hand, Ask Jeeves excels in usability, efficiency and
security, but lacks versatility. Copernic is excellent in almost all criteria.
1. Usability: Copernic 4.80
2. Versatility: Copernic 4.14
3. Accuracy: Copernic 4.50
4. Efficiency: Archivarius 4.40
Product Reviews
This section examines the details of each desktop search tool individually. The benchmark performance of each tool is expressed with a spider chart (see description) to convey its performance in each of the six criteria as well as its overall balance.
[Sample spider chart: Accuracy 3.00, Security 3.50, Efficiency 2.00]
Achieving the maximum scores in all criteria and maintaining a good hexagon shape is ideal, but that is not required by all users. For instance, if Enterprise Readiness is not critical for a specific user, an unbalanced shape that lacks Enterprise Readiness features may still be a solid fit.
Appendix B
Test Environment
We performed benchmark evaluations for all the desktop search tools on the same machine. To prevent any index conflicts, we installed and uninstalled one tool at a time. The details of the computer environment are shown below:
1. Usability
1.1 Application Types: Is the tool standalone, browser-based, a toolbar, or a deskbar? (not rated)
1.2 Features: How many useful features, preferences, and options are available?
1.3 Simplicity: How does the tool balance the tradeoff between more features and a simpler application design?
1.4 Navigation: How simple and easy is it to execute a search and navigate the results? How many steps does it take from entering search keywords to reaching the target file?
1.5 Aesthetics: How well are the user interface components and functions refined and organized? Does it look professional? How about commands, forms, icons, and images?
1.6 Others in Usability: Other remarkable usability features, if any. This score is only applied if it raises the average.
2. Versatility
2.1 Supported PC Environment: Which operating systems does the tool support? Windows, Mac OS, Linux?
2.2 Supported Files: Which file formats are supported? Office, PDF, IM files, Zip, RSS, and folder names?
2.3 Supported Media Files: Which image/audio/video files are supported?
2.4 Supported Applications: Which applications are supported? This is related to the criterion above, but what about IE or Firefox in terms of web history searches? What about e-mail clients? Does it support Outlook, Outlook Express, Thunderbird, Lotus Notes, or Eudora? What about IM?
2.5 Multi-language Support: Does the tool support multi-language searches? Can it search Asian text? Does it support Unicode or other specific encoding types?
2.6 Web Search Integration: How seamlessly does the tool integrate local machine search, web history, and web site search into one platform?
2.7 E-mail Integration: How deeply does the tool search the e-mail client? Does it search just e-mail messages, or does it also search attachments, address books, schedules, and tasks? Does it require the e-mail client to be running while indexing?
2.8 Others in Versatility: Other remarkable versatility features, if any. This score is only applied if it raises the average.
3. Accuracy
3.1 Word Accuracy: How exactly does the tool recognize keywords? If a user types "apples," does it also look for the word "apple"? What about "criterion/criteria" or "it/IT"? Does it support synonyms or a thesaurus?
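The word-matching behavior asked about above can be sketched in a few lines of Python. This is an illustrative toy only, not any reviewed tool's implementation; real products typically use full stemming algorithms rather than the simple plural check shown here.

```python
def matches(query, word):
    """Match when the words are equal or differ only by a simple plural suffix."""
    q, w = query.lower(), word.lower()
    if q == w:
        return True
    # Either word may be a simple "-s"/"-es" plural of the other.
    for a, b in ((q, w), (w, q)):
        if a in (b + "s", b + "es"):
            return True
    return False

print(matches("apples", "apple"))        # True
print(matches("boxes", "box"))           # True ("boxes" = "box" + "es")
print(matches("criteria", "criterion"))  # False: irregular forms need a thesaurus
```

As the last line shows, suffix rules alone cannot handle irregular pairs like "criterion/criteria", which is why the criterion also asks about synonym and thesaurus support.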
3.2 Additional Word Support: Does the tool have a spell checker? What happens if a user misspells "Massatusets"? How does the tool handle an ambiguous person's name? Does it support wildcards (the * character)? What about double quotations or Boolean keywords?
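The wildcard and Boolean query semantics referred to above can be illustrated with Python's standard fnmatch module. This is only a sketch of the query behavior being tested, not how any of the reviewed tools implement it.

```python
from fnmatch import fnmatch

words = ["report", "reporting", "repeat", "budget"]

# Wildcard: "rep*" should match any word beginning with "rep".
wildcard_hits = [w for w in words if fnmatch(w, "rep*")]
print(wildcard_hits)  # ['report', 'reporting', 'repeat']

# Boolean AND/OR over the set of words in one document.
doc = {"quarterly", "budget", "report"}
print("budget" in doc and "report" in doc)   # True  (budget AND report)
print("invoice" in doc or "report" in doc)   # True  (invoice OR report)
```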
3.3 Index Accuracy: What happens if users move or delete indexed files and then try to search for them? What about new or modified files? Does it support dynamic indexing, or does it require reindexing? What about received/sent e-mail?
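The distinction between dynamic indexing and full reindexing can be sketched by comparing a stored snapshot of file modification times against the current state of a folder. This is a simplified illustration, not a vendor implementation; real tools typically hook filesystem change notifications rather than rescanning.

```python
import os

def snapshot(root):
    """Map each file path under root to its last-modified time."""
    index = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            index[path] = os.path.getmtime(path)
    return index

def diff(old, new):
    """Classify changes so only affected files need reindexing."""
    added = [p for p in new if p not in old]
    removed = [p for p in old if p not in new]
    modified = [p for p in new if p in old and new[p] != old[p]]
    return added, removed, modified
```

A tool that reprocesses only the added, removed, and modified lists can keep its index current without a full rescan, which is the behavior this sub-criterion probes.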
3.4 Output Format: How accurate and user-friendly is the output? Does it pinpoint exact word locations in files or just display the file name? How easy is it for users to find documents among hundreds of results?
3.5 Filter & Sort: Can users easily filter or sort search output? What kinds of filtering/sorting options are available? How easy are they to use?
3.6 Others in Accuracy: Other remarkable accuracy features, if any. This score is only applied if it raises the average.
4. Efficiency
4.1 Download/Installed File Size: How large are the downloaded and installed files? Are they small or large, considering the tool's features and capabilities?
4.2 Indexed File Size: How large are the indexed files? Are they small or large, considering the supported file types?
4.3 Initial Index Time: How long does the tool initially take to index files and e-mail? Considering its indexed file size and supported file types, is it fast or slow?
4.4 Index Controls: How can users control indexing performance and frequency? Can users control how many hardware resources the tool uses? Can indexing be scheduled? Is there automatic indexing during idle time?
4.5 Memory & CPU Usage: How much memory does the tool require while idle and while indexing? How much CPU power does the tool require during use and while indexing?
4.6 Others in Efficiency: Other remarkable efficiency features, if any. This score is only applied if it raises the average.
5. Security
5.1 HTTPS Cache Indexing: Can users search SSL web histories? Do users have an option to prevent those pages from being indexed?
5.2 Personal Folder Search: Does the tool allow users to search someone else's personal folders? Or does it restrict indexable folders, primarily for privacy/security reasons?
5.3 Possible Intrusion: Is there any possible intrusion or security breach?
5.4 Protection Features: Can users protect certain folders or documents from desktop search? How about password-protected documents? Does the tool index them or ask users for a decision?
5.5 Privacy: How does the vendor address privacy and security issues? Are they clearly stated on the web site or during installation?
5.6 Spyware & Adware: Does the tool secretly install spyware or adware? Is there any unusual network activity occurring when the application is running?
5.7 Product Update: Does the tool have an auto-update feature so that users can apply updates as quickly and easily as possible? Or does it require an uninstall/reinstall? How easy is it to uninstall and reinstall the new version (keeping indexed files)?
5.8 Others in Security: Other remarkable security features, if any. This score is only applied if it raises the average.
6. Enterprise Readiness
6.1 Enterprise Products: Does the vendor provide enterprise desktop search solutions?