Browser comparison with end user benchmarks!

Ever since Internet Explorer 8 was released, I have been excited about the situation in the browser market. The war is on! But how close is it? Since I use only IE8 on my Windows 7 machine, and also have it on Windows XP, I have got used to it and felt it gives decent performance. Whenever I search for browser comparisons, all I find are browsers compared with the SunSpider test, the Acid test and so on. I am not saying that is the wrong way to go, but I wanted a browser test done the way we actually use browsers. Another reason was Microsoft's statement that, though IE8 has a slower JavaScript engine, it outperforms its peers in practical usage. So I made a few test cases based on everyday usage patterns.

The subjects
I took six major web browsers for this test: IE8, Safari, Firefox 3.0.7, Firefox 3.1 Beta, Google Chrome and Opera. The results were quite surprising, at least for me. More on that ahead.
The tests were done on a modest AMD Sempron 2800+ PC with 1GB of RAM, running Windows XP Service Pack 3 over a 512Kbps internet connection. Each browser was tested as a fresh installation with the other browsers uninstalled, except for the Internet Explorer 6 that comes with XP, which was never uninstalled. Five trials were conducted for each test and the average was taken as the result.
Tests
Installation time: This is the total installation time from the moment you click the installation executable, including the time taken for restarts and other system operations, if any. In this test Internet Explorer was clearly bound to lose; as we all know, it needs at least one restart. So the contest came down to the remaining five. Surprisingly, Chrome took a relatively long time to install even without a restart, and Firefox 3.1 Beta was lightning fast. You can see the exact results in the table.
Loading time: The amount of time the browser takes to load itself (with no homepage) from the moment its icon is clicked. Firefox has a bad reputation for being very slow to start up, but Firefox 3.1 Beta proved that wrong by being the fastest. Opera loaded a fraction of a second slower. IE8 loaded almost 10 times slower than the latest Firefox.
Browsing test: I tested the browsers on a few websites which I browse the most, many of which have a considerable amount of AJAX built in, so this could also be looked upon as a JavaScript efficiency test. The websites I used were:

  • http://www.news.com/
  • http://www.yahoo.com/
  • http://www.igoogle.com/
  • http://www.orkut.com/
  • http://www.gmail.com/
  • http://www.yahoomail.com/
  • http://www.espnstar.com/
  • http://www.cricinfo.com/
The winner varied from site to site. Though IE8 was the slowest for the most part, Chrome also shared that place at times. Firefox 3.1 Beta beat everyone by a huge margin, the same way IE lost by huge margins. The other browsers exchanged places fairly evenly among themselves. But the most unbelievable result was that of Yahoo Mail: Firefox 3.1 Beta took just 3.5 seconds to load the mail page, while Chrome, IE and Opera took around 25 seconds, Safari was ready in 17 seconds and Firefox 3.0.7 loaded in 7 seconds.
Memory usage test: This is something the average user might not be very interested in knowing. Each browser handles tabs differently: Chrome uses an entirely separate process to isolate each tab and contain crashes, while IE keeps all its tabs in a single process. Safari was the lightest here, using just around 25 MB, while Firefox 3.0.7 was the heaviest at almost 120 MB. Both Chrome and Opera used less than 100 MB, while the others used around 110 MB. (A small sketch of how such per-process numbers can be totalled appears after the test descriptions.)
Multiple tab load time: This tests the time taken to load multiple tabs together, for browsers which support restoring tabs on startup. So Chrome, Safari and IE8 do not qualify for this test. Again Firefox 3.1 Beta made it no contest, with Opera and Firefox 3.0.7 clocking almost the same time.
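For browsers like Chrome that split tabs across many processes, a single Task Manager entry does not tell the whole story; the resident memory of every process belonging to the browser has to be added up. Below is a minimal Python sketch of that idea using the psutil library; the process image names are assumptions for illustration, not part of the original test setup.

    import psutil

    # Hypothetical Windows process names for each browser. A multi-process
    # browser such as Chrome spawns several chrome.exe processes (roughly
    # one per tab), so its memory must be summed across all of them.
    BROWSERS = {
        "Chrome": "chrome.exe",
        "Firefox": "firefox.exe",
        "IE8": "iexplore.exe",
        "Opera": "opera.exe",
        "Safari": "safari.exe",
    }

    def total_rss_mb(image_name):
        """Sum resident memory (RSS) of every running process with this name."""
        total = 0
        for proc in psutil.process_iter(["name", "memory_info"]):
            name = (proc.info["name"] or "").lower()
            mem = proc.info["memory_info"]
            if name == image_name and mem is not None:
                total += mem.rss
        return total / (1024 * 1024)

    for label, image in BROWSERS.items():
        print(f"{label}: {total_rss_mb(image):.1f} MB")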
So those were the tests. The winner, without any doubt, is Firefox 3.1 Beta. Though I am not a great Mozilla fan, I have to bow to it. And unfortunately, my favorite IE lost the race by a few yards. The other browsers were neck and neck, with no clear winner.

Other general remarks about each of these browsers:

  • IE8 still has problems with some sites, including GMail.
  • Google Chrome does not even let you decide where to install.
  • Opera had problems with back/forward navigation; in some cases it does not work at all.
  • Safari is generally very fast, but even when you make it the default browser, links open in the previously set default browser.
  • In terms of browsing speed, Firefox 3.1 Beta is a killer; Chrome is the fastest of the rest of the lot.

How to use the table: It shows how slow each browser is compared to the fastest one. For example, in the installation test, IE8 scoring 23.2 against Firefox 3.1 Beta's 1 means IE takes 23.2 times as long as Firefox. In other words, each figure is the browser's slowness relative to the fastest. A small sketch of this calculation follows.
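For concreteness, here is a minimal Python sketch of how those relative figures can be derived: average the five trials for each browser, then divide by the fastest average. The raw timings below are made up for illustration; only the procedure reflects the article.

    # Made-up example timings (seconds), five trials per browser.
    raw_trials = {
        "Firefox 3.1 Beta": [3.4, 3.6, 3.5, 3.5, 3.5],
        "IE 8":             [81.0, 80.5, 82.0, 81.5, 81.0],
    }

    # Average the five trials, then express each browser relative to the fastest.
    averages = {browser: sum(t) / len(t) for browser, t in raw_trials.items()}
    fastest = min(averages.values())

    for browser, avg in sorted(averages.items(), key=lambda kv: kv[1]):
        print(f"{browser}: {avg / fastest:.1f}x the fastest")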
So, that is about it. The tests show that IE8 is on average five times slower than Firefox 3.1 Beta. The ranking goes like this:
    1. Firefox 3.1 Beta
    2. Firefox 3.0.7
    3. Opera 9.64
    4. Chrome 2.0.169.1
    5. Safari 4 Public Beta (528.16)
    6. Internet Explorer 8.

On a closing note, thanks to my sister for helping me carry out these test cases.

    -Codevalley

    Intel and the benchmark scams?

We all believe that now it is Intel's time to beat the hell out of AMD, after the release of the Core microarchitecture. But for a long time AMD has been crying foul, claiming Intel misleads analysts, Wall Street and consumers into believing exactly that. Until recently I never gave it much ear, despite being an AMD fanboy. But now, analysis and post-mortems of the presentations given by Intel Server Platforms Group general manager Kirk Skaugen on February 21st, and again on the 28th, are shocking to say the least.

The way Intel delivered the presentation was very unprofessional and unethical. They tried to assert and highlight their supremacy over AMD in a very cheap manner. Here are a few examples.

  • Intel compared their latest Xeon processors with fairly old Opterons to declare themselves the leader. To make it worse, they even compared an unnamed processor with the Xeon. Some of the benchmarks use dual cores, some single cores, and one a nameless processor, while the Intel counterpart remains the same throughout.
  • There are lots of footnotes and disclaimers on every slide, which are hardly legible, and the scores have been twisted: a slide title names two high-end Intel and AMD processors while the tests are actually run between the best Intel and a weaker AMD.
  • In the first presentation (on the 21st), a footnote says "the latest Opteron XXX has not been used because the benchmarks are not available, and hence we are comparing against an older version". In the next presentation (on the 28th), the footnote is missing, so we naturally assume the latest Opteron is used, but on close examination it is still the older one.
  • A slide is titled as a comparison of Xeon 5xxx with Opteron 2xxx, yet not a single comparison is against a 2xxx, and this fact is safely buried in the cryptic footnotes.
  • They used a six-year-old SPEC benchmark to prove that they were leaders by a wide margin, while no one uses those benchmarks any more and the newer ones are quite common.

This medley of errors, or deliberate makeovers, flooded the whole presentation. I wonder how a high-profile company could even dare to do this. Obviously, every single analyst would have figured these out. It has simply been shameful, to say the least.
Adapted from: blogs.ZDnet.com


Download the Intel presentation and check for yourself.
Read the original article.
Post-mortem report (must see).

    codevalley
