We have already seen a few lazy-load techniques here and here. If a page carries a large amount of HTML and JavaScript, it takes a long time to load.
Here is a trick that allows you to bundle all of your modules into a single resource without having to parse any of the JavaScript. Of course, with this strategy, there is greater latency with the initial download of the single resource (since it has all your JavaScript modules), but once the resource is stored in the browser's application cache, this issue becomes much less of a factor.
To combine all modules into a single resource, write each module into a separate script tag and hide the code inside a comment block (/* */). When the resource first loads, none of the code is parsed since it is commented out. To load a module, find the DOM element for the corresponding script tag, strip out the comment block, and eval() the code. If the web app supports XHTML, this trick is even more elegant, as the modules can be hidden inside a CDATA section instead of a script tag. An added bonus is the ability to lazy load your modules synchronously, since there's no longer a need to fetch the modules asynchronously over the network.
200k of JavaScript held within a block comment adds 240ms during page load, whereas 200k of JavaScript that is parsed during page load adds 2600ms. That's more than a 10x reduction in startup latency from deferring the parsing of 200k of JavaScript that isn't needed at page load! Take a look at the code sample below to see how this is done.
<html>
...
<script id="lazy">
// Make sure you strip out (or replace) comment blocks in your JavaScript first.
/*
JavaScript of lazy module
*/
</script>
<script>
// Remove the outer /* ... */ wrapper, keeping only the code inside it.
function stripOutCommentBlock(code) {
  var start = code.indexOf('/*');
  var end = code.lastIndexOf('*/');
  return code.substring(start + 2, end);
}

function lazyLoad() {
  var lazyElement = document.getElementById('lazy');
  var lazyElementBody = lazyElement.innerHTML;
  var jsCode = stripOutCommentBlock(lazyElementBody);
  eval(jsCode);
}
</script>
<div onclick="lazyLoad()">Lazy Load</div>
</html>
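The strip-and-eval step can be checked on its own with a crude one-file Node sketch (no DOM; the variable names here, like `lazyModule` and `lazyValue`, are illustrative stand-ins, not part of the technique). In the browser, the string would come from the script tag's innerHTML instead:

```javascript
// Remove the outer /* ... */ wrapper, keeping only the code inside it.
function stripOutCommentBlock(code) {
  var start = code.indexOf('/*');
  var end = code.lastIndexOf('*/');
  return code.substring(start + 2, end);
}

// Stands in for the contents of <script id="lazy">.
var lazyModule = '/*\nglobal.lazyValue = 42;\n*/';

// The module costs nothing at load time; it is only parsed when eval() runs.
var t0 = Date.now();
eval(stripOutCommentBlock(lazyModule));
var elapsed = Date.now() - t0;
```

After the eval(), `lazyValue` is defined and `elapsed` is the parse-plus-execute cost of the module, which is the cost the table of numbers above is comparing against up-front parsing.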
Sunday, May 1, 2011
File based caching on high traffic website
If you have a lot of traffic and you're caching a page for an hour, then over the course of that hour you might read the cache file 200 times and write it just once. The time taken to write is minimal, around 10ms, so out of one hour your cache reads see roughly 10ms of 'downtime'.
Which is about 0.00028% downtime... Extremely minimal, and not really worth worrying about.
Locking only costs this fractional amount of time, and it only affects two people who try to write at the same moment.
The correct way to work around it is explained here:-
* 1 page is cached for an hour.
* When that cache expires, 2 people visit the site within 10ms of each other (the only time this will cause an 'issue', because otherwise the first will have written the cache before the second requests the page).
* The first user 'reads' the cache file, sees it's expired, so begins rendering the page normally with PHP/MySQL.
* The second user has the same thing going on... the cache hasn't yet been written by the first user, so the second user also starts rendering.
* The first user finishes building the page, and begins writing the file, locking it.
* The second user also finishes building the page, and attempts to write it.
* Because the first user has locked the file, the second user can’t write it…
* This is a problem for the second user, so you simply build a little fix to make it all work like magic.
If the second user can't write to the cache (because it is locked), simply echo the contents of the rendered page without saving it to the cache.
The first user is dealing with the cache, so just render the output.
With CodeIgniter, this is how it works...
Line 299 of Output.php:-
if ( ! $fp = @fopen($cache_path, 'wb'))
{
    log_message('error', "Unable to write cache file: ".$cache_path);
    return;
}
The cache file is only written if it’s ‘really_writable’ (not locked)
Line 59 of Output.php:
elseif (($fp = @fopen($file, 'ab')) === FALSE)
{
    return FALSE;
}
So... if we can't write it, we just render the second user's request.
Simple, and it works, beautifully... :-)