- Code:
<script type="text/javascript" src="/scripts/jquery-min-1.3.3.js?v=132" local="com.jquery.min.133.js" sig="eaa41fbd734596533e98e557eae39b8b"></script>
What is the problem you are trying to solve?
I have to load jQuery, et al, all the freaking time.
What is the feature you are suggesting to help solve it?
Signature-validated tag support for a persistent cache of commonly used scripts, CSS, etc.
What is the processing model for that feature, including error handling?
I haven't given much thought to attribute names, but essentially, within any tag with a src, the optional local attribute specifies the script using a common naming convention (I used reverse DNS, but obviously this can be improved). The browser checks for the local item in its library; if it doesn't find it, it uses the src and optionally caches the result under the local name as written in the tag. The sig attribute optionally specifies a unique signature (I just used an MD5, but I'm sure there are better options?). This signature MUST be validated if it is specified, whether the resource was loaded from local (which gets priority) or from src. If the sig doesn't match whichever resource is loaded, the script fails to load.
Browsers could decide how the cache is maintained, but one model might be a local repository shipped with the distribution that includes the most common scripts. The repository may contain multiple versions of a script as long as each has a unique name and sig. It might only be updated when the browser updates, as part of the normal upgrade procedure.
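A bundled repository could be as simple as a manifest mapping local names to sigs and payload files. This is a hypothetical sketch of what such a manifest might look like, not a proposed format; the first sig is the one from the example tag above, the second is a placeholder.

```json
{
  "repository-version": "2009-06",
  "scripts": [
    {
      "local": "com.jquery.min.133.js",
      "sig": "eaa41fbd734596533e98e557eae39b8b",
      "file": "payloads/com.jquery.min.133.js"
    },
    {
      "local": "com.jquery.min.132.js",
      "sig": "5d41402abc4b2a76b9719d911017c592",
      "file": "payloads/com.jquery.min.132.js"
    }
  ]
}
```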
Another option is a remote repository that the browser can sync with when it encounters a script it doesn't know about. This could be combined with the previous option.
Still another option (my vote) is to cache every script that has local declared, calculating its sig on the way into the cache. The user has an option to discard scripts after [quit, 30 days, a year, never]. Each script has a last-use timestamp which is reset every time the script is referenced by a tag. Additionally, the cache could optionally record the first-load timestamp and an incremented use counter, allowing the average frequency of a script's use to be calculated. When the delta between now and last-use exceeds the user limit, the script is discarded. This should be pretty safe if a good signing method is used. If the cache fills, it could drop the least frequently used scripts or those with the earliest last-use. Comments?
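The bookkeeping for that retention policy might look something like this. A minimal sketch under assumptions: the class name `ScriptCache`, its method names, and the choice of earliest-last-use eviction (one of the two options mentioned above) are all illustrative.

```javascript
// Hypothetical cache bookkeeping: every entry records first-load time,
// last-use time, and a use counter, as described above.
class ScriptCache {
  constructor(maxEntries) {
    this.maxEntries = maxEntries;
    this.entries = new Map(); // local name -> { body, firstLoad, lastUse, uses }
  }
  put(name, body, now) {
    if (!this.entries.has(name) && this.entries.size >= this.maxEntries) {
      this.evictOldest();
    }
    this.entries.set(name, { body, firstLoad: now, lastUse: now, uses: 1 });
  }
  get(name, now) {
    const entry = this.entries.get(name);
    if (!entry) return null;
    entry.lastUse = now; // reset on every use, as proposed
    entry.uses += 1;     // lets average use frequency be derived later
    return entry.body;
  }
  // Discard scripts whose last-use delta exceeds the user-chosen limit.
  expire(now, maxAge) {
    for (const [name, entry] of this.entries) {
      if (now - entry.lastUse > maxAge) this.entries.delete(name);
    }
  }
  // When the cache fills, drop the entry with the earliest last-use.
  evictOldest() {
    let victim = null;
    for (const [name, entry] of this.entries) {
      if (victim === null || entry.lastUse < this.entries.get(victim).lastUse) {
        victim = name;
      }
    }
    if (victim !== null) this.entries.delete(victim);
  }
}
```

Swapping `evictOldest` for a least-frequently-used policy would just mean comparing `uses / (lastUse - firstLoad)` instead of raw `lastUse`.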
Why do you think browsers would implement this feature?
Everyone wants to be faster these days. This would eliminate CSS and JS load time from the equation for all popular sites, as well as for sites that use the same scripts (there are many). Plus it's a pretty dang simple idea with a blessedly simple implementation.
Why do you think authors would use this feature?
It saves bandwidth, is copy-pastable, and is easier to maintain than any other cache mechanism. It's also a happy thing when your site loads faster and uses less bandwidth and fewer server resources.
What evidence is there that this feature is desperately needed?
The polar ice caps are melting, j/k. No, seriously: what's the point of loading the same file from different servers over and over and over? This way we don't even need to make an HTTP request to check for a 304. If the sig matches, use the local copy.