async on version 2013.01

May 23, 2013 at 7:04 PM
I have an issue with a function not running async after upgrading SPServices to the new version. The issue I found is at line 1220 of SPServices. In the previous version, the async value was set to the async option I provide. In the new version, it defaults to false if a completefunc exists. In my case, I have a completefunc and still need async to be true. Is there something I'm not seeing? When I delete the "thisHasCompletefunc ? false : " from that line, everything works for me, but rather than patching SPServices on my system, I'd rather understand the intent better.
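For reference, the line in question reads as follows in 2013.01 (the same two lines are quoted later in this thread), and deleting the ternary is the local change described above:
  // 2013.01 (forces a synchronous call whenever a completefunc is present):
  async: thisHasCompletefunc ? false : opt.async,
  // local workaround described above (always honor the caller's async option):
  async: opt.async,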
Coordinator
May 24, 2013 at 4:52 AM
Edited May 24, 2013 at 4:52 AM
I'd be surprised if you didn't have sporadic issues with previous versions if you didn't specify async: false and tried to process results in the completefunc. Can you explain in more detail?

M.
May 24, 2013 at 5:24 AM
This is actually one of Peter Allen's tools that he deployed for us, but I'll explain what I understand so far. He is building a combobox of the lists/libraries on a site using GetListCollection. He is relying on async to be able to build the list in a callback function. Because the call is async, the remainder of the calling function reads a URL parameter to identify which list should already be selected before the completefunc actually executes. I hope that makes enough sense. I'd post code, but it's not fully mine to post.
May 24, 2013 at 5:25 AM
The point here is that there are obviously ways to rewrite, but I'm going live in production tomorrow, so I was looking for a quick solution without a lot of rework.
Coordinator
May 24, 2013 at 12:54 PM
I'd say that the quick solution is to stick with v0.7.2 or whatever version you've been using and work it out later.

M.
May 24, 2013 at 3:09 PM
Marc,
I have not had a chance to take a full look at this thread and the new version of SPServices, but it sounds as if there is something new in the latest version that makes it backwards incompatible?

I always thought that async was true by default and that the completefunc parameter had no bearing on the value of this parameter.

If what this thread indicates is true - namely, that if completefunc is set then the call is always async: false - that's not backwards compatible and could cause significant UI/UX issues for those looking to upgrade. I realize there is a way to get around it: use Deferreds, as in the sketch below. But that implies refactoring for those who just want to stay current.
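A minimal sketch of the Deferred-style refactor being referred to, assuming the 2013.01 call returns the underlying jqXHR promise (as discussed later in this thread); the list name is just a placeholder:
$().SPServices({
    operation: "GetListItems",
    listName: "Announcements"          // placeholder list name
}).done(function (xData) {
    // process the results here instead of in a completefunc;
    // the call stays asynchronous and nothing blocks the UI
});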


_____
Paul

Sent from mobile device.
Coordinator
May 24, 2013 at 3:45 PM
Edited Jun 8, 2013 at 6:19 PM
Paul:

Yes, you have a valid point. I want to think about it, but if you have input, I'd love to hear it. I certainly didn't intend to break anything: backward compatibility is one of my mantras, as you know.

M.
Coordinator
May 24, 2013 at 3:46 PM
This discussion has been copied to a work item.
May 25, 2013 at 10:01 PM
Hi Marc... I had a chance to look at the code and to think about this... It does in fact look like all calls are made with async: false if the completefunc is set, as @thamera points out. I did a few tests with the code below and watched the sequence of output in the console to validate this...
I think the item that got me is the fact that async is overwritten. I can't think of a reason to change it based on whether the input options have a completefunc defined or not.

My suggestion is to always honor the caller's options, in this case async... The completefunc should always be executed if defined...

Tested with the following:

(function($){

    $.getScript("http://cdnjs.cloudflare.com/ajax/libs/jquery.SPServices/2013.01/jquery.SPServices-2013.01.min.js");

    /**
     * Test deferreds from SPServices 2013.01
     * Using: http://cdnjs.cloudflare.com/ajax/libs/jquery.SPServices/2013.01/jquery.SPServices-2013.01.min.js
     */
    window.test = function() {

        console.log("1) Before SPServices (v" + $().SPServices.Version() + ")");
        
        $().SPServices({
                operation: "GetListItems",
                listName: "states",
                async: true,
                completefunc: function(){
                    
                    console.log("3) completefunc() Done!");
                    
                }
            })
            .then(function(){
                
                console.log("4) .then() Done!");
            
            });
        
        console.log("2) After SPServices call...");
        
    }
    
})(jQuery);

When I removed the completefunc, the output I got was the expected one (1, 2, 4).
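(With the completefunc in place, the output presumably comes out as 1, 3, 4, 2: the forced synchronous call resolves - firing both the completefunc and the already-resolved .then() - before execution reaches the line that logs 2.)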
Coordinator
Jun 8, 2013 at 6:25 PM
Paul:

Sorry for the delay in replying.

I agree that I'm violating my backward compatibility rule here. I've been thinking about it since this thread started and while I'm not respecting the async option setting, I'm not sure that I'll cause people problems.

Can you think of a legitimate case where someone would want to have a completefunc with async set to true? The only use case I can think of is where the code in the completefunc does something that has nothing at all to do with the Web Service call. That makes little sense, of course.

If async is true and there's code in the completefunc that is meant to process the result, it will only function properly sporadically.

The reason I ended up where I did is that async isn't valid with deferred objects, so I wanted to avoid using it if it wasn't valid.

Thanks,
M.
Jun 8, 2013 at 8:47 PM
Marc:
No problem on the response time... I knew you would eventually get around to catching up on this item...

I guess I'm still a little confused as to why the completefunc would "...only function properly sporadically." I have never encountered an issue with it, but I would like to hear about the troubles you have encountered/seen/heard of. And, as you know, I wrap SPServices today in $.Deferred() calls and have not seen any issues - both completefunc processing and Promise methods execute properly and achieve the expected results.
The only issue I can think of is if the xData data (the XMLDocument object) is manipulated in one callback in a way that impacts the expectation of the next callback (e.g., .done() changes it... completefunc() would then possibly not get the real XMLDocument... but even if this is what you are referring to, it has no association with the async setting being true/false).

Taking Promises out of the discussion (I think all you did there was return the jQuery .ajax() return value), the change you made will cause all existing (let's call it "legacy" :) ) code to run in synchronous mode. As you know, in the browser all synchronous calls block the UI, so the user experience will be hurt by this change for anyone who has built significant applications around SPServices. I, for one, cannot just drop this new version into an existing environment anymore, because all existing calls would suddenly lock up the UI/browser. I would have to refactor everything to use promises if I wanted calls to be done with async: true. I have several customizations that do "background" calls for data (e.g., to refresh cached items - used with MVC frameworks) and then act on those values without the user's knowledge... This change would almost certainly introduce a "stutter" effect on the page or cause it to become momentarily unresponsive. Also, cases where multiple pieces of data need to be retrieved from different Lists/Libraries in order to join them together would now take a two-fold or greater hit on performance, because all of those calls would be done serially. Below is an example of a piece of code I lifted from a project I worked on that would see a big performance/UI hit...

Now, back to support of Promises: I agree with you that defining a completefunc in addition to using Promises is silly... they both accomplish the same thing in different ways (execute code later, after the call returns from the server). But completefunc is still a valid input parameter, and its setting should not (IMHO) have an impact on the user's setting of async. And, as you probably know very well, users will use both. Take for example the jQuery Ajax methods... They all now - I think - return a Promise to the caller (with the exception of .load()), but also continue to support a callback, as they did in the past. Setting the callback and/or using the Deferred methods (.then(), .always(), .done(), etc...) has no impact on the synchronous setting of the server call.
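For comparison, a plain jQuery sketch of that behavior (the endpoint URL is illustrative only):
// Attach both a classic success callback and a promise handler;
// neither changes the fact that the request runs asynchronously by default.
$.ajax({
    url: "/_vti_bin/Lists.asmx",       // illustrative endpoint only
    type: "POST",
    dataType: "xml",
    success: function (data, status) {
        // classic callback style
    }
}).done(function (data, status) {
    // promise style on the same request
});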

I also agree that creating/using a Deferred with async: false is again silly and defeats the purpose, but as long as your default settings support what is "right", it falls back on the user to know exactly what they are doing. I understand you are trying to protect the end user, but given your existing user base, you may actually be doing more harm than good (again :) IMHO).

Paul.

(ps. yes... I'm probably not a good candidate to represent the majority of SPServices users... many are just interested in small changes and look for copy-and-paste code... but... your library has enabled me to build some very powerful applications. :) )

Example of UI and performance hit:
The code below would be executed serially and in synchronous mode under v2013.01... The UI would lock up, and the time to complete would probably increase more than two-fold.
$.when(
    // Get Document info.
    $.Deferred(function(dfd){
        
        $().SPServices({
            operation: "GetListItems",
            async:      true,
            listName:   "Documents",
            CAMLQuery:  "<Query><Where><Eq><FieldRef Name='DocNumber'/><Value Type='Text'>" + 
                        k + "</Value></Eq></Where></Query>",
            CAMLRowLimit: 0,
            completefunc: function(xData, Status) {
                
                dfd.resolveWith(xData, [xData, Status]);
                
            }
        });
        
    }),
    // Get Doc. Issues
    $.Deferred(function(dfd){
        
        $().SPServices({
            operation:  "GetListItems",
            async:      true,
            listName:   "Baselined Documents",
            CAMLQuery:  "<Query><Where><Eq><FieldRef Name='DocNumber'/><Value Type='Text'>" + 
                        k + "</Value></Eq></Where></Query>",
            CAMLRowLimit: 0,
            completefunc: function(xData, Status) {
                
                dfd.resolveWith(xData, [xData, Status]);
                
            }
        });
        
    }),
    // Get Author info
    $.Deferred(function(dfd){
        
        $().SPServices({
            operation:      'GetUserProfileByName',
            AccountName:    userAcctName,
            async:          true,
            completefunc:   function (xData, Status){
                
                dfd.resolveWith(xData, [xData, Status]);
                
            }//end:completefunc()
        });
        
    })
)
.done(function(DocInfo, DocIssues, DocAuthor){
    
    // Build Doc./User profile UI
    
    
})
Coordinator
Jun 9, 2013 at 3:25 PM
Perfect. You've told me exactly what I wanted to understand.

I agree that you aren't the typical SPServices user - I've never seen anyone else wrap SPServices calls in deferred objects - but the fact that 2013.01 would break your code is unacceptable, nonetheless.

As for completefunc behaving sporadically, what I meant was that if you don't set async: false and you simply have code in the completefunc (forgetting about deferred objects altogether), then the majority of the time your completefunc would be operating on an empty result. The call simply doesn't complete fast enough. You're smart enough (and smarter than I am in this respect) to have gotten around this deficiency by wrapping your calls in deferred objects.

How about this: I'll release a 2013.01a version to cover just this one - admittedly core - fix? You should then be able to use it along with anyone else who is smart enough to be doing what you are. It won't be a recommended release unless you have this issue, i.e., those who are just using the value added functions simply won't care.

Thanks for taking the time to enlighten me, as always!

M.
Jun 9, 2013 at 4:54 PM
Thanks Marc - for entertaining my response. I'm glad that you are open to discussions like this. If I remember correctly, our first exchange here on CodePlex took place in a similar way (SPCascadingDropdown support for item IDs and the business need).

I'm still not sure of the issue around async: false. I am, in some cases, using it that way as well and have not experienced any problems (that I can tell). I'm going to maybe do some experiments to see if I can detect anything. I may have to revisit past implementations if I find something.
Do you have a code sample of how the problem could surface?

Re: The Release
Don't rush the release out on my account. :) - I can always get around the issue by forking the library (temporarily) if needed, until you roll it in (I did that last with the URL hash (#) issue... I may have to do that for the #SPWhiteHats project - I think I dropped this version in there).
Thanks again. Enjoy your Sunday.

_____
Paul

Sent from mobile device.
Jun 9, 2013 at 6:02 PM
I would recommend a slightly different approach. Issue the fix as a core update with a note that it is a deprecated option to be removed in the next release. The issue for me is not about the right vs. wrong way to handle this. It's about the time it takes to refactor code. Yes, promises are a much better way to handle this, but it will be at least another month before I have time to refactor.
Coordinator
Jun 10, 2013 at 1:16 AM
thamera:

I'm not sure how your suggestion makes sense. The issue is that I've introduced a potentially breaking "feature". I need to issue a fix for that. If you choose to stick with v0.7.2 or go to 2013.01 or 2013.01a based on your needs, those are all fine options. I need to figure out the best approach for the user base at large.

M.
Coordinator
Jul 1, 2013 at 8:03 PM
Paul:

I think the fix for the backward compatibility issue with SPServices and async is as simple as changing from this:
  async: thisHasCompletefunc ? false : opt.async,
to this
  async: opt.async,
as @thamera suggested in the open issue. I should have put this into the first alpha, but I missed it. If you agree this is the best answer, I'll get alpha 2 out there for you to try.

Any other thoughts?

M.
Jul 2, 2013 at 2:34 AM
Marc:

Thanks for posting.

I just reviewed the code and that looks correct to me as well... opt.async starts from $.fn.SPServices.defaults, and the defaults are then overwritten by the user's options on input, so the user's requested mode will always be honored. :)
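For reference, a sketch of the merge being described (this is the usual jQuery defaults pattern, not lifted verbatim from the library):
// defaults first, then the caller's options, so the caller's async setting wins
var opt = $.extend({}, $.fn.SPServices.defaults, options);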

but... :)

In looking around that area of code, I noticed that completefunc does not seem to always receive a consistent set of input params (xData, status)... I would like to do a little more testing on my assumptions before I post anything back to you (soon... I promise)...

Did you want me to post it here? or a new thread if I have question/concerns?

Paul.
Coordinator
Jul 2, 2013 at 12:27 PM
Paul:

Here is fine. I know that there is one inconsistency: when I pull from the cached promises, I give completefunc a null Status. Do you think I should make that "success" to mirror a successful call?

M.
Jul 2, 2013 at 7:53 PM
Marc,
This is going to be long… I did not want to post last night because I wanted to first do some testing so that I could prove my theories/observations... :)

(on a side note, before I begin: good job today at SPChat)

Issue 1: When cacheXML is true, completefunc() is not provided with the ajax “status”

This one is an easy fix… You already have the Deferred object, which is resolved because it was cached… Just "add yourself" onto the chain and call the completefunc… Because the Deferred is already resolved, your callback is executed immediately. Here is the code:
// Call the completefunc if there is one
if(thisHasCompletefunc) {
        
    cachedPromise.done(function(xData, status){

        opt.completefunc(xData, status);
      
    });
        
}
// Return the cached promise
return cachedPromise;
You don't want to force the status to be successful, because you don't currently have code that prevents caching the request if its status is 'error'… So even if the request failed to be sent to the server (e.g., invalid XML), it would still be cached (more on caching below)…
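One way to avoid replaying a failed request from the cache (just a sketch built on the jqXHR promise, reusing the cachedPromise/promisesCache/msg names that appear in the code in this thread; not code from the library):
// If the underlying request errors out, drop the cached entry so a later
// call with cacheXML: true retries instead of replaying the failure.
cachedPromise.fail(function () {
    delete promisesCache[msg];
});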

When I initially saw this yesterday, I was concerned as to why I had not seen it in the application I'm mostly working with these days… Especially since I do have instances where I use cacheXML: true (e.g., retrieving list information) and I normally check the status field for "error" or "success"... Answer: I'm still using v0.7.2, which does not have this issue – there, you were using the cacheXML input option to determine whether it should be cached or not.

(I also checked the "Black Magic for White Hats SharePoint" project I recently did, and it looks like I check for "error" explicitly rather than doing something like "if not success", so I'm good there as well)

Issue 2: SPServices is always caching requests

This one is the more concerning one, and the one I wanted to do more testing on… SPServices currently caches all Ajax requests regardless of whether the user sets cacheXML or not! (all operations – queries and updates alike)

My concerns are around memory and performance in the browser… From a functional standpoint, you still return the proper response based on the cacheXML setting.
For a "casual" user this will not be an issue – they make a few calls to SPServices… no problem… but… What about SPAs (single-page applications)? These can make hundreds, if not thousands, of calls for data during the life of the browser (tab) session… The memory footprint can grow to be very large. This all seems to have been introduced with the move to supporting Promises.

(At this point I should say: shame on me… for not contributing to your request for feedback when you first implemented the cacheXML and Promises functionality… especially since I have a vested interest in this library for my own projects :) )

I have a code suggestion which supports the following (personal) feeling toward caching – which echoes the principle behind the original post about the async flag:
  1. Only cache when asked (cacheXML: true)
  2. Keep cached version until a subsequent request comes through with cacheXML == false… at this point: delete the cached version from memory.
I’m not going to worry about “cacheXML” being true by default (in the defaults object), which would make updates (ex. UpdateListItems) not actually make updates in some cases… (in SPA pages, it could happen). I would suggest adding something to the documentation under the cacheXML option that most likely 1) it should not be set globally... and 2) should probably not be used on updates.


Proposed Changes:

The change is fairly simple and only moves around some of the variables you already defined… basically, you assign the Ajax object to cachedPromise and only add it to the promisesCache[] object if cacheXML is true. I took your code from v2013.01 and made only a few edits... Here it is:
// Check to see if we've already cached the results
var cachedPromise;
if(opt.cacheXML) {
    cachedPromise = promisesCache[msg];
}

// If we don't have a completefunc, then we won't attempt to call it
var thisHasCompletefunc = $.isFunction(opt.completefunc);

if(typeof cachedPromise === "undefined") {

    // Finally, make the Ajax call
    cachedPromise = $.ajax({
        // The relative URL for the AJAX call
        url: ajaxURL,
        // By default, the AJAX calls are asynchronous.  You can specify false to require a synchronous call.
        async: thisHasCompletefunc ? false : opt.async,
        // Before sending the msg, need to send the request header
        beforeSend: function (xhr) {
            // If we need to pass the SOAPAction, do so
            if(WSops[opt.operation][1]) {
                xhr.setRequestHeader("SOAPAction", SOAPAction);
            }
        },
        // Always a POST
        type: "POST",
        // Here is the SOAP request we've built above
        data: msg,
        // We're getting XML; tell jQuery so that it doesn't need to do a best guess
        dataType: "xml",
        // and this is its content type
        contentType: "text/xml;charset='utf-8'",
        complete: function(xData, Status) {
            // When the call is complete, call the completefunc if there is one
            if(thisHasCompletefunc) {
                opt.completefunc(xData, Status);
            }
        }
    });
    
    // If cache is true, then store the ajax request
    if (opt.cacheXML) {
    
        promisesCache[msg] = cachedPromise;
    
    // ELSE, cache must be false... Remove cached version (case it exists)
    } else if (promisesCache[msg]) {
        
        delete promisesCache[msg];
        
    }
    
    // Return the promise
    return cachedPromise;

}
I wish I had line numbers to tell you exactly what I'm suggesting... Hopefully it's not too hard to pick out...

Testing Code:

I created the script below, loaded it dynamically through Firebug, and played with it from the console by calling the functions…
/**
 * Validate theory of v2013.01:
 *      1.  Always caches requests, regardless of cacheXML settings.
 *          Memory/Performance impact.
 *      2.  completefunc() receives different input params when using cached request.
 * 
 *
 */

// Load script via dom so that we can see it in firebug
function loadSPServices(url) {

    // Insert SPServices into the page
    // Need to use a non-minified version for debugging/stop points.
    var head    = document.head || document.getElementsByTagName("head")[0] || document.documentElement,
        s       = document.createElement("script");
    s.type      = "text/javascript";
    s.src       = url;
    s.onload = s.onreadystatechange = function( _, isAbort ) {
        if ( isAbort || !s.readyState || /loaded|complete/.test( s.readyState ) ) {
        
            s.onload = s.onreadystatechange = null;
            s = undefined;
            
            console.log( "SPServices v" + $().SPServices.Version() );

        }
    };
    head.insertBefore(s, head.firstChild);
        
}

function makeSPCall(options, loopCount, list) {
    
    if (!loopCount) {
        
        loopCount = 1;
        
    }
        
    var n   = 1,
        opt = $.extend(
                {},
                {
                    operation: "GetListItems",
                    listName:   "Announcements",
                    cacheXML:   false,
                    CAMLRowLimit: 1,
                    completefunc: function(xData, status){
                        
                        console.log("status: " + status);
                        console.log("xData: ");
                        console.log(xData);
                        
                    }
                },
                options
            );
    
    
    
    for(; n<=loopCount; n++){
      
        $().SPServices( $.extend(opt, { CAMLRowLimit: n }) );
      
    };
    
}


function makeUpdate(itemId, cache) {
    
    $().SPServices({
        operation:  "UpdateListItems",
        listName:   "Announcements",
        cacheXML:   (cache === true ? true : false),
        batchCmd:   "Update",
        ID:         itemId,
        valuepairs: [
            ["Title", "Title updated..."]
        ],
        completefunc: function(xData, status){
            
            console.log("status: " + status);
            console.log("xData: ");
            console.log(xData);
            
        }
    });

}


console.log("TEST FILE: Loaded");
Hope this helps in explaining my concerns and in testing... On the plus side: I used Chrome's profiling tools to take heap snapshots... I eventually crashed Chrome... :) (a few times)


Paul.
Coordinator
Jul 2, 2013 at 10:06 PM
Edited Jul 2, 2013 at 10:07 PM
Paul:

You're awesome for taking a look at this, though from the looks of it you've found that I've done something pretty horrible with caching. I tested it over and over and I was sure I had it right, but clearly I don't.

I'll take a look at what you've sent. You'll get a big credit for this!

M.
Coordinator
Sep 18, 2013 at 4:06 AM
This discussion has been copied to a work item.
Coordinator
Oct 24, 2014 at 3:58 AM
Paul:

I'm trying to figure out if this is still happening in the 2014.02 that I'm working on, and I don't think it is. I refactored the code in 2014.01 and it seems to me that it's healthier. If you have the time and the inclination to look at the latest alpha to see if you agree, I'd appreciate it.

Thanks,
M.
Oct 25, 2014 at 12:28 AM
Sure Marc. I'll try to do it this weekend.
Oct 25, 2014 at 4:33 PM
Marc,
Just did a code review of the last alpha (ALPHA6), and both problems still exist...
Problem #2 above (the more critical one, in my opinion) is caused by line 1297:
        if (typeof cachedPromise === "undefined") {

            // Finally, make the Ajax call
            promisesCache[msg] = $.ajax({
You are automatically caching all requests with the statement above (promisesCache[msg]), regardless of whether cacheXML is true or not... Over time (especially with SPAs), the memory footprint will blow up (a memory leak)...
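A minimal sketch of the same guard suggested earlier in this thread, applied to the quoted lines (cachedPromise, promisesCache, msg, and opt come from the quoted code; ajaxSettings is only a stand-in for the full options object, not a real variable in the library; untested against ALPHA6):
        if (typeof cachedPromise === "undefined") {

            // Make the Ajax call as before, but only store the promise when caching was requested
            cachedPromise = $.ajax(ajaxSettings);
            if (opt.cacheXML) {
                promisesCache[msg] = cachedPromise;
            }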
I created a test file to show you this (issue #2), here: https://gist.github.com/purtuga/3761c35d55e787e7918a

File is also on your O365 account, where you can test... under: https://YOUR-O365-SP-ACCOUNT-ID.sharepoint.com/sites/PT2013/Shared%20Documents/spservices.forum.444799.test.aspx

Hope this helps.

/Paul.
Coordinator
Oct 25, 2014 at 10:04 PM
Thanks! I was having a little trouble wading through the thread. I'm on it. You're awesome.

M.
Coordinator
Oct 25, 2014 at 10:16 PM
Your gist absolutely proves it and I understand the problem now. Yikes. Luckily, most of the time people aren't making that many calls, but you're right about SPAs.

M.
Oct 25, 2014 at 10:21 PM
Yeah, it's pretty serious... In one project at work (a scrum management system), I'm using a forked version of SPServices with the fix I mentioned above... it's an SPA that users probably have open all day and that does data pulls every x seconds...
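To make the leak concrete, here is a hypothetical polling loop of the kind described (the list name, interval, and query are all made up). Each poll builds a slightly different SOAP message, so under the always-cache behavior every one of these requests becomes a new entry in promisesCache that is never released:
// Hypothetical SPA-style refresh: pull items modified since the last poll.
var lastPoll = new Date().toISOString();
setInterval(function () {
    $().SPServices({
        operation: "GetListItems",
        listName: "Tasks",                 // placeholder list name
        async: true,
        cacheXML: false,                   // the caller is not asking for caching
        CAMLQuery: "<Query><Where><Geq><FieldRef Name='Modified'/>" +
                   "<Value Type='DateTime' IncludeTimeValue='TRUE'>" + lastPoll +
                   "</Value></Geq></Where></Query>",
        completefunc: function (xData, Status) {
            // merge any changed items into the page's model
        }
    });
    lastPoll = new Date().toISOString();
}, 30000);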

and... NO, NO, NO - You and SPServices are awesome! :)
Coordinator
Oct 25, 2014 at 10:32 PM
Not when I see this kind of boneheaded code, I'm not!

M.