GetListItems - is web recursion possible?

May 18, 2010 at 8:27 AM
Edited May 18, 2010 at 8:28 AM

Good morning folks,

I have happily used a DVWP to get data from a List in all sub-webs by setting the DataSource with DataSourceMode="CrossList" and adding "<Webs Scope="Recursive"></Webs>" into the SelectCommand XML.
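The SelectCommand XML I used looks roughly like this (a sketch - the list template ID, query, and field names are simplified to placeholders):

```xml
<View>
  <!-- Query every web below the current site -->
  <Webs Scope="Recursive"></Webs>
  <!-- 100 = generic custom list; adjust ServerTemplate to your list type -->
  <Lists ServerTemplate="100"></Lists>
  <Query>
    <OrderBy><FieldRef Name="Title" /></OrderBy>
  </Query>
  <ViewFields>
    <FieldRef Name="Title" />
  </ViewFields>
</View>
```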

I'd really like to be able to do something similar using a JavaScript call - e.g. to GetListItems. Having looked at the documentation on MSDN, the most hopeful option looks like ViewAttributes in the QueryOptions; however, that only controls how Folders within the specific List are treated.

I suppose I could build up a whole series of AJAX calls using the Webs web service: call GetAllSubWebCollection, then within each of the returned webs call GetListItems, and build up an XML document of the results.
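Something along these lines, I suppose (a rough sketch assuming jQuery is on the page - the soapEnvelope() helper just builds the request body, and the list name would be a placeholder):

```javascript
// Pure helper: build a SOAP 1.1 envelope for a SharePoint web service call.
function soapEnvelope(operation, innerXml) {
  return '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">' +
         '<soap:Body>' +
         '<' + operation + ' xmlns="http://schemas.microsoft.com/sharepoint/soap/">' +
         innerXml +
         '</' + operation + '>' +
         '</soap:Body></soap:Envelope>';
}

// Browser-only: GetAllSubWebCollection returns the whole tree of subwebs in a
// single call, so only one request to Webs.asmx is needed here.
function getAllSubWebs(siteUrl, callback) {
  $.ajax({
    url: siteUrl + "/_vti_bin/Webs.asmx",
    type: "POST",
    dataType: "xml",
    contentType: "text/xml; charset=utf-8",
    data: soapEnvelope("GetAllSubWebCollection", ""),
    beforeSend: function (xhr) {
      xhr.setRequestHeader("SOAPAction",
        "http://schemas.microsoft.com/sharepoint/soap/GetAllSubWebCollection");
    },
    success: function (xml) {
      var urls = [];
      $(xml).find("Web").each(function () { urls.push($(this).attr("Url")); });
      callback(urls);
    }
  });
}

// Browser-only: call GetListItems against each web and hand rows to the caller.
function getItemsFromEveryWeb(siteUrl, listName, rowCallback) {
  getAllSubWebs(siteUrl, function (urls) {
    $.each(urls, function (i, webUrl) {
      $.ajax({
        url: webUrl + "/_vti_bin/Lists.asmx",
        type: "POST",
        dataType: "xml",
        contentType: "text/xml; charset=utf-8",
        data: soapEnvelope("GetListItems", "<listName>" + listName + "</listName>"),
        beforeSend: function (xhr) {
          xhr.setRequestHeader("SOAPAction",
            "http://schemas.microsoft.com/sharepoint/soap/GetListItems");
        },
        success: function (xml) {
          $(xml).find("z\\:row, row").each(function () { rowCallback(webUrl, this); });
        }
      });
    });
  });
}
```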

Is there another (easier!) way to achieve this? Have I missed something in the options that I could use??

Thanks,
Alex

Coordinator
May 18, 2010 at 1:16 PM

The second method is the right one, based on my experience. There isn't any equivalent of "CrossList" in the Web Services that I'm aware of.

M.

May 19, 2010 at 7:01 PM

I had the same problem: I wanted to "AJAX load" an aggregated view of a list type onto a page. The problem was there were too many sites to make web service calls to every single one (around 50 subsites - too slow). The ugly workaround I ended up with was creating a blank .aspx page with a CrossList/Recursive DVWP, which I loaded into my page using the jQuery $.get() function. It worked better than expected.
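Roughly like this (page path, marker comments, and placeholder id are made up for the example; jQuery assumed):

```javascript
// Pure helper: pull one section's markup out of a fetched page by a pair of
// marker strings (a crude substring scan, which is fine for a page you control).
function extractSection(html, startMarker, endMarker) {
  var start = html.indexOf(startMarker);
  var end = html.indexOf(endMarker, start);
  if (start === -1 || end === -1) { return ""; }
  return html.substring(start, end + endMarker.length);
}

// Browser-only: load the blank page hosting the CrossList DVWP and inject the
// rendered markup into a placeholder on the current page.
function loadAggregatedView() {
  $.get("/AggregationPages/ProjectRollup.aspx", function (html) {
    $("#rollupPlaceholder").html(
      extractSection(html, "<!--DVWP-START-->", "<!--DVWP-END-->")
    );
  });
}
```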

/Fredrik

Coordinator
May 19, 2010 at 7:55 PM

Was there a specific reason either of you didn't just stick to the DVWP with DataSourceMode="CrossList"? There's no need for jQuery at all in that scenario. CrossList DVWPs are very efficient, in my experience.

M.

May 19, 2010 at 8:18 PM

First, I wanted to use the list view's filters/drill-downs (via URL filters in the $.get() call) without reloading the whole page. Second, I had a few different DVWPs on the page (quite slow and heavy, joining several long lists). The page felt like it loaded much faster when I used async AJAX to load the content, because the components are generated in parallel (I think?). Also, loading drop-down filters with $.get() calling _layouts/filter.aspx directly (which returns the unique values of each column) was much faster than using a DVWP to get those.

/F

Coordinator
May 20, 2010 at 2:43 AM

fereko:

Interesting. Sounds like you've done a great job "AJAXifying" the page. The AJAX calls can be async, which it sounds like you have done. Of course, even though the page loads faster initially, there will still be a lag as those AJAX calls finish up.

Another way to think about it (depending on your data volumes) is to take the hit up front for the data load and do the filtering entirely client-side in the DOM.
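A rough sketch of what I mean, assuming the rows are already loaded (field names, row markup, and element ids are placeholders):

```javascript
// Pure helper: filter already-loaded items by one field's value.
function filterRows(rows, field, value) {
  return rows.filter(function (row) { return row[field] === value; });
}

// Browser-only: hide/show rows already in the DOM instead of re-querying the
// server, keyed off a data-* attribute stamped on each row at load time.
function applyFilter(field, value) {
  $("#resultsTable tr[data-" + field + "]").each(function () {
    $(this).toggle($(this).attr("data-" + field) === value);
  });
}
```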

M.

May 20, 2010 at 8:24 AM

Too much data for client side filtering unfortunately :-(

There will be a lag, but the total load time will be less. I'm not 100% sure, but I think we can have, say, 10 server calls running at the same time when making async calls, and the web parts will then be generated (server-side) simultaneously instead of one by one.

I also used this approach when I wanted OC integration for the People fields (and some SharePoint XSLT functions) but still wanted a centrally managed web part. As far as I know, a normal DVWP can only put the XSL in a separate file, so if you want to change the data source, add/remove a column, or something else, you don't want to update it on 100 subsites. It could also be used for "paging" without reloading the whole page. I guess this is as far from classic SharePoint best practice as you can get, but it actually works well for some (not all) purposes.

May 21, 2010 at 9:48 AM

Interesting thread now!

fereko - do you load data from another page using $.get()? That sounds quite interesting...

My immediate requirement is much simpler! I have a hierarchy of sites in the form Root >> Programme >> Project. From this hierarchy, I want to get a list of projects & programmes (e.g. in a cascaded drop-down list).

Given that the projects can be created by end-users, I don't really want to have to ask them to also add an entry into a List for the project they have just created. However, projects are not so dynamic that I need to query across the hierarchy every time!

I've thought about a few solutions, some more "hacky" than others...

  1. make the whole series of recursive AJAX calls to build the list at runtime (might be a bit slow)
  2. store the project / programme info in another list (additional effort for end users)
  3. run a batch process (e.g. hourly) that populates the list from option 2 by recursing through the hierarchy (though not hooked into a "new site" event)

Right now I have gone with option 2 as it was the simplest and doesn't stop me from doing something different later...
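For what it's worth, the cascading drop-down part of option 2 can be sketched like this (element ids and field names are placeholders; jQuery assumed):

```javascript
// Pure helper: group flat {Programme, Project} items by programme.
function groupByProgramme(items) {
  var groups = {};
  items.forEach(function (item) {
    if (!groups[item.Programme]) { groups[item.Programme] = []; }
    groups[item.Programme].push(item.Project);
  });
  return groups;
}

// Browser-only: when a programme is picked, repopulate the project drop-down.
function wireCascade(items) {
  var groups = groupByProgramme(items);
  $("#programmeSelect").change(function () {
    var projects = groups[$(this).val()] || [];
    var options = projects.map(function (p) {
      return "<option>" + p + "</option>";
    }).join("");
    $("#projectSelect").html(options);
  });
}
```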

Alex

May 25, 2010 at 9:35 PM

Yes, it's possible to load data from any page within the same domain, but it's better to use an empty .aspx page where you put a DVWP (the form tags must be removed in the "empty" page so they don't cause errors). In this case it might just be easier to use GetAllSubWebCollection or GetWebCollection - see http://spservices.codeplex.com/wikipage?title=Webs&referringTitle=%24%28%29.SPServices
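A rough sketch of the SPServices variant (assumes jQuery and SPServices are loaded; parseWebUrls() is just a helper for pulling the Url attributes out of a response string):

```javascript
// Pure helper: extract Url attributes from a Webs response string with a regex
// (on a real page you would query the response XML with jQuery instead).
function parseWebUrls(xml) {
  var urls = [];
  var re = /<Web\b[^>]*\bUrl="([^"]*)"/g;
  var m;
  while ((m = re.exec(xml)) !== null) { urls.push(m[1]); }
  return urls;
}

// Browser-only: list the current site's subwebs via SPServices.
function listSubWebs() {
  $().SPServices({
    operation: "GetWebCollection",
    async: true,
    completefunc: function (xData) {
      $(xData.responseXML).find("Web").each(function () {
        console.log($(this).attr("Title") + " -> " + $(this).attr("Url"));
      });
    }
  });
}
```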

If you need more information than you get from those web services, you can use a central list to store it, with a site id/URL in one column. That information can then be displayed filtered on each subsite (and even updated from there).

/Fredrik