Tuesday, August 2, 2011

Faking a file upload to Sharepoint

I am working on a project that requires me to do some heavy lifting with javascript in Sharepoint.  The project involves using a custom forms web part (which mimics the old Microsoft CMS forms functionality) to create a page where a user can browse a list of PDFs and opt to receive an email with a formatted set of selected PDF articles (links to them, anyway, with a title and description).

The problem with this web part is that you need to edit the page, paste in the XML, and then hit a button in the form to submit before you publish the page.  Hitting this button can't be simulated (I'm going to call it the Evil button from now on).  Further, sending out the custom email doesn't rely on adding an item to a list: you need to upload an XSLT file that parses the form submission.

Since this "application" is supposed to eventually be presented at trade shows (as well as openly on the web), there is the distinct possibility of javascript injection.  The forms web part is pretty rigid in what it allows, but how would we transfer the PDF titles and abstracts over to the XSLT file?  It would be simple if we edited the XML and XSLT manually each time, as we could just code the values right in, but for a better experience I am opting to let the client use a custom list to hold all the PDF information.  This is then pulled into the page via a content query web part (which I now use in place of SPServices in certain situations: when javascript is disabled and/or caching is needed).

The problem now is getting the PDF details into the XML and XSLT without relying on them being transferred through the form itself, since the form can be manipulated via javascript to send out arbitrary links or text in emails (worse, coming from the trusted company domain!).

The best solution I came up with was a compromise when editing the form.  After changes are made to the custom list, someone has to edit the form page and hit some buttons to auto-generate the XML and XSLT through javascript, then hit the Evil button to save the form definition.  They then have to copy the XSLT to a local file and upload it to the right place.  The PDF title, abstract and link are all contained in the inaccessible XSLT file and only the ID is displayed on the form, which lets us determine which PDF was chosen on submission - just what we want.

On digging a bit further, I found that you are able to fake a file upload to MOSS via the Copy web service.  The web service has a loophole: while copying, you can change the metadata of what is being copied, which includes the contents of the file.  Using SPServices, I was able to inject the XSLT contents into a fake file copy (pretending to copy a file in from a different location), so that the file contained my XSLT afterwards.  The thing about the Copy web service is that it completely ignores the initial file being copied if you specify the file contents yourself, as a Base64-encoded stream.
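A minimal sketch of the idea, assuming SPServices and jQuery are loaded on the page - the destination URL and the `uploadXslt` helper are my own placeholders, and the parameter names follow the underlying Copy.asmx CopyIntoItems call:

```javascript
// Encode a string as Base64.  Node's Buffer is used here for illustration;
// in the browser you would use btoa() or a Base64 shim instead.
function toBase64(str) {
  if (typeof Buffer !== "undefined") {
    return Buffer.from(str, "utf8").toString("base64");
  }
  return btoa(str);
}

// "Copy" a file onto itself while supplying a Stream - the Copy web
// service ignores the source file and writes the Stream contents,
// effectively faking an upload of the generated XSLT.
function uploadXslt(xsltText, destinationUrl) {
  $().SPServices({
    operation: "CopyIntoItems",
    SourceUrl: destinationUrl,          // ignored when Stream is supplied
    DestinationUrls: [destinationUrl],  // e.g. a file in a document library
    Stream: toBase64(xsltText),
    completefunc: function (xData, Status) {
      // Status reports whether the copy (and content injection) worked
      console.log("Upload status: " + Status);
    }
  });
}
```

The call runs under the current user's credentials, which is why the modified-by information on the file comes out right.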

So converting the XSLT to Base64 and injecting it during the copy let me bypass the need for someone to manually download the XSLT and upload a file to MOSS, which could have opened the door to a great many things-that-could-and-probably-would-go-wrong.  The injection retains my logged-in user information, so the XSLT file shows that I modified it and indicates the time of the modification.  The copy also creates a new version of the document when it finds a duplicate, so we can keep the filename and just copy over it repeatedly.

Now the whole process has been distilled to editing the page, pushing the Evil button (all the parsing is done on page load, reading in the latest values from the custom list and generating the XML and XSLT), and then pushing the "upload XSLT" button.  The Evil button causes the page to reload, but the upload button doesn't - it just updates a status when done.  The user then publishes the page and the form is ready for the end-user.

This whole process took a week to engineer.  I learned a few very useful things this time - the Copy web service is quite powerful, forms architecture is much better handled as a custom list item addition triggering an email workflow, and almost anything is possible in middle-tier development!
