Large file upload into WSS v3

large-files sharepoint upload wss-3.0

I've built a WSSv3 application which uploads files in small chunks; as each data piece arrives, I temporarily store it in a SQL Server 2005 image data type field, for performance reasons**.

The problem comes when the upload ends: I need to move the data from SQL Server into a SharePoint document library through the WSSv3 object model.

Right now, I can think of two approaches:

SPFileCollection.Add(string, (byte[])reader[0]); // OutOfMemoryException

and

SPFile file = folder.Files.Add("filename", new byte[]{ });
using(Stream stream = file.OpenBinaryStream())
{
    // ... init vars and stuff ...
    while ((bytes = reader.GetBytes(0, offset, buffer, 0, BUFFER_SIZE)) > 0)
    {
        stream.Write(buffer, 0, (int)bytes); // Timeout issues
        offset += bytes; // advance through the source column
    }
    file.SaveBinary(stream);
}

Is there any other way to complete this task successfully?

** Performance reasons: if you try to write every chunk directly to SharePoint, you'll notice performance degradation as the file grows (>100 MB).

Best Answer

I ended up with the following code:


myFolder.Files.Add("filename", 
   new DataRecordStream(dataReader, 
      dataReader.GetOrdinal("Content"), length));

You can find the DataRecordStream implementation here. It's basically a Stream that reads its data from a DbDataRecord through .GetBytes.

This approach is similar to OpenBinaryStream()/SaveBinary(stream), but it doesn't keep the entire byte[] in memory while transferring the data. At some point, the DataRecordStream will be consumed by Microsoft.SharePoint.SPFile.CloneStreamToSPFileStream in 64k chunks.
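
For reference, here's a minimal sketch of what such a stream might look like: a read-only, forward-only Stream over an IDataRecord column, assuming the reader was opened with CommandBehavior.SequentialAccess and the total length is known up front. The class and member names below are illustrative, not the actual implementation behind the link.

using System;
using System.Data;
using System.IO;

public class DataRecordStream : Stream
{
    private readonly IDataRecord _record;
    private readonly int _ordinal;
    private readonly long _length;
    private long _position;

    public DataRecordStream(IDataRecord record, int ordinal, long length)
    {
        _record = record;
        _ordinal = ordinal;
        _length = length;
    }

    public override int Read(byte[] buffer, int offset, int count)
    {
        // GetBytes copies up to 'count' bytes starting at the current
        // position and returns the number of bytes actually copied.
        long remaining = _length - _position;
        if (remaining <= 0) return 0;
        int toRead = (int)Math.Min(count, remaining);
        long read = _record.GetBytes(_ordinal, _position, buffer, offset, toRead);
        _position += read;
        return (int)read;
    }

    public override bool CanRead { get { return true; } }
    public override bool CanSeek { get { return false; } }
    public override bool CanWrite { get { return false; } }
    public override long Length { get { return _length; } }
    public override long Position
    {
        get { return _position; }
        set { throw new NotSupportedException(); }
    }

    public override void Flush() { }
    public override long Seek(long offset, SeekOrigin origin) { throw new NotSupportedException(); }
    public override void SetLength(long value) { throw new NotSupportedException(); }
    public override void Write(byte[] buffer, int offset, int count) { throw new NotSupportedException(); }
}

Because SPFileCollection.Add reads the stream in small chunks, only one buffer's worth of data is in memory at a time, which avoids the OutOfMemoryException of the first approach.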

Thank you all for the valuable info!
