Resumable Uploader

The Problem

  1. The uploaded content is video. Raw, uncompressed videos are usually uploaded directly from the camcorder or from DVD. On average, the upload size is around 4 to 5 GB
  2. We have always focused on providing the service as a web application (no installations, just a browser, since our users are coaches on school devices where they may not have administrative privileges to install applications), and we do not have any desktop application that could help with the upload and resume process. One may be desirable, and at some point we might end up building one, but we have none for now
  3. We cannot really control upload speeds at the user's end, and most Internet Service Providers out there provide extremely low upload bandwidth
  4. So uploads take hours. And the laptops, desktops, or other machines used for uploads may go into sleep/hibernate (especially on Windows) or simply run out of battery (in the case of laptops). For any number of reasons, the upload process might break!
  5. Hence we needed 'resume upload' functionality. We did have it. Or did we? Well, if an upload of 5 files broke on the 3rd, next time you would only need to upload the 3 remaining files, though the 3rd had to be started over from scratch. And we had to notify the user about the failed upload, and the user had to select the remaining files and re-initiate the upload for them
  6. So after spending an hour or more uploading, if it broke for any reason, it was quite frustrating for the end user to deal with the broken upload and figure out how to resume
  7. While an upload is in progress, the user cannot navigate to any other page. We tried to solve this with a small pop-up that gets triggered when the user starts the upload process. But who doesn't hate pop-ups?
The Approach

  1. Save the start-state of your upload, containing all information about the upload, even before the upload starts
  2. Start the upload and constantly update the upload-state information as each part of the upload completes
  3. In case the upload broke and has to be resumed, implement a mechanism at the server side that:
     - automatically detects that there is a broken upload for a particular user when that user revisits (be user-friendly)
     - provides a server-side service (API) that can be called for the upload-state of a particular upload
     - triggers the resume-upload process
  4. Once the resume-upload process is triggered, it needs access to the content that did not get uploaded completely. The browser does NOT allow that, for obvious security reasons, so you need to tackle that problem
  5. Finally, having access to the content locally on the user's machine and the upload-state from the server-side API, resume the upload process
The Implementation

  1. There's a JavaScript AJAX call that sends all the upload information before the upload actually starts
  2. In return, the server-side API provides a GUID, i.e. the upload identifier for this particular upload. The upload-state information is saved and identified by this GUID henceforth
  3. JavaScript triggers the upload process via a Silverlight component
  4. The Silverlight component actually spawns 2 worker threads: the 1st uploads the content to the remote server chunk by chunk, and the 2nd copies the content locally into the user's 'Isolated Storage', where Silverlight has access permissions (using Silverlight's Isolated Storage was our answer to the browser's restriction against direct access to the user's file system)
  5. The local Isolated Storage copy is done wisely. It copies in reverse order (i.e. the last parts are copied first) and constantly checks whether it has reached a part that has already been uploaded. If so, it simply stops, because there is already enough content copied locally in Isolated Storage to resume the upload
  6. On every page, there's a lightweight service that checks whether there is any upload to be resumed for the user. If there is, it triggers the resume
  7. This gives the user the ability not just to resume the upload, but to browse to any page while the upload continues under the hood. Of course, this assumes the local copy has already finished (for most practical purposes, the local copy takes less than 5 minutes, so we simply tell the user that the 'upload is being prepared' during this time, and then actually start showing the upload progress)
  8. There's also a JavaScript monitor thread that constantly monitors the upload, calculates the upload speed, and estimates how much time the complete upload might take
  9. For the resume, there's a server-side API that pulls the upload-state information from the database and retrieves the uploaded-file information from the local filesystem. That lets the Silverlight client determine exactly which byte it should resume the upload at
  10. To add more to the mix: to make uploads scalable, they need to happen on different machines, so that many users uploading at the same time (burst traffic) don't bottleneck on one machine's upload bandwidth. This complicates resuming, because the upload must be resumed on the particular server where it was started
  11. For video content, you will most likely want to transcode and compress the uploaded files. In our case, we even 'stitch' those videos into a single file corresponding to a game. To do all that, you definitely want a given upload to end up completely on the same machine, which keeps transcoding and stitching relatively simple
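Two pieces of the implementation above lend themselves to a small sketch: computing the exact resume offset from what the server already has on disk (item 9), and the reverse-order local copy that stops at the already-uploaded boundary (item 5). The function names and the chunk size are illustrative assumptions, not the actual Silverlight code.

```javascript
// Assumed chunk size for the sketch; the real value is an implementation detail.
const CHUNK_SIZE = 4 * 1024 * 1024; // 4 MB

// Server side (item 9): bytes already persisted for a file of this upload.
// In real code this would come from the filesystem, e.g. fs.statSync(...).size.
function bytesAlreadyUploaded(serverFileSizes, fileName) {
  return serverFileSizes[fileName] || 0;
}

// Client side: given the file size and the server's resume offset,
// list the chunks that still need to be sent, starting at that exact byte.
function remainingChunks(fileSize, resumeAt) {
  const chunks = [];
  for (let offset = resumeAt; offset < fileSize; offset += CHUNK_SIZE) {
    chunks.push({ offset, length: Math.min(CHUNK_SIZE, fileSize - offset) });
  }
  return chunks;
}

// Item 5's reverse-order local copy: copy parts from the end of the file
// backwards, and stop once we reach a part the server already has, since
// everything before that boundary will never need to be re-sent.
function partsToCopyLocally(totalParts, partsUploaded) {
  const parts = [];
  for (let p = totalParts - 1; p >= partsUploaded; p--) {
    parts.push(p);
  }
  return parts;
}
```

For example, a 10 MB file with 4 MB already on the server resumes at byte 4,194,304 and needs two more chunks; a 5-part file with 2 parts uploaded only copies parts 4, 3, and 2 into local storage, in that order.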

Sandip Chaudhari

Tech evangelist with an entrepreneurial bent of mind