Implementing High‑Performance Large File HTTP Upload with Resumable Support Using JavaScript and C
This article explains how to build a high‑performance large‑file HTTP upload service with resumable support. The solution combines a C‑implemented server that writes uploaded data directly to disk, client‑side hash generation to identify files, cookie‑based browser IDs, AJAX queries for the already‑uploaded size, and JavaScript code that slices files and tracks upload speed and progress.
Server Side
The server is written in C rather than interpreted languages to avoid extra memory buffering and to write directly to disk.
Client‑Side Hash Generation
The browser generates a unique hash for each file using a combination of a cookie‑stored browser ID, file modification time, name, and size. The hash is calculated with MD5 (or a similar algorithm) without reading the entire file content.
function setCookie(cname, cvalue, exdays) {
    var d = new Date();
    d.setTime(d.getTime() + (exdays * 24 * 60 * 60 * 1000));
    var expires = "expires=" + d.toUTCString();
    document.cookie = cname + "=" + cvalue + "; " + expires;
}
function getCookie(cname) {
    var name = cname + "=";
    var ca = document.cookie.split(';');
    for (var i = 0; i < ca.length; i++) {
        var c = ca[i].trim();
        if (c.indexOf(name) === 0) return c.substring(name.length);
    }
    return "";
}
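The article does not show the `getFileId` function referenced later. The following is a minimal sketch consistent with the description above (browser ID plus modification time, name, and size); the function names and the use of a 32‑bit FNV‑1a hash in place of MD5 are assumptions for illustration:

```javascript
// Hypothetical stand-in for MD5: a 32-bit FNV-1a hash of a string.
// Only file metadata is hashed, so the file content is never read.
function hashString(str) {
    var h = 0x811c9dc5; // FNV-1a offset basis
    for (var i = 0; i < str.length; i++) {
        h ^= str.charCodeAt(i);
        h = Math.imul(h, 0x01000193) >>> 0; // FNV prime, keep 32 bits
    }
    return h.toString(16);
}

// Hypothetical getFileId: combines the cookie-stored browser ID with
// the file's last-modified time, name, and size, as described above.
function getFileId(fileObj) {
    var browserId = getCookie('browserid');
    if (!browserId) {
        browserId = hashString(navigator.userAgent + Math.random() + Date.now());
        setCookie('browserid', browserId, 365);
    }
    return hashString(browserId + '|' + fileObj.lastModified + '|' +
                      fileObj.name + '|' + fileObj.size);
}
```

Because only metadata is hashed, the same file re‑selected by the same browser yields the same ID, which is what makes the server‑side resume lookup possible.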
Querying Upload Progress
Before uploading, the client queries the server for the already uploaded size using the file hash, then resumes from that offset.
var fileObj = currentfile;
var fileid = getFileId(fileObj);
// Cache-busting timestamp; the server looks up progress by file hash.
var t = (new Date()).getTime();
var url = resume_info_url + '?fileid=' + fileid + '&t=' + t;
var ajax = new XMLHttpRequest();
ajax.onreadystatechange = function () {
    if (this.readyState == 4) {
        if (this.status == 200) {
            var result = JSON.parse(this.responseText);
            var uploadedBytes = (result.file && result.file.size) || 0;
            if (!result.file || (!result.file.finished && uploadedBytes < fileObj.size)) {
                // New file, or partially uploaded: resume from the reported offset.
                upload_file(fileObj, uploadedBytes, fileid);
            } else {
                showUploadedFile(result.file);
            }
        } else {
            alert('Failed to fetch resumable-upload info for the file');
        }
    }
};
ajax.open('GET', url, true);
ajax.send(null);
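The exact response format of the progress endpoint is not shown in the article; a plausible shape consistent with the fields the client reads above (result.file.size and result.file.finished) might look like this, where the fileid field is an assumption:

```json
{
  "file": {
    "fileid": "a1b2c3d4",
    "size": 10485760,
    "finished": false
  }
}
```

Here size is the number of bytes the server has already written to disk, which the client uses as the resume offset.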
Uploading File Chunks
The upload_file function slices the file from the given offset, builds a FormData object, and sends it via XMLHttpRequest while updating a progress bar, bitrate, and finished size.
function upload_file(fileObj, start_offset, fileid) {
    if (start_offset >= fileObj.size) return false; // nothing left to upload
    var totalFilesize = fileObj.size;
    var xhr = new XMLHttpRequest();
    var formData = new FormData();
    var uploadProgress = function (evt) {
        if (evt.lengthComputable) {
            // evt.loaded counts only this request; add the resumed offset.
            var uploadedSize = evt.loaded + start_offset;
            var percentComplete = Math.round(uploadedSize * 100 / totalFilesize);
            // update UI elements here
        }
    };
    xhr.upload.addEventListener('progress', uploadProgress, false);
    // Slice from the resume offset to the end of the file.
    var blob = fileObj.slice(start_offset, totalFilesize);
    var fileOfBlob = new File([blob], fileObj.name);
    formData.append('filename', fileObj.name);
    formData.append('fileid', fileid);
    formData.append('file', fileOfBlob);
    xhr.open('POST', upload_file_url);
    xhr.send(formData);
}
The UI displays upload percentage, bitrate (Kbps/Mbps), and the amount already uploaded, allowing the user to resume after network interruptions.
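The article does not show how the bitrate is computed. One way, sketched below, is to derive it from the bytes reported by progress events and the elapsed time; the helper name and thresholds are assumptions, not from the original code:

```javascript
// Hypothetical helper: format an upload rate from bytes sent in this
// request and the elapsed time in milliseconds since the request began.
function formatBitrate(bytes, elapsedMs) {
    if (elapsedMs <= 0) return '0 Kbps';
    var bitsPerSecond = (bytes * 8) / (elapsedMs / 1000);
    if (bitsPerSecond >= 1000 * 1000) {
        return (bitsPerSecond / (1000 * 1000)).toFixed(2) + ' Mbps';
    }
    return (bitsPerSecond / 1000).toFixed(2) + ' Kbps';
}
```

Inside the uploadProgress handler above, one would record a start timestamp before calling xhr.send and then display formatBitrate(evt.loaded, Date.now() - startTime) alongside the percentage and finished size.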
Additional Notes
The author mentions that splitting files into fixed‑size chunks (e.g., 4 MB) works for small files but becomes inefficient for hundreds of megabytes or gigabytes due to long merge times. Using a hash‑based resumable approach avoids this problem.
For testing, a simple HTML interface and a GitHub‑hosted upload server (https://github.com/wenshui2008/UploadServer) are provided.