Mobile Apps and Chunk Uploads

May 27, 2014 - News Articles

Higher camera resolutions, retina pixel densities, and broader bandwidth have given us a web with greater, more beautiful detail. And the HTML5 canvas element now gives us the ability to manipulate imagery armed with nothing more than JavaScript. But, as with all growth, there are growing pains, and this one can be found in

Error Code 413: Request Entity Too Large.

Mozilla’s error code detail says: “Request entity is larger than limits defined by server; the server might close the connection or return a Retry-After header field.”

https://developer.mozilla.org/en-US/docs/Web/HTTP/Response_codes

Essentially, you’re transmitting the decoded pixel information at 4 bytes per pixel (one byte each for red, green, blue, and alpha). A single byte is hardly going to bust the server limits, but even for a 1-megapixel photo (1200×900), this amounts to 4,320,000 bytes. Most phone cameras capture at much higher resolutions, so you can see how this issue might suddenly become a real possibility in your own web or mobile application.
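To get a feel for the numbers before you hit the server, you can estimate the raw RGBA payload of a canvas up front. A quick back-of-the-envelope sketch (`rawPixelBytes` is just an illustrative helper name, not part of any API):

```javascript
// Estimate the raw pixel payload of a canvas: 4 bytes per pixel
// (one each for red, green, blue, and alpha).
function rawPixelBytes(width, height) {
    return width * height * 4;
}

console.log(rawPixelBytes(1200, 900));  // the 1-megapixel example: 4,320,000 bytes
console.log(rawPixelBytes(3264, 2448)); // a typical 8-megapixel phone photo: ~32 MB
```

The actual bytes on the wire will differ (the data URL is compressed, then base64-encoded), but the raw figure tells you when you are in 413 territory.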

But if you don’t have access to change those server directives, and you’re working with canvas elements and wish to transmit some very high-resolution pixel data (using toDataURL(), for example), you might bump into this grumpy error code often.

With toDataURL(), you can pass an optional argument to prepare the canvas pixel data as a JPEG, which is generally much smaller than the default PNG format. But not all browsers and phones support this enhancement (the Samsung Galaxy SIII, for one), so now you’ve got to plan for this liability.
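One way to plan for it is to check what toDataURL() actually returned: a browser that ignores the JPEG hint silently hands back a PNG data URL, and the header tells you which one you got. A minimal sketch (`isJpegDataURL` is a hypothetical helper):

```javascript
// Returns true only if the browser honored the "image/jpeg" request;
// browsers without JPEG support fall back to a PNG data URL.
function isJpegDataURL(dataURL) {
    return dataURL.indexOf("data:image/jpeg") === 0;
}

// Against a real canvas this might look like:
//   var url = canvas.toDataURL("image/jpeg");
//   if (!isJpegDataURL(url)) { /* expect the larger PNG payload */ }
console.log(isJpegDataURL("data:image/jpeg;base64,/9j/4AAQ")); // true
console.log(isJpegDataURL("data:image/png;base64,iVBORw0K"));  // false
```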

 

Chunk Uploading

The traditional solution for sending any payload without reaching a limit is to break it down into parts and transmit them in sequence. It’s how space stations, carnivals, and even the Statue of Liberty were transported. In our mobile apps, we can do it with jQuery, just like this:

var can = document.getElementById("canvas");
var imagedata = can.toDataURL("image/jpeg"); //this will still be a PNG on some phones/browsers
var hash = "unique-upload-id"; //any unique id here

//break the image up into chunks
function chunkup(imagedata, hash) {
    var chunks = imagedata.match(/.{1,500000}/g); //break the imagedata into manageable parts
    for (var part = 0; part < chunks.length; part++) {
        var islast = (part + 1 >= chunks.length ? 1 : 0);
        chunkupload(part, chunks[part], hash, islast); //send the chunks, one at a time
    }
}

//transmit each, one at a time
function chunkupload(part, chunk, hash, last) {
    var data = {"chunk": chunk, "hash": hash, "part": part, "last": last};
    $.ajax({
        type: "POST",
        url: "http://path/to/your/server/script.php",
        data: data,
        dataType: "json",
        timeout: 60 * 1000,
        async: false //each chunk waits for the previous one to finish
    }).done(function (data) {
        return;
    }).fail(function (XMLHttpRequest, textStatus, errorThrown) {
        alert("Error: " + errorThrown);
    });
}

You could run these requests asynchronously, but then you might be pushing up against another limit: the maximum number of simultaneous requests. So even if it’s slower, it might be worth your while to perform these uploads in sequence instead, which is what async: false does here.
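It’s also worth convincing yourself that the chunking step is lossless: concatenating the parts must reproduce the original data URL exactly, and no part may exceed the chunk size. A quick sanity check along those lines (`roundTrips` is an illustrative helper, using the same regex trick as chunkup):

```javascript
// Split a string into fixed-size chunks (as chunkup does) and verify
// that joining them reproduces the original, with no oversized part.
function roundTrips(imagedata, size) {
    var re = new RegExp(".{1," + size + "}", "g");
    var chunks = imagedata.match(re);
    var ok = chunks.join("") === imagedata;
    for (var i = 0; i < chunks.length; i++) {
        if (chunks[i].length > size) ok = false;
    }
    return ok;
}

var fake = "data:image/jpeg;base64," + new Array(1000).join("abc");
console.log(roundTrips(fake, 500)); // true
```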

On the server side, we’re getting image-data “chunks” fired at us, so our script must be smart enough to store these chunks temporarily for later reassembly. PHP is terrific for reading from and writing to the file system, so here’s an example of how this is done in PHP:

header('Content-type: application/json');
if (!empty($_POST)) {
	if (isset($_POST['chunk']) && isset($_POST['hash']) && isset($_POST['part'])) {
		$hash = (string) trim($_POST['hash']);
		$part = (integer) trim($_POST['part']);
		$last = (boolean) trim($_POST['last']);
		//save the data to temporary sequential files on the filesystem
		$filename = "tmp/chunk/{$hash}.{$part}";
		if ($fp = fopen($filename, "w")) {
			fwrite($fp, (string) trim($_POST['chunk']));
			fclose($fp);
		}
		//if all of the parts have been received, let’s reassemble them in order
		if ($last) {
			$composite = "";
			$chunk = 0;
			while ($chunk <= $part) {
				if (file_exists("tmp/chunk/{$hash}.{$chunk}")) {
					$composite .= trim(implode("", file("tmp/chunk/{$hash}.{$chunk}")));
					//we don’t need the chunk file anymore
					unlink("tmp/chunk/{$hash}.{$chunk}");
				}
				$chunk += 1;
			}
			$filename = "tmp/{$hash}.jpg";
			//strip out the data-URL header and decode the base64 image data
			preg_match("/data:image\/(png|jpeg);base64,(.*)/", $composite, $img);
			if (count($img) == 3) {
				$rawdata = imagecreatefromstring(base64_decode($img[2]));
				imagejpeg($rawdata, $filename);
			}
			//send the good news, your image has been transmitted!
			if (file_exists($filename)) die(json_encode(array("filename" => $hash)));
		}
	}
}
//let AJAX know you got the chunk
die(json_encode(array("1")));
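The header-stripping step is easy to get wrong (note the pattern is an alternation, (png|jpeg), not a character class), so here it is isolated in JavaScript for inspection. `splitDataURL` is a hypothetical helper mirroring the PHP regex:

```javascript
// Split a data URL into its image type and base64 payload,
// mirroring the regex the PHP script uses to strip the header.
function splitDataURL(composite) {
    var m = composite.match(/data:image\/(png|jpeg);base64,(.*)/);
    return m ? { type: m[1], data: m[2] } : null;
}

var parsed = splitDataURL("data:image/jpeg;base64,/9j/4AAQSkZJRg==");
console.log(parsed.type); // "jpeg"
console.log(parsed.data); // "/9j/4AAQSkZJRg=="
```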

That’s all there is to it!

There are countless mods (sanitizing, security, accounts, filetype handlers, etc.) that can be worked into this workflow, but now you can concentrate on your customizations and not have to worry about error code 413.
