javascript - large JSON could not be handled by browser - Stack Overflow

We have an Apache server that serves a website. The website issues a GET request that runs a C++ program on the server. The program creates a 3D scene and answers the GET request with a JSON document containing the scene. The scene is then rendered in the browser using WebGL.

This works perfectly fine for small scenes. Chrome throws an error when the JSON is larger than ~125 MB; Firefox can handle JSON up to ~260 MB.

I create the GET request using jQuery:

BP2011D1.ServerProxy.prototype.loadMesh = function (requestParameter, callbackOnSuccess, callbackOnError)
{
    // Ask the FastCGI backend for the mesh; jQuery parses the response
    // as JSON before invoking the success callback.
    $.ajax({
        type: "GET",
        url: this.getServerURL() + "/cgi-bin/" + this._treemapDirectory + "/hpi_bp2011_app_fcgi",
        data: requestParameter + "&functionName=getMesh",
        dataType: "json",
        success: callbackOnSuccess.execute,
        error: callbackOnError.execute
    });
};

For large JSON responses, callbackOnError is executed, so the JSON seems to be invalid.

I know that the JSON should be perfectly valid.

I think the browser cannot handle such a big JSON string. It seems to clip some of the characters at the end, so the missing closing brackets make the JSON invalid.

Is there a way to handle this problem? I need to handle JSON documents of up to 800 MB.

asked May 23, 2012 at 15:05 by Jens Ehrlich; edited May 23, 2012 at 15:38
  • 800 MB of JSON? That's rather ludicrous... Try sending it over in multiple smaller chunks. – Marc B Commented May 23, 2012 at 15:06
  • @MarcB Even at large chunks, 800 MB is just too massive for a one-time parse. The browser already chokes for several seconds looping 100k iterations, let alone on 800 MB of data. – Joseph Commented May 23, 2012 at 15:10
  • @Joseph: I'm guessing this is on an internal network, so 800 MB at gigabit speeds isn't too nasty, but it is definitely a HUGE hit on the browser. – Marc B Commented May 23, 2012 at 15:11
  • Our Apache uses gzip, so the JSON is only ~30 MB when transferred over the network. – Jens Ehrlich Commented May 23, 2012 at 15:32

3 Answers

You could try using a more compact format:

http://code.google.com/p/webgl-loader/

You could also roll your own format and download the large parts directly in binary by using binary XHR

http://www.html5rocks.com/en/tutorials/file/xhr2/
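
For the binary route, here is a minimal sketch using XHR2, assuming the server were extended to emit the mesh as raw little-endian Float32 vertex data (the format=binary parameter is invented for illustration):

// Sketch only: assumes a hypothetical binary endpoint that streams the
// mesh as raw little-endian Float32 vertex data instead of JSON.
var xhr = new XMLHttpRequest();
xhr.open("GET", "/cgi-bin/hpi_bp2011_app_fcgi?functionName=getMesh&format=binary", true);
xhr.responseType = "arraybuffer";  // XHR2: receive an ArrayBuffer, not a string

xhr.onload = function () {
    if (xhr.status !== 200) {
        return;
    }
    // No parsing step at all: a typed-array view over the buffer can be
    // handed straight to WebGL via gl.bufferData().
    var vertices = new Float32Array(xhr.response);
    console.log("received " + vertices.length + " floats");
};
xhr.send();

Since the data never exists as one giant JavaScript string, this sidesteps both the string-length limit and the JSON parsing cost entirely.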

You need to identify what creates the problem:

  1. fetching JSON that big
  2. evaluating JSON that big (eval())
  3. asking JavaScript to store this

I assume you fetch the JSON by XHR. In the first two cases you could introduce some pagination, where you add a GET parameter part={0,1,2,3,4,5...}, allowing the browser to fetch a huge JSON in multiple XHR requests (by implementing this server-side). This requires the server to split the data and the browser to merge it:

{a:1, b:5} - split -> {a:1} and {b:5} - merge -> {a:1, b:5}

or:

[1, 5] - split -> [1] and [5] - merge -> [1, 5]

Please understand that while doing this you need to find a good place to split and merge, e.g. in this case:

{small: <1mb object>, huge:<799mb object>}

Or you might decide to just fetch the string, then split and merge it yourself.
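
A minimal client-side sketch of such pagination, assuming a hypothetical part GET parameter, a part count known in advance, and that each part is a JSON array:

// Sketch only: partCount and the "part" parameter are assumptions; the
// server would have to implement the corresponding splitting.
function loadMeshInParts(baseUrl, partCount, onComplete) {
    var parts = new Array(partCount);
    var loaded = 0;
    for (var i = 0; i < partCount; i++) {
        // The closure pins the right index for each asynchronous response.
        (function (index) {
            $.ajax({
                type: "GET",
                url: baseUrl,
                data: "functionName=getMesh&part=" + index,
                dataType: "json",
                success: function (partArray) {
                    parts[index] = partArray;
                    loaded++;
                    if (loaded === partCount) {
                        // merge: [1] and [5] -> [1, 5], in request order
                        onComplete(Array.prototype.concat.apply([], parts));
                    }
                }
            });
        })(i);
    }
}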

Download the data in chunks (batches of several MB) via AJAX. Since you are on a fairly modern browser (it supports WebGL), you can use Web Workers as "threads" to parse each batch in the background and then add the results to a main object for use.

For every chunk:

download -> create worker -> parse JSON in worker -> add to main object

Even if this circumvents the 800 MB JSON parsing issue, I don't know the effects of holding an 800 MB object: it would still be heavy on the browser heap.
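
A minimal sketch of that flow; the file name worker.js and the shape of the main object are assumptions:

// worker.js -- parses one JSON chunk off the main thread.
self.onmessage = function (e) {
    // JSON.parse is available inside workers, so the UI stays responsive.
    self.postMessage(JSON.parse(e.data));
};

// Main thread: hand each downloaded chunk (a JSON string) to a worker.
var mainObject = { chunks: [] };

function parseChunk(jsonString) {
    var worker = new Worker("worker.js");
    worker.onmessage = function (e) {
        mainObject.chunks.push(e.data);  // add the parsed chunk to the main object
        worker.terminate();
    };
    worker.postMessage(jsonString);
}

Note that postMessage copies the parsed result back to the main thread via structured clone, so very large chunks still carry a per-chunk cost; keeping the chunks reasonably small is what makes this workable.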
