FIREFLY-53: Saving a table with large number of columns results in HTTP 400 (Bad Request) #829
Conversation
This all looks really good except for my concern about holding large data in memory in the download process. I think you need to do a large build and then download a 1 GB file before you merge.
const a = document.createElement('a');
a.href = window.URL.createObjectURL(blob);
a.download = filename;
a.click();
downloadBlob is much cleaner. What is interesting is that you are not having to append the anchor to the body to make the process work. Effectively this is done without affecting the DOM. This is nice.
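For reference, a minimal sketch of what such a helper can look like (the name downloadBlob comes from the diff above; the body here is illustrative, and the URL.revokeObjectURL cleanup is an assumption that may or may not be in the actual implementation):

// Illustrative sketch only; the real downloadBlob in this PR may differ.
function downloadBlob(blob, filename) {
    const url = window.URL.createObjectURL(blob);
    const a = document.createElement('a');
    a.href = url;
    a.download = filename;
    a.click();                                              // works without appending the anchor to the DOM
    setTimeout(() => window.URL.revokeObjectURL(url), 0);   // free the object URL once the click has been handled
}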
const params = Object.fromEntries(
    Object.entries(searchObject)
        .map(([k, v]) => [k, (isPlainObject(v) ? JSON.stringify(v) : v)]) );   // convert object back into JSON if needed.
I was just reading about using Object.fromEntries in combination with Object.entries and wondering if we would have a use case for that.
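As a standalone illustration of that pattern, the entries/fromEntries pair is essentially a map over an object's values while keeping its keys (the sample object and values below are made up, and a plain typeof check stands in for isPlainObject to keep it self-contained):

// Hypothetical example: stringify nested objects, leave scalar values alone.
const searchObject = {id: 42, filters: {band: 'K', radius: 0.5}};
const params = Object.fromEntries(
    Object.entries(searchObject)
        .map(([k, v]) => [k, typeof v === 'object' ? JSON.stringify(v) : v]));
// params => {id: 42, filters: '{"band":"K","radius":0.5}'}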
fetchUrl(url, {method: 'post', params})
So you are doing a form-type post without building a DOM form.
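fetchUrl is Firefly's own wrapper, so the details live inside it; in general, though, a form-style POST without a DOM form can be done with plain fetch and URLSearchParams, roughly like this (a sketch under that assumption, not the wrapper's actual code):

// Sketch: form-encoded POST built without creating a <form> element.
async function postAsForm(url, params) {
    const body = new URLSearchParams(params);       // sent as application/x-www-form-urlencoded, like a form submit
    const resp = await fetch(url, {method: 'POST', body});
    if (!resp.ok) throw new Error(`HTTP ${resp.status}`);
    return resp;
}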
    filename = filename || resolveFileName(resp);
    return resp.blob();
})
.then((blob) => {
    downloadBlob(blob, filename);
})
I am a little concerned about this step. Maybe I am wrong but it appears that we don't stream a file out anymore but hold it in memory. This might not work with a large FITS or table file. Have you tested that?
It's not clear to me, based on the documentation of Response.blob(), whether it returns fully populated content from the response or a blob backed by the response's input stream.
I've tried a 1 GB FITS file and a 1-million-row table (800 MB). I don't see any increase in memory usage in the browser, so it looks like it's doing the latter.
I did notice a big performance problem with TableSave when it's a big table. The UI may need to change to account for a slow prep stage as well. I will create a new ticket for this.
@robyww is this acceptable?
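For what it's worth, if blob() ever did turn out to buffer everything, the body can also be consumed incrementally via resp.body; a rough sketch (not part of this PR) that could, for example, drive a progress indicator during a slow save of a large file:

// Sketch: read the response in chunks, e.g. to report progress for large downloads.
async function readWithProgress(resp, onProgress) {
    const total = Number(resp.headers.get('Content-Length')) || 0;
    const reader = resp.body.getReader();
    let received = 0;
    for (;;) {
        const {done, value} = await reader.read();   // value is a Uint8Array chunk
        if (done) break;
        received += value.length;
        onProgress(received, total);                 // hook for a progress UI; chunks are not retained here
    }
}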
That's fine as long as you tested it. I guess it could be backed by the stream.
Ticket created for UI changes when the file is large: https://jira.ipac.caltech.edu/browse/IRSA-2975
https://jira.ipac.caltech.edu/browse/FIREFLY-53
As described in the ticket, when the URL length exceeds its limit, table save will get a Bad Request error. In this PR, the logic is to switch to an HTTP post method to avoid the limit constraint. I've also refactored Image's PNG save to make use of common code. Please include that in your testing as well.
Test: https://irsawebdev9.ipac.caltech.edu/FIREFLY-53_bad_request/firefly/
Things affected: tables and images save/download.
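For context on the approach, the switch can be pictured roughly like this (a hypothetical sketch only; the PR may simply use POST unconditionally, and the 2000-character threshold is an assumed conservative value since real servers and proxies vary):

// Hypothetical sketch: fall back to POST when the encoded GET URL would be too long.
const MAX_URL_LENGTH = 2000;                         // assumed limit, for illustration only
function submitRequest(url, params) {
    const query = new URLSearchParams(params).toString();
    if (`${url}?${query}`.length <= MAX_URL_LENGTH) {
        return fetch(`${url}?${query}`);             // short request: a plain GET still works
    }
    return fetch(url, {method: 'POST', body: new URLSearchParams(params)});   // long request: POST avoids the 400
}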