I'm publishing a Shiny app written in R to shinyapps.io.
In RStudio, the "memory used" is 4.9 GiB after I run the app and come back to the editor:
When I run `gc()`, this number doesn't change by more than 0.02 GiB.
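For reference, this is all I'm doing (base `gc()` only; the reset call is just something I tried so the "max used" columns are easier to read afterwards):

```r
# Force a garbage collection and print the summary table.
# The "used" column is memory held by R objects (Ncells / Vcells);
# "max used" is the session's high-water mark.
gc(verbose = TRUE)

# Clear the "max used" columns so later growth is easier to spot.
gc(reset = TRUE)
```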
When I run `sum(sapply(ls(), function(x) object.size(get(x))))` I get 62,186,696 bytes, or about 62 MB.
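In case the exact check matters, this is roughly what I run in the console; the `lobstr` call at the end is just an alternative accounting I've seen suggested, not something the app itself uses:

```r
# Per-object sizes in the global environment, largest first
sizes <- sapply(ls(envir = globalenv()),
                function(x) object.size(get(x, envir = globalenv())))
sum(sizes)                            # ~62,186,696 bytes in my case
head(sort(sizes, decreasing = TRUE))  # the few largest objects

# Alternative accounting (counts shared memory differently than object.size):
# install.packages("lobstr")
lobstr::mem_used()
```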
This number includes a list object (class "magick-image") which stores about 33 low-quality images using the magick package; the size of that object is about 15 MB. The GitHub repo storing these images (and nothing else) is 43 MB.
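For what it's worth, this is how I can inspect those images locally. My understanding (which may be wrong) is that magick holds decoded bitmaps outside of what `object.size()` measures, so the 15 MB figure may undercount them; `imgs` below is a stand-in for my actual object:

```r
library(magick)

# `imgs` is a placeholder for my magick-image object holding the ~33 images.
info <- image_info(imgs)   # one row per image: format, width, height, filesize, ...
info

# Rough guess at the decoded in-memory footprint; the 4-bytes-per-pixel
# (RGBA) figure is my assumption, not something magick reports.
sum(info$width * info$height * 4)
```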
When I push this to shinyapps.io, I have to set the "instance size" to 8 GB or the app crashes; the memory use reported by shinyapps.io is over 5 GB. So RStudio and shinyapps.io are reporting comparable memory usage, but it is almost two orders of magnitude more than what is needed to store my data. As the data slowly grows, I'm afraid I will hit the 8 GB limit on my shinyapps.io account.
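For context, I've been setting the instance size in the shinyapps.io dashboard, but I believe the same can be done from R with rsconnect; the app name and the exact size token for the 8 GB tier below are my assumptions:

```r
library(rsconnect)

# configureApp() accepts a `size` argument; "xxxlarge" is my guess at the
# token for the 8 GB tier, and "my-shiny-app" is a placeholder name.
configureApp(appName = "my-shiny-app", size = "xxxlarge")
```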
I'm also using a few packages, notably the tidyverse, and I know these take up memory, but it doesn't seem like they could account for a two-orders-of-magnitude increase in memory used.
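One thing I've considered (not sure it would matter at this scale) is attaching only the parts of the tidyverse the app actually uses rather than the whole meta-package, e.g.:

```r
# Instead of library(tidyverse), attach only what the app actually calls;
# dplyr and ggplot2 here are just examples, not necessarily what my app needs.
library(dplyr)
library(ggplot2)
```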
Question: Am I looking at this the right way? And what else can I check to give me a clue about how to reduce my memory usage?