Analyze big data with small RAM

A lot of people use quanteda to analyze social media posts because it is fast and flexible, but they sometimes face a dramatic slowdown due to memory swapping when their machines do not have enough RAM. quanteda requires roughly 5 times as much RAM as the size of the data to analyze, and the ratio can reach 10 times when the data is comprised of many short documents. For example, in the first block of code, the original texts (txt) take only 286 MB in memory, but the tokens object (toks) is 987.8 MB and the document-feature matrix (mt) is 1,411.8 MB (measured by object.size()):

require(quanteda)
txt <- readLines('tweets.txt') # 286 MB
length(txt) # 3000000

toks <- tokens(txt) # 987.8 MB
mt <- dfm(toks) # 1411.8 MB

I recommend that users install as much RAM as possible in their machines, but there is a way to analyze big data with small RAM. You cannot avoid having the large document-feature matrix, because it is what you need for the analysis, but you can skip the tokens object, as it is only an intermediate here.
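
If you have already created the tokens object as in the first block, you can also drop it and let the garbage collector reclaim its memory before going further (a minimal sketch using base R):

rm(toks) # drop the intermediate tokens object from the first block
gc()     # let R's garbage collector release the memory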

In the second block of code, I split the data (txt) into chunks of 10,000 documents and pass them to dfm() chunk by chunk to avoid creating a large tokens object. dfm() still creates a tokens object internally, but its size is only around 5 MB for that number of documents. The output of dfm() is then appended to mt using rbind() so that mt contains all the documents in the end. gc() asks R’s garbage collector to delete unused objects and release memory.

index <- seq(length(txt))
batch <- split(index, ceiling(index / 10000)) # each batch has 10000 indices

for (i in batch) {
    cat("Constructing dfm from", i[1], "\n")
    if (i[1] > 1) {
        mt <- rbind(mt, dfm(txt[i])) # append this chunk's dfm to mt
    } else {
        mt <- dfm(txt[i])            # the first chunk initializes mt
    }
    gc() # release memory used by the chunk's intermediate objects
}
ndoc(mt) # 3000000
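
You can check how much memory the final objects take with object.size(), as in the first block (the exact figures will depend on your data):

format(object.size(txt), units = "MB") # size of the raw texts
format(object.size(mt), units = "MB")  # size of the chunked document-feature matrix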

You might wonder why the size of a chunk is 10,000 and not 20,000 or 50,000. The reason is that quanteda performs tokenization iteratively in batches of 10,000 documents, so 10,000 is the largest chunk size that does not trigger another internal loop.
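
If you do this repeatedly, the loop can be wrapped in a small helper function. The sketch below (dfm_chunked() is a hypothetical name, not a quanteda function) takes the chunk size as an argument, defaulting to 10,000 for the reason above; tokenization is done explicitly with tokens(), which is what dfm() does internally:

# Hypothetical helper: builds a dfm chunk by chunk to avoid a large tokens object
dfm_chunked <- function(x, size = 10000) {
    index <- seq(length(x))
    batch <- split(index, ceiling(index / size)) # each batch has at most `size` indices
    mt <- NULL
    for (i in batch) {
        mt_chunk <- dfm(tokens(x[i])) # tokens object exists only for this chunk
        if (is.null(mt)) {
            mt <- mt_chunk            # the first chunk initializes mt
        } else {
            mt <- rbind(mt, mt_chunk) # append subsequent chunks
        }
        rm(mt_chunk)
        gc() # release the chunk's intermediate objects
    }
    return(mt)
}

mt <- dfm_chunked(txt)
ndoc(mt) # 3000000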
