Tiny bit about AWS EMR on big data
Dawid Laszuk · 4 min, 729 words
One of the recent projects I've worked on involved processing billions of rows stored in AWS S3, terabytes of data in total. It was the biggest dataset I'd worked with so far and, even though I don't like the term, it broke through the big data barrier. Just handling the data with popular toolkits, such as scikit-learn or MXNet, created so many problems that it was easier to build our own solutions. But the biggest surprises came from the places I least expected.