cranium for Node.js

The amount of data you need to build a good classifier grows with the number of features you have, so out-of-memory errors become a problem when dealing with thousands of features. For example, Weka fails to perform logistic regression with more than a couple of thousand features on a 5 MB dataset. Cranium never assumes that your instances fit in memory, so you can use it on terabytes of data.

Cranium works with Node streams, so you have a lot of flexibility with your input. Using streams sacrifices speed for memory efficiency: Cranium uses a constant amount of memory, typically below 100 MB, but the speed penalty is significant, and it runs about 500x slower than LearnKit. If your dataset fits in memory, Cranium is probably not the right tool for you.
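To make the streaming model concrete, here is a minimal sketch of the pattern using only Node's built-in fs and readline modules. The classifier call is left as a commented-out placeholder (addExample is a hypothetical name, not Cranium's documented API); it only marks where per-instance training would plug in.

```js
'use strict';
const fs = require('fs');
const readline = require('readline');

// Stream a large CSV of training instances line by line, so memory use
// stays constant regardless of file size.
async function streamInstances(path) {
  const rl = readline.createInterface({
    input: fs.createReadStream(path),
    crlfDelay: Infinity, // treat \r\n as a single line break
  });

  let count = 0;
  for await (const line of rl) {
    const fields = line.split(',');
    const label = fields[0];
    const features = fields.slice(1).map(Number);

    // Hypothetical call; the real Cranium method name may differ:
    // classifier.addExample(features, label);
    count += 1;
  }
  console.log(`streamed ${count} instances`);
}

streamInstances('train.csv').catch(console.error);
```

Because each instance is parsed and then discarded before the next one is read, memory use is bounded by the size of a single line rather than the size of the file.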
