Hacker News

Interesting — it sounds like you have a very specific use case. I'm mostly dealing with huge datasets, and Spark is a lifesaver.


My use case is just general computing on huge datasets for ML and statistics.

Spark only serves a few special use cases that usually don't justify its cost, much like MapReduce on Hadoop.

Databricks, though (as distinct from Spark itself), serves no use cases.



