Ramandeep Kaur gives us several cases when it makes sense not to use Apache Spark:
There are use cases where Spark is the obvious choice. Spark is considered an excellent tool for things like ETL over large datasets, analyzing large sets of data files, machine learning and data science on large datasets, connecting BI/visualization tools, and so on.
But it's no panacea, right? Let's consider the cases where using Spark would be no less than a nightmare.
No tool is perfect at everything. Click through for a few use cases where the Spark experience degrades quickly.
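For context on the kind of workload where Spark does shine, here is a minimal ETL sketch; the input path, column names, and output location are hypothetical and only illustrate the pattern.

```scala
// Minimal illustrative Spark ETL sketch -- paths and columns are hypothetical.
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object EtlSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("etl-sketch")
      .getOrCreate()

    // Extract: read a large set of raw CSV files
    val raw = spark.read
      .option("header", "true")
      .csv("hdfs:///data/raw/events/*.csv")

    // Transform: drop malformed rows and aggregate events per day
    val daily = raw
      .filter(col("event_type").isNotNull)
      .groupBy(to_date(col("event_time")).as("event_date"))
      .agg(count("*").as("event_count"))

    // Load: write Parquet for downstream BI/visualization tools
    daily.write.mode("overwrite").parquet("hdfs:///data/curated/daily_events")

    spark.stop()
  }
}
```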