WHAT IS APACHE SPARK
Apache Spark is an open-source cluster-computing framework that can run on top of Hadoop. It provides an interface for programming entire clusters with implicit data parallelism and fault tolerance.
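The idea of implicit data parallelism can be illustrated with a classic word count. The sketch below is pure Python and runs without a Spark installation; the partition/merge structure mirrors what Spark's RDD API (flatMap, map, reduceByKey) would distribute across a cluster, but the data, names, and sequential execution here are illustrative assumptions.

```python
from collections import Counter
from functools import reduce

# Two "partitions" standing in for data split across cluster nodes
partitions = [
    "spark makes cluster computing simple".split(),
    "spark gives fault tolerance implicitly".split(),
]

# Map side: count words within each partition independently.
# On a real cluster, each executor would do this step in parallel.
partial_counts = [Counter(words) for words in partitions]

# Reduce side: merge the per-partition results, as reduceByKey would.
total = reduce(lambda a, b: a + b, partial_counts)

print(total["spark"])  # -> 2 (one occurrence per partition)
```

The key point is that the programmer only writes the per-record logic; the framework decides how partitions are placed and recomputed on failure.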
APACHE SPARK ARCHITECTURE
This design lets the same code be used for both batch and streaming analytics, which makes it straightforward to implement the lambda architecture. The convenience comes at a cost, however: a latency penalty equal to the duration of each micro-batch.
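The micro-batch latency penalty can be made concrete with a back-of-the-envelope calculation; the interval and processing times below are hypothetical tuning values, not Spark defaults.

```python
# Hypothetical numbers to illustrate the micro-batch latency trade-off
batch_interval = 2.0   # seconds between micro-batches (a tuning choice)
processing_time = 0.5  # seconds to process one micro-batch

# An event arriving just after a batch window opens waits almost a full
# interval before its batch is formed, then waits for processing to finish.
worst_case_latency = batch_interval + processing_time
print(worst_case_latency)  # -> 2.5 seconds
```

Shrinking the batch interval reduces this latency but increases per-batch scheduling overhead, which is the trade-off the text refers to.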