Jet and Apache Spark™ are both tools for distributed cluster computing. Each splits a processing job into parallel tasks that are distributed across the cluster, and data can be partitioned and cached in the cluster so that computation is co-located with the data, improving performance.
A Spark cluster has many components and moving parts; Spark is designed for large, multi-tenant clusters.
Jet takes a lightweight approach. Its low operational overhead means you can run multiple Jet clusters to achieve isolation. A Jet cluster can be deployed in a traditional client-server mode, or it can be embedded in and packaged with an application.