
Changelog History

  • v0.4 Changes


    • The project compiles with both Scala 2.11.12 and 2.12.12
    • ⚡️ Updated Apache Spark to 2.4.6
    • ⚡️ Updated the spark-xml library to 0.10.0
    • ✂ Removed the com.databricks:spark-avro dependency, as avro support is now built into Apache Spark
    • ✂ Removed the shadow org.apache.spark.Logging class, which is replaced by the org.tupol.spark.Logging knock-off


    • ➕ Added SparkFun, a convenience wrapper around SparkApp that makes the code even more concise
    • ➕ Added FormatType.Custom, so arbitrary format types are accepted; not every format will work, of course, but other formats such as delta can now be configured and used
    • ➕ Added GenericSourceConfiguration (replacing the old private BasicConfiguration) and GenericDataSource
    • ➕ Added GenericSinkConfiguration, GenericDataSink and GenericDataAwareSink
    • ✂ Removed the short "avro" format, as it will be included in Spark 2.4
    • ➕ Added format validation to FileSinkConfiguration
    • ➕ Added two new documentation pages


    • ➕ Added the StreamingConfiguration marker trait
    • ➕ Added GenericStreamDataSource, FileStreamDataSource and KafkaStreamDataSource
    • ➕ Added GenericStreamDataSink, FileStreamDataSink and KafkaStreamDataSink
    • ➕ Added FormatAwareStreamingSourceConfiguration and FormatAwareStreamingSinkConfiguration
    • Extracted TypesafeConfigBuilder
    • API Changes: Added a new type parameter to the DataSink that describes the type of the output
    • 👌 Improved unit test coverage
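
The v0.4 API change above (a second type parameter on DataSink describing the output of a write) can be sketched roughly as follows. All names and types here are simplified stand-ins (Seq[String] instead of DataFrame, invented configuration classes), not the actual spark-utils signatures:

```scala
// Hedged sketch: DataSink gains a second type parameter describing
// what write() produces.
trait DataSink[Config, WriteOut] {
  def configuration: Config
  def write(data: Seq[String]): WriteOut
}

// A batch-style sink can return the written data itself...
final case class ListSinkConfiguration(name: String)
final class ListDataSink(val configuration: ListSinkConfiguration)
    extends DataSink[ListSinkConfiguration, Seq[String]] {
  def write(data: Seq[String]): Seq[String] = data
}

// ...while a streaming-style sink can return a handle to the running
// output (in Spark that would be a StreamingQuery); a String token here.
final case class StreamSinkConfiguration(name: String)
final class StreamDataSink(val configuration: StreamSinkConfiguration)
    extends DataSink[StreamSinkConfiguration, String] {
  def write(data: Seq[String]): String = s"query-${configuration.name}"
}
```

A second type parameter of this shape is what lets one sink abstraction cover both batch writes (returning data) and streaming writes (returning a query handle), which lines up with the streaming sinks added in the same release.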
  • v0.3 Changes


    • ➕ Added support for bucketing in data sinks
    • 👌 Improved the community resources


    • ➕ Added configuration variable substitution support


    • Split SparkRunnable into SparkRunnable and SparkApp
    • Changed the SparkRunnable API: run() now returns Result instead of Try[Result]
    • Changed the SparkApp API: buildConfig() was renamed to createContext() and now returns Context instead of Try[Context]
    • Changed the DataSource API: read() now returns DataFrame instead of Try[DataFrame]
    • Changed the DataSink API: write() now returns DataFrame instead of Try[DataFrame]
    • Small documentation improvements
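
The Try-related changes in v0.3 follow one pattern: user code returns the plain value, and the surrounding framework captures failures. A minimal sketch of that pattern, using simplified names (RunnableSketch and runSafely are hypothetical, not the actual spark-utils traits):

```scala
import scala.util.Try

// Hedged sketch: run() now returns the plain Result; the framework-side
// wrapper (hypothetical name) captures exceptions into a Try instead.
trait RunnableSketch[Result] {
  def run(): Result                                 // was: def run(): Try[Result]
  final def runSafely(): Try[Result] = Try(run())   // failures still end up in a Failure
}

object RunnableDemo {
  val ok   = new RunnableSketch[Int] { def run(): Int = 42 }
  val boom = new RunnableSketch[Int] { def run(): Int = sys.error("job failed") }
}
```

The effect of the change is that application code no longer needs to wrap its own logic in Try, while failure handling stays centralized in one place.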
  • v0.2 Changes


    • Added DataSource and DataSink IO frameworks
    • Added FileDataSource and FileDataSink IO frameworks
    • Added JdbcDataSource and JdbcDataSink IO frameworks
    • Moved all useful implicit conversions into org.tupol.spark.implicits
    • Added testing utilities under org.tupol.spark.testing
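
As a rough illustration of the source/sink shape introduced in v0.2: a typed configuration class paired with a source that reads according to it. Everything below is a simplified stand-in (string parsing instead of DataFrame IO, invented class names), not the library's actual API:

```scala
// Hedged sketch of the config-driven source pattern: the configuration
// is a plain case class, and the source reads according to it.
final case class CsvSourceConfiguration(separator: String)

trait DataSourceSketch[Config] {
  def configuration: Config
  def read(raw: String): Seq[Seq[String]]  // stand-in for read(): DataFrame
}

final class CsvDataSource(val configuration: CsvSourceConfiguration)
    extends DataSourceSketch[CsvSourceConfiguration] {
  def read(raw: String): Seq[Seq[String]] =
    raw.linesIterator.toSeq.map(_.split(configuration.separator).toSeq)
}
```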