Write to a Collection

The Macrometa Collections Databricks Connector integrates Apache Spark with Macrometa collections, letting you write data from a Spark DataFrame to a Macrometa collection.
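
The steps below assume an active SparkSession named spark and a DataFrame named modifiedDF that holds the data you want to write. The following is a minimal sketch of that setup; the application name, sample rows, and column names are illustrative only, and on Databricks the spark session is already created for you.

    import org.apache.spark.sql.{SaveMode, SparkSession}

    // On Databricks, a SparkSession named `spark` already exists;
    // building one explicitly is only needed when running elsewhere.
    val spark = SparkSession.builder()
      .appName("macrometa-collection-write")
      .getOrCreate()

    // Illustrative sample data; replace with your own transformations.
    import spark.implicits._
    val modifiedDF = Seq(
      ("1", "Alice", 29),
      ("2", "Bob", 35)
    ).toDF("id", "name", "age")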

  1. Set up your target options:

    val targetOptions = Map(
      "regionUrl" -> "<REGION_URL>",
      "apiKey" -> "apikey <API_KEY>",
      "fabric" -> "<FABRIC>",
      "collection" -> "<COLLECTION>",
      "batchSize" -> "<BATCH_SIZE>",
      "primaryKey" -> "<PRIMARY_KEY>"
    )
  2. Write to the Macrometa collection:

    modifiedDF.write
      .format("com.macrometa.spark.collection.MacrometaTableProvider")
      .options(targetOptions)
      .mode(SaveMode.Append)
      .save()
  3. Close the SparkSession:

    spark.close()
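
If the write can fail partway through, you can combine steps 2 and 3 in a try/finally block so the session is closed whether or not the write succeeds. This is a minimal sketch reusing the targetOptions and modifiedDF defined above.

    try {
      modifiedDF.write
        .format("com.macrometa.spark.collection.MacrometaTableProvider")
        .options(targetOptions)
        .mode(SaveMode.Append)
        .save()
    } finally {
      // Always release the session, even if the write throws.
      spark.close()
    }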