In Prophecy, is it possible to pass runtime parameters, similar to how they are used in standard Databricks jobs?
Within our S3 setup, we’ve established a designated folder for uploading files that require processing. Since this folder contains multiple files, there are instances where we need to process only specific ones. To facilitate this, I’m considering passing the file names as parameters in Prophecy, so that Prophecy would process only the files named in those parameters.
Once you’ve published a pipeline from Prophecy, you can create job parameters in Databricks as shown in the screenshot below, and then access these configs in the pipeline:
You can click “Schema” to the left of the Config button and create configs with the corresponding data types. On the right, there is a drop-down where you can customize these config values for various environments (Default/Dev/Prod, etc.). You can also refer to the following documentation for more info on configs: https://docs.prophecy.io/engineers/configurations/
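As a rough sketch of the use case from the question: if the job parameter holds a comma-separated list of file names, the pipeline can expand it into full S3 paths and read only those files. Note this is a hypothetical illustration, not Prophecy-generated code — the parameter name `file_names`, the bucket path, and the helper function are all assumptions; in a published pipeline the value would come from the generated Config object (or the Databricks job parameter you created above).

```python
def resolve_input_paths(file_names: str, base_path: str = "s3://my-bucket/incoming") -> list:
    """Split a comma-separated job parameter into full S3 paths.

    `file_names` and `base_path` are illustrative names, not a Prophecy API.
    """
    names = [n.strip() for n in file_names.split(",") if n.strip()]
    return [f"{base_path.rstrip('/')}/{n}" for n in names]

# Example: a job run that should process only two of the uploaded files.
paths = resolve_input_paths("orders_2024.csv, refunds_2024.csv")
print(paths)
```

A downstream Spark read (e.g. `spark.read.csv(paths)`) could then load just the files passed in for that run, leaving the rest of the folder untouched.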