Good question. At the moment, there isn’t a way to add PyPI dependencies directly within SQL projects. The only supported approach is to install Python dependencies on the Spark cluster that the Script gem runs on.
This is currently possible only for Databricks fabrics. In Databricks fabrics, the Script gem executes in your Databricks environment, so any libraries installed on the cluster are available at runtime. For all other fabric types, the Script gem runs in Prophecy Automate, and there’s no UI yet to manage or install Python dependencies there. (More details are covered in the Script gem documentation.)
If you’re using a Databricks fabric, you can install dependencies directly:

1. Open Prophecy.
2. Make sure your Databricks fabric is configured to use a specific Spark cluster (compute) for Script gem execution.
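Outside of the Prophecy UI, cluster libraries can also be installed programmatically with the Databricks Libraries REST API (`POST /api/2.0/libraries/install`). The sketch below builds the request body and sends it; the workspace host, token, cluster ID, and package names are all placeholders you would replace with your own values — this is an illustrative helper, not part of Prophecy itself.

```python
import json
import urllib.request


def build_install_payload(cluster_id, pypi_packages):
    """Build the JSON body for Databricks' POST /api/2.0/libraries/install."""
    return {
        "cluster_id": cluster_id,
        "libraries": [{"pypi": {"package": pkg}} for pkg in pypi_packages],
    }


def install_libraries(host, token, cluster_id, pypi_packages):
    """Request installation of PyPI packages on a running cluster.

    host, token, and cluster_id are placeholders for your workspace URL,
    a personal access token, and the target cluster's ID.
    """
    payload = build_install_payload(cluster_id, pypi_packages)
    req = urllib.request.Request(
        f"{host}/api/2.0/libraries/install",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status


# Example payload for a hypothetical cluster ID and two packages:
payload = build_install_payload("0123-456789-abcde", ["pandas==2.2.0", "requests"])
print(json.dumps(payload, indent=2))
```

Once the request succeeds and the libraries finish installing, any Script gem running on that cluster can import them. Note that version pins (e.g. `pandas==2.2.0`) behave the same as they would in a `pip install` on the cluster.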