Import a PyPI module in a SQL fabric

Hi community, if I want to import a PyPI module in a SQL fabric (to be used in a Script gem), is this possible? If so, how?

I can see that if I have a Spark-based project, I can include PyPI repositories, but not so with SQL (it has to be in GitHub, dbt, or Prophecy)…

Hey Susan!

Good question. At the moment, there isn’t a way to add PyPI dependencies directly within SQL projects. The only supported approach is to install Python dependencies on the Spark cluster that the Script gem runs on.

This is currently possible only for Databricks fabrics. In Databricks fabrics, the Script gem executes in your Databricks environment, so any libraries installed on the cluster are available at runtime. For all other fabric types, the Script gem runs in Prophecy Automate, and there’s no UI yet to manage or install Python dependencies there. (More details are covered in the Script gem documentation.)

If you’re using a Databricks fabric, you can install dependencies directly:

  1. Open Prophecy.

  2. Make sure your Databricks fabric is configured to use a specific Spark cluster (compute) for Script gem execution.

  3. Open Databricks.

  4. Follow Databricks’ instructions to install libraries on that compute.
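Once a library is installed on the cluster, the Script gem can import it like any other Python module. As a small sketch of how you might confirm that at runtime (the package name `mypkg` here is just a placeholder for whatever you installed, not an actual dependency), you can check availability with the standard library before importing:

```python
import importlib.util

def is_library_available(name: str) -> bool:
    """Return True if the named module can be imported in the current runtime."""
    return importlib.util.find_spec(name) is not None

# "mypkg" stands in for whichever PyPI package you installed on the cluster.
if is_library_available("mypkg"):
    print("mypkg is installed on this cluster")
else:
    print("mypkg is missing; install it via the cluster's Libraries tab")

# Standard-library modules resolve the same way.
print(is_library_available("json"))  # stdlib, always importable
```

Failing fast like this inside the Script gem gives a clearer error than a bare `ImportError` when the cluster is missing a dependency.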

Hope this helps!

Katherine