Google Cloud Functions
Create a Cloud Function with Python.
Pants can create a Google Cloud Function-compatible zip file or directory from your Python code, allowing you to develop your functions in your repository.
Under the hood, Pants uses the PEX project to select the appropriate third-party requirements and first-party sources and lay them out in a zip file or directory, in the format recommended by Google Cloud Functions.
Step 1: Activate the Python Google Cloud Function backend
Add this to your `pants.toml`:

```toml
[GLOBAL]
backend_packages.add = [
  "pants.backend.google_cloud_function.python",
  "pants.backend.python",
]
```
This adds the new `python_google_cloud_function` target, which you can confirm by running `pants help python_google_cloud_function`.
Step 2: Define a `python_google_cloud_function` target
First, write your Cloud Function in a Python file as you normally would for Google Cloud Functions, such as a function `def my_handler_name(event, context)` for event-based functions.
Then, in your BUILD file, make sure that you have a `python_source` or `python_sources` target with the handler file included in the `sources` field. You can use `pants tailor ::` to automate this.
Add a `python_google_cloud_function` target and define the `handler` and `type` fields. The `type` should be either `"event"` or `"http"`. The `handler` has the form `handler_file.py:handler_func`, which Pants will convert into a well-formed entry point. Alternatively, you can set `handler` to the format `path.to.module:handler_func`.
For example:
project/BUILD:

```python
# The default `sources` field will include our handler file.
python_sources(name="lib")

python_google_cloud_function(
    name="cloud_function",
    # Pants will convert this to `project.google_cloud_function_example:example_handler`.
    handler="google_cloud_function_example.py:example_handler",
    type="event",
)
```

project/google_cloud_function_example.py:

```python
def example_handler(event, context):
    print("Hello Google Cloud Function!")
```
Pants will use dependency inference based on the `handler` field, which you can confirm by running `pants dependencies path/to:cloud_function`. You can also manually add to the `dependencies` field.

You can optionally set the `output_path` field to change the generated zip file's path.
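For instance, a minimal sketch combining both fields; the dependency address `3rdparty/python:requests` and the output path are illustrative values, not part of the example project above:

```python
python_google_cloud_function(
    name="cloud_function",
    handler="google_cloud_function_example.py:example_handler",
    type="event",
    # Manually declared dependency, in addition to anything Pants infers:
    dependencies=["3rdparty/python:requests"],
    # Write the package to this path under `dist/` instead of the default location:
    output_path="functions/cloud_function.zip",
)
```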
Use `layout` to determine whether to build a `.zip` file or a directory
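A minimal sketch of the directory layout, assuming the `layout` field accepts `"flat"` for a directory (as used in Step 4 below) and that the zip layout is the default:

```python
python_google_cloud_function(
    name="cloud_function",
    handler="google_cloud_function_example.py:example_handler",
    type="event",
    # Build a directory instead of a `.zip` file (omit to get the default zip output):
    layout="flat",
)
```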
`resource` instead of `file`

`file` / `files` targets will not be included in the built Cloud Function, because filesystem APIs like `open()` would not load them as expected. Instead, use the `resource` / `resources` targets. See Assets and archives for further explanation.
Specifying a runtime explicitly
When building a Cloud Function artifact, Pants and the underlying Pex tool need to know details about the target runtime so they can choose appropriate artifacts for third-party dependencies that have native code. These details can be inferred or provided in three ways, from highest precedence to lowest:
1. An explicit value for the `complete_platforms` field. The "complete platforms" are the underlying source of truth.

   BUILD:

   ```python
   file(name="gcf-platform", source="gcf-platform.json")

   python_google_cloud_function(
       name="cloud_function",
       handler="google_cloud_function_example.py:example_handler",
       type="event",
       # Explicit complete platforms:
       complete_platforms=[":gcf-platform"],
   )
   ```

   If needed, this file can be generated for a specific Cloud Function runtime using the function below:

   gcf-complete-platform-generator.py:

   ```python
   import subprocess
   import json

   import functions_framework


   @functions_framework.http
   def generate_pex_complete_platforms(request):
       subprocess.run(
           "python -m pip install --target=/tmp/pex pex",
           shell=True,
           check=True,
       )
       result = subprocess.run(
           "PYTHONPATH=/tmp/pex /tmp/pex/bin/pex3 interpreter inspect --markers --tags",
           shell=True,
           capture_output=True,
           text=True,
       )
       return result.stdout
   ```

   If you run this function in the Cloud Function testing environment, it will print out a formatted JSON object to the console. You can then copy this JSON object and add it to a file named `gcf-platform.json`.
2. An explicit value for the `runtime` field: Pants uses this to pick an appropriate "complete platforms" value from options that Pants has pre-packaged. These are static exports from docker images provided by GCP, relying on the environment being relatively stable. (If Pants doesn't have an appropriate "complete platforms" default built-in, you will be prompted to use option 1 above.)

   BUILD:

   ```python
   python_google_cloud_function(
       name="cloud_function",
       handler="google_cloud_function_example.py:example_handler",
       type="event",
       # Explicit runtime, `complete_platforms` taken from Pants' built-in defaults:
       runtime="python312",
   )
   ```
3. Inferred from the relevant interpreter constraints: the interpreter constraints may unambiguously imply a value for the `runtime` and thus `complete_platforms` fields. For example, `interpreter_constraints = ["==3.12.*"]` implies `runtime="python312"`. This only works with interpreter constraints that cover all patch versions of a given minor release series: `>=3.11,<3.13` is too wide (it covers both 3.11 and 3.12), while `==3.12.0` is too specific (GCF's `python312` runtime may not use that exact patch version).

   pants.toml:

   ```toml
   [python]
   interpreter_constraints = ["==3.12.*"]
   ```

   project/BUILD:

   ```python
   python_google_cloud_function(
       name="cloud_function",
       handler="google_cloud_function_example.py:example_handler",
       type="event",
       # `runtime` inferred and `complete_platforms` from built-in defaults
   )
   ```
This guide is written using the last option, but you can add `runtime` or `complete_platforms` to any of the examples using the `python_google_cloud_function` target.
Step 3: Run package
Now run `pants package` on your `python_google_cloud_function` target to create a zipped file.
For example:
```
$ pants package project/:cloud_function
Wrote dist/project/cloud_function.zip
  Handler: handler
```
Cloud Functions must run on Linux, so Pants tells PEX and pip to build for Linux when resolving your third-party dependencies. This means that you can only use pre-built wheels (bdists). If your project requires any source distributions (sdists) that must be built locally, PEX and pip will fail to run.

If this happens, you must either change your dependencies to only use dependencies with pre-built wheels or find a Linux environment to run `pants package`.
If a build fails with an error like `Encountered collisions populating ... from PEX at faas_repository.pex:`, listing one or more files with different `sha1` hashes, this likely means your dependencies package files in unexpected locations, outside their "scoped" directory (for instance, a package `example-pkg` typically only includes files within the `example_pkg/` and `example_pkg-*.dist-info/` directories). When multiple dependencies do this, those files can have exactly matching file paths but different contents, and so it is impossible to create a GCF artifact: which of the files should be installed and which should be ignored? Resolving this requires human intervention to understand whether any of those files are important, and hence PEX emits an error rather than making an (arbitrary) choice that may result in confusing and/or broken behaviour at runtime.

Most commonly this seems to happen with metadata like a README or LICENSE file, or test files (in a `tests/` subdirectory), which are likely not important at runtime. In these cases, the collision can be worked around by adding a `pex3_venv_create_extra_args=["--collisions-ok"]` field to the `python_google_cloud_function` target, as shown below.
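A minimal sketch of that workaround, reusing the example target from Step 2:

```python
python_google_cloud_function(
    name="cloud_function",
    handler="google_cloud_function_example.py:example_handler",
    type="event",
    # Tell PEX to tolerate colliding files instead of failing the build:
    pex3_venv_create_extra_args=["--collisions-ok"],
)
```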
A better solution is to work with the dependencies to stop them from packaging files outside their scoped directories.
Step 4: Upload to Google Cloud
You can use any of the various Google Cloud methods to upload your zip file or directory, such as the Google Cloud console or the Google Cloud CLI.
You must specify the `--entry-point` as `handler`. This is a re-export of the function referred to by the `handler` field of the target.
For example, if using `layout="flat"`:

```
gcloud functions deploy --source=dist/project/cloud_function --entry-point=handler --trigger-topic=<TOPIC> --runtime=python312 <FUNCTION_NAME>
```
Advanced: Using PEX directly
In the rare case where you need access to PEX features, such as dynamic selection of dependencies, a PEX file created by `pex_binary` can be used as a Google Cloud Function package directly. A PEX file is a carefully constructed zip file, and can be understood natively by Google Cloud Functions. Note: using `pex_binary` results in larger packages and slower cold starts and is likely to be less convenient than using `python_google_cloud_function`.
The handler of a `pex_binary` is not re-exported at the fixed `main.handler` path, and the Google Cloud Function handler must be configured as the `__pex__` pseudo-package followed by the handler's normal module path (for instance, if the handler is in `some/module/path.py` within a source root, then use `__pex__.some.module.path`). This may need to be configured via `GOOGLE_FUNCTION_SOURCE`. The `__pex__` pseudo-package ensures dependencies are initialized before running any of your code.
For example:
project/BUILD:

```python
python_sources()

pex_binary(
    name="gcf",
    entry_point="gcf_example.py",
    # Specify an appropriate platform for the targeted GCF runtime:
    complete_platforms=["path/to:platform-json-target"],
)
```

project/gcf_example.py:

```python
def example_handler(event, context):
    print("Hello GCF!")
```
Then, use `pants package project:gcf`, and upload the resulting `project/gcf.pex` to Google Cloud Functions. You will need to specify the handler as `example_handler` and set `GOOGLE_FUNCTION_SOURCE=__pex__.gcf_example` (assuming `project` is a source root).
Migrating from Pants 2.16 and earlier
Pants implemented a new way to package Google Cloud Functions in 2.17, which became the only option in 2.19, resulting in smaller packages and faster cold starts. This involves some changes:
- In Pants 2.16 and earlier, Pants used the Lambdex project. First, Pants would convert your code into a Pex file, and then use Lambdex to adapt this to be better understood by GCF by adding a shim handler. This shim handler triggers the Pex initialization, choosing and unzipping dependencies, on each cold start.
- In Pants 2.17, the use of Lambdex was deprecated, in favour of choosing the appropriate dependencies ahead of time, as described above, without needing to do this on each cold start. This results in a zip file laid out in the format recommended by GCF, and includes a re-export of the handler.
- In Pants 2.18, the new behaviour became the default.
- In Pants 2.19 and later, the old Lambdex behaviour has been entirely removed.
If your code can be packaged without warnings using Pants 2.18, no change is required when upgrading to Pants 2.19 (except removing the `[lambdex]` section in `pants.toml` if it still remains). If not, follow the instructions in those warnings to fully upgrade to Pants 2.18 first, and upgrade to Pants 2.19 after that.
If you encounter a bug with the new behaviour, please let us know. If you require advanced PEX features, switch to using `pex_binary` directly.