Google Cloud Functions makes it easy to build serverless Python programs. This post will show you how you can use the Google Secret Manager to safe...
Thank you for the well-structured content.
I faced the same issues related to unit tests as Sergio Sánchez Ramírez, but I am also getting a value of 'None' at this line:
project_id = os.environ["GCP_PROJECT"]
I am not really sure what the reason is.
Are you saying you're getting this in your unit test? You'll need to monkeypatch project_id as well, as GCP_PROJECT won't exist as an environment variable in your testing environment.

No, this is in the context of Google Cloud Build. I have a trigger for my master branch, and once a new commit is pushed a build gets fired. I just did:
project_id = os.environ["GCP_PROJECT"]
print(project_id)
to see what the result is. And it returns 'None' in the cloud execution environment.
Hmm, not sure I totally understand your setup. Why is your build executing your function?
If you could include any more details like your Cloud Build configuration I might be able to help.
Otherwise, an alternative would be checking whether this variable is set or not, and not continuing if it isn't:
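A minimal sketch of that check, assuming the module-level pattern from the post:

```python
import os

# Outside the Cloud Functions runtime (e.g. in a CI container),
# GCP_PROJECT may be unset; fail fast instead of continuing.
project_id = os.environ.get("GCP_PROJECT")
if project_id is None:
    raise RuntimeError("GCP_PROJECT is not set; refusing to continue")
```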
I am using Google's Cloud Build (cloud.google.com/cloud-build) as a CI/CD tool for my cloud function written in Python (cloud.google.com/functions/docs/ca...)
So, I have a trigger defined on Cloud Build linked to my master branch. The .yaml file in my project looks like this: dev-to-uploads.s3.amazonaws.com/i/...
And, when executed, the build step creates a new container, and I am not sure whether os.environ.get("GCP_PROJECT") is available inside of it.
Which step is failing here, the test step or the deploy step?
Can you include the test that is testing the function in question?
The step which fails is the one which executes pytest: dev-to-uploads.s3.amazonaws.com/i/...
I also tested without this step: the function gets successfully deployed and the project ID is available in a "production" situation. Maybe I will have to mock/stub the code for my tests.
Yes, so this is happening in your tests? You'll need to monkeypatch project_id like I mentioned in my original reply. If you can include the test that's failing, I can try to show you how to do that.

Also, it's a lot easier for me to help if you share actual text and not screenshots!
My main.py looks like this:
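In outline it follows the module-level pattern from the post (a sketch with placeholder names, using the v1.x client API):

```python
import os
from google.cloud import secretmanager

# Module-level setup: runs once, when the function instance starts.
client = secretmanager.SecretManagerServiceClient()
project_id = os.environ["GCP_PROJECT"]
secret_name = "my-secret"  # placeholder
resource_name = f"projects/{project_id}/secrets/{secret_name}/versions/latest"
# v1.x-style call; newer clients take request={"name": resource_name}
response = client.access_secret_version(resource_name)
secret = response.payload.data.decode("UTF-8")

def secret_hello(request):
    return secret
```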
and if I deploy it like this on GCP it works as expected. Google Cloud Build builds the function and deploys it. The project ID and, respectively, the project secret can be accessed. But when I uncomment my test step in the .yaml and it gets executed on Google Cloud Build, I start getting the error. As you say, I need to mock it somehow. This is what my current test looks like:
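Roughly like this (a sketch; the import itself is what triggers the failure):

```python
# main_test.py -- importing main runs its module-level code, which
# reads GCP_PROJECT and calls Secret Manager immediately
import main

def test_secret_hello():
    assert main.secret_hello(None)
```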
OK, so your test should monkeypatch the environment like this:
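(A sketch using pytest's built-in monkeypatch fixture; names assumed from the thread:)

```python
# main_test.py
def test_secret_hello(monkeypatch):
    # Patch the environment *before* importing main, since main.py
    # reads GCP_PROJECT at import time.
    monkeypatch.setenv("GCP_PROJECT", "test-project")

    import main  # the module-level os.environ lookup now succeeds

    assert main.secret_hello(None)
```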
You'll probably need to monkeypatch secretmanager.SecretManagerServiceClient as well.

Hi Dustin, thanks for the post, it's awesome! Just a little fix: the Cloud Shell command needs to use pip3, to match the python3 that is used afterwards.
And I think the resource_name doesn't need the plus sign at the end.
Thanks again!
Cheers.
Thank you, nice catch!
Thanks for the post! It has been very useful!
The only problem I'm facing is how I can test the function locally, or even in my CI pipeline on the repo, as secretmanager.SecretManagerServiceClient() tries to connect to the Secret Manager service as soon as I import my main.py file in my main_test.py file, and I don't have any GCP auth credentials in the environment.

Not sure if there is a way to mock the client without changing the whole structure.
Thanks again!
Hi Sergio, I'd advise monkey-patching the SecretManagerServiceClient to something you can use in your tests.

For example, if you use pretend for stubbing, it could be something like:
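(A sketch; the stubbed secret value and test name are placeholders:)

```python
# main_test.py -- stubbing the client with pretend
import pretend

def test_secret_hello(monkeypatch):
    monkeypatch.setenv("GCP_PROJECT", "test-project")

    # Stub the client so importing main.py makes no network call
    version = pretend.stub(payload=pretend.stub(data=b"fake-secret"))
    client = pretend.stub(access_secret_version=lambda *a, **kw: version)
    monkeypatch.setattr(
        "google.cloud.secretmanager.SecretManagerServiceClient",
        lambda: client,
    )

    # Import after patching; main.py does its work at import time.
    # (Use importlib.reload(main) if another test already imported it.)
    import main

    assert main.secret_hello(None) == "fake-secret"
```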
This may be a stupid question, but I was wondering if you could explain more about the part above: "the function is doing this work outside of the Python function itself. This will speed up our function by only accessing the secret once when the Cloud Function is instantiated, and not once for every single request." Does this keep the retrieved secret secure, or is it then the same thing as storing it as an environment variable, if the call to Secret Manager isn't made every time the function runs?
Also, if you're willing and able, can you explain how the Cloud Functions Framework runs code that is included in the main.py file outside of any of the specific functions within it? I'm thinking in terms of Google's pricing structure here, for frequency of calls to functions and duration. Thanks!
Yep, great questions actually. Doing it this way keeps the secret as secure as doing it inline on each request -- either way, the secret will only be stored as a variable in the memory of the execution environment, it's just a matter of whether that variable is scoped to the function or not, which doesn't make a difference here.
With regards to the Functions Framework, it behaves exactly like the Cloud Functions runtime, so you shouldn't see any difference in behavior if you use the framework vs. the vanilla runtime.
With regards to pricing: anything done outside the function itself happens once per instance per cold start, so moving as much execution outside the function itself should reduce the overall compute time across all invocations of your function.
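For example (a sketch; the project and secret names are placeholders, using the v1.x-style client call):

```python
from google.cloud import secretmanager

NAME = "projects/my-project/secrets/my-secret/versions/latest"  # placeholder

# Module scope: runs once per instance, at cold start.
client = secretmanager.SecretManagerServiceClient()
secret = client.access_secret_version(NAME).payload.data.decode("UTF-8")

def fast_handler(request):
    # Reuses the cached value; no Secret Manager call per request.
    return secret

def slow_handler(request):
    # Fetches on every invocation, adding latency and billed compute
    # time to every single request.
    response = client.access_secret_version(NAME)
    return response.payload.data.decode("UTF-8")
```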
Thanks Dustin, this helps clarify things a lot more. I'm having trouble translating this to node.js though (which is what I need to use, due to other APIs we use having only js helper libraries), given the async structure of js. So I'm wondering whether this kind of global scoping, where certain code runs only at instance cold start, is even possible with js. If you have any info on that please let me know. Thanks!
Hey Anthony, one of my colleagues just published a similar guide for Node, hopefully this helps! dev.to/googlecloud/serverless-myst...
Sweet! Wow, great timing hahaha. You weren't kidding with the just published part (today). This totally helped clarify the way things should be setup in the node environment. Thanks for sharing and taking the time to comment back!
Thanks Dustin, very helpful article, I was able to set it up following your steps, it works!
I was just wondering if there is a way to share the same code for accessing the secret between a few different Google Cloud Functions. It just does not feel right to copy-paste the secret-related code into each function. I was trying to find the answer in the Google documentation, but so far it looks like there is no easy way.
This is fantastic, thank you v. much Dustin.
Not sure if others will run into the same issue, but I had to explicitly grant permissions to enable KMS.decrypt for the service account email used by my cloud functions.
Harnit Singh,
We are in the process of implementing a new Cloud Function. I am very new to this; can you please help me with how we can use KMS effectively in a Cloud Function? Our setup is a mobile application sending HTTP requests and getting responses back.
Hi Dustin, Great article!
There's one thing I didn't understand - why can't I commit the .env.yaml file to the repository?
The secret is encrypted, and only if you have access to the encryption key can you decrypt it.
Assuming you don't have access to the encryption key, what is the risk here?
Because the benefit of committing it to the repo is that you have full, ready-to-deploy code on your master at any given point.
I understand that exposing the encrypted secret is some kind of a threat, but I guess that if your encryption key was compromised, you have bigger problems...
You're right, it can be included, as long as you're sure all secrets are properly encrypted. (This is also what CI services like Travis do: docs.travis-ci.com/user/encryption...)
These kinds of long and probably error-prone procedures are exactly what Google should make simple, fast, and easy.
Back to Netlify for me.