Managing the Google Cloud Platform: On deprecation notices and pricing policies
It's all fun and games until Google Cloud Platform forcibly upgrades your application.
This blog post was originally meant to showcase a neat trick to simplify secret handling in GCP, but when I started writing this I learned that they've deprecated the code I wanted to write about.
So, let's talk about GCP's deprecations instead.
“Deprecate: To mark (a component of a software standard) as obsolete to warn against its use in the future so that it may be phased out.” (source)
Blindsided by a platform upgrade
I was first bitten by Google's deprecations in January 2019. My phone rang and a worried customer reported that they were unable to conduct business in their web application. This internal business tool, which they relied on heavily for their daily work, was now dead in the water: whenever they tried to save or edit data it simply threw errors.
I went to the application logs to investigate, and after some sweaty minutes I learned that this Java application had been forcibly upgraded from Java 7 to Java 8! After a quick chat with my manager I cleared my work calendar and built an emergency fix over the next couple of days and evenings.
Thankfully, it didn't take a lot of code to add a compatibility layer between our code and Java 8, but it took time to research and verify that the fix worked as intended. I was happy that we were able to fix the issue so quickly, and at the same time quite annoyed that we had missed the Java 7 deprecation notice.
Upgrading into new pricing policies
After this stressful and formative experience I was motivated to ensure that we wouldn't get bitten by deprecations again. So, I invested time into migrating the application's database from MySQL 5.5 to MySQL 5.7. MySQL 5.5, a first-generation Cloud SQL offering, is scheduled to be decommissioned on March 25th, 2020.
Interestingly, I also had to look into Google's new database pricing policy. When I did some cursory testing of the second-generation databases, I saw that activating binary logging would fill up the database's disk space over time. I also saw that there was a setting for automatically and permanently increasing the disk space whenever free space dropped below some percentage threshold. Both binary logging and automatic storage increase are activated by default when provisioning new database instances (at the time of writing).
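If you'd rather opt out of these defaults, the gcloud CLI exposes flags for both settings. A sketch (the instance name and tier are made up, and flag names should be double-checked against your Cloud SDK version):

```shell
# Provision a second generation MySQL 5.7 instance with binary logging
# disabled and automatic storage increase turned off.
gcloud sql instances create my-instance \
  --database-version=MYSQL_5_7 \
  --tier=db-n1-standard-1 \
  --no-enable-bin-log \
  --no-storage-auto-increase
```

The same settings can be changed on an existing instance with `gcloud sql instances patch`.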
Considering these default settings, I would argue that they can easily lead to unnecessarily overprovisioned databases. It's a great deal for Google, but not so much for users who might be forced to do a quick database migration without enough time to consider the implications of these settings.
The trick that became obsolete
I originally wanted to share a nice trick around managing secrets when deploying a java application to GCP. And I still want to share that trick, even though it will be made irrelevant by this deprecation notice.
The trick was to deploy an application to Google App Engine and provide it with credentials as deploy-time parameters. This solves the problem of storing secrets (like database passwords) inside your codebase (see 12factor.net/config).
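The original example isn't reproduced here, but the general shape of the approach, assuming the appengine-maven-plugin with Maven resource filtering enabled for `appengine-web.xml` (the property name `db.password` is purely illustrative), looks roughly like this:

```xml
<!-- src/main/webapp/WEB-INF/appengine-web.xml, with Maven resource
     filtering enabled so that ${db.password} is substituted at build
     time. db.password is a hypothetical property name. -->
<appengine-web-app xmlns="http://appspot.com/ns/1.0">
  <system-properties>
    <property name="db.password" value="${db.password}"/>
  </system-properties>
</appengine-web-app>
```

The secret is then supplied on the command line at deploy time, e.g. `mvn appengine:deploy -Ddb.password=s3cret`, so it never lands in version control.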
We are currently using this approach to provide database passwords when deploying a Java application to GCP. However, with the deprecation notice we'll need to change that in order to avoid breaking our app. I've already tried this approach with the new Cloud SDK-based tooling, and it doesn't work. I don't blame Google for this breaking change, as this was probably not how they intended users to provide secrets to their cloud deployments.
Moving forward we'll need to revisit the recommendations on how to connect to Cloud SQL from App Engine. Google's documentation shows how to provide secrets as environment variables on the server, which is a very standard way of providing secrets. We should just do that.
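On the Java 8 App Engine standard runtime, environment variables can be declared in `appengine-web.xml` and read with plain `System.getenv` (the variable name `DB_PASSWORD` is just an illustration):

```xml
<!-- appengine-web.xml: declare an environment variable for the app.
     DB_PASSWORD is a hypothetical name; Java code reads it with
     System.getenv("DB_PASSWORD"). -->
<appengine-web-app xmlns="http://appspot.com/ns/1.0">
  <env-variables>
    <env-var name="DB_PASSWORD" value="change-me"/>
  </env-variables>
</appengine-web-app>
```

Note that a value hard-coded here still ends up in the repository, so in practice you'd keep this file out of version control or combine the approach with a secret store.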
Beyond environment variables there's also Google's Cloud Key Management Service (KMS). The KMS documentation has a lengthy article on considerations for secret management. So, there's no shortage of advice and services to buy.
Wisening up regarding cloud providers
Through one hard lesson I've become more vigilant about platform as a service (PaaS) providers such as Google Cloud Platform and Amazon Web Services. They constantly evolve in an effort to outcompete each other and improve their earnings.
This rate of change differs amongst PaaS providers. For example, there's Heroku, which in my experience doesn't try to do everything the way GCP and AWS do, and thus evolves more slowly with less (or no) need for deprecation notices.
So, when evaluating and re-evaluating PaaS providers consider how they choose to drive change:
- Does the PaaS evolve with no changes needed by you, or do they use deprecation notices that eventually force your hand to update your code? On one side you're getting stability, while on the other side you might be getting a lot of improvements and new features.
- Do they require you to learn a lot of details that are only applicable to their platform? Investing a lot of time into learning the particularities of a platform in order to properly leverage it is also a form of lock-in. On one hand you get to use sophisticated features, but it does make it harder to switch to a different PaaS.
- Do deprecations come with pricing changes? Once you're on their platform a PaaS provider might experiment with the pricing scheme to see how much of a price hike you're willing to put up with without switching to another PaaS.
Best of luck in keeping up with your PaaS provider. I'll be over here upgrading my app to Java 11 before GCP beats me to it.