Cloud storage is cheap, but your CSVs and Parquet logs still hold valuable insights. Instead of loading them into a warehouse, BigLake lets you query them in place: securely, efficiently, and with full SQL support.
In this step-by-step guide, I cover:
- How to create a Connection Resource (your secure data "librarian")
- How to define a BigLake table on top of cloud files
- How to apply column-level security for sensitive fields
- How to upgrade an existing external table to BigLake via the CLI
- How to keep compatibility with Spark, Presto, and Python, with security still enforced
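The first, second, and fourth steps above can be sketched with the bq CLI. This is a minimal sketch, not the article's exact commands: names like my_project, biglake_conn, my_dataset, and the gs:// paths are placeholders, and flags can vary across bq versions.

```shell
# 1. Create a Cloud Resource connection -- the "librarian" BigQuery
#    uses to read Cloud Storage on your behalf.
bq mk --connection \
  --location=US \
  --project_id=my_project \
  --connection_type=CLOUD_RESOURCE \
  biglake_conn

# Then grant the connection's service account read access to the bucket.
# (Find the service account with: bq show --connection my_project.US.biglake_conn)

# 2. Define a BigLake table directly over CSV files in GCS.
bq query --use_legacy_sql=false '
CREATE EXTERNAL TABLE my_dataset.sales
WITH CONNECTION `my_project.US.biglake_conn`
OPTIONS (
  format = "CSV",
  uris = ["gs://my-bucket/sales/*.csv"]
)'

# 3. Upgrade an existing external table to BigLake: regenerate its
#    definition with the connection attached, then update in place.
bq mkdef --autodetect \
  --connection_id=my_project.US.biglake_conn \
  --source_format=CSV \
  "gs://my-bucket/legacy/*.csv" > /tmp/def.json
bq update --external_table_definition=/tmp/def.json my_dataset.legacy_table
```

These commands run against a live GCP project, so they need credentials and an existing bucket before anything here will succeed.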
Why it matters:
- Query data directly from GCS, S3, or Azure with no ingestion needed
- Enforce fine-grained access controls on rows and columns
- Unified logging, auditing, and caching out of the box
- One SQL layer across multi-format, multi-cloud files
Full walkthrough here:
https://medium.com/google-cloud/from-csv-to-secure-analytics-a-hands-on-beginners-guide-to-biglake-in-bigquery-2025-edition-ad34badaf983