Snowflake Migration Series — Lesson 4: People and Process: The Work That Makes The Tech Stick
Great Migrations Fail When People Are Left Behind. You Don’t Just Move Code; You Move Habits
Photo by Suzanne D. Williams on Unsplash
tl;dr
Migrations don’t fail because Snowflake or dbt are bad tools. They fail when people are left behind. Many focus only on the technical aspects—like configuring Snowflake or rewriting code—and assume the team will naturally adapt to the new processes. This oversight can cause frustration, hinder adoption, and prevent realizing the platform’s full benefits.
You aren’t just moving code — you’re changing habits. To achieve long-term success, the people-and-process work stream should carry the same priority and resourcing in the project charter as the technical migration. This lesson gives you a simple plan: the skills to grow, the roles to staff, the rituals to run every week, and how to use AI as a helpful co-pilot.
In Lesson 1, RetailCorp shipped a thin, working slice. In Lesson 2, we assessed the legacy world while building. In Lesson 3, we translated business logic into Snowflake-native patterns. Lesson 4 is how those patterns become the new normal.
Why People & Process Matter
After our initial successes, RetailCorp's team began to drift: a few developers kept rebuilding nightly batches as a backup, analysts maintained private spreadsheets, and reviews became infrequent. Once we simplified and made the new process more visible, with short sprints, small demos, and clear tests, adoption increased significantly.
The New Ecosystem of Skills
Adopting a modern data stack such as Snowflake and dbt isn't just about switching tools—it’s about embracing a whole new environment that requires fresh skills. Expecting an experienced Informatica team to quickly acquire these new competencies amid a migration is unrealistic. A modern data team now must excel in:
Build well
- dbt Proficiency: writing modular SQL, using Jinja for templating, writing macros, defining tests, and understanding materializations.
- Data contracts as tests (not_null, unique, relationships, accepted_values).
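As a sketch, the four generic tests above are declared in a model's YAML file; the model and column names here are hypothetical:

```yaml
# models/staging/schema.yml (hypothetical model and column names)
version: 2

models:
  - name: stg_orders
    columns:
      - name: order_id
        tests:
          - unique
          - not_null
      - name: customer_id
        tests:
          - relationships:            # every customer_id must exist upstream
              to: ref('stg_customers')
              field: customer_id
      - name: order_status
        tests:
          - accepted_values:
              values: ['placed', 'shipped', 'returned']
```

Running `dbt test` then checks each contract against the warehouse and fails loudly when data drifts.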
Ship safely
- Version Control: branching and pull requests in Git are fundamental for collaborative development. Require clear code reviews (one data engineer plus one analytics engineer per PR).
- Automation: automated CI/CD pipelines, where CI runs the dbt tests on every pull request and CD deploys on merge.
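A minimal sketch of such a pipeline using GitHub Actions, assuming a dbt project on Snowflake with credentials supplied as repository secrets (the workflow, target, and secret names are all hypothetical):

```yaml
# .github/workflows/ci.yml (illustrative; adapt profiles and secrets to your setup)
name: dbt CI
on:
  pull_request:          # run checks on every PR before merge

jobs:
  dbt-test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: "3.11"
      - run: pip install dbt-snowflake
      - run: dbt deps
      - run: dbt build --target ci   # builds models and runs tests; a failure blocks the merge
        env:
          SNOWFLAKE_ACCOUNT: ${{ secrets.SNOWFLAKE_ACCOUNT }}
          SNOWFLAKE_PASSWORD: ${{ secrets.SNOWFLAKE_PASSWORD }}
```

Pair this with a branch-protection rule requiring the check to pass, and red code never reaches main.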
Spend wisely
- Snowflake Specifics: Understanding virtual warehouses, cost management, query profiles, role-based access control, semi-structured data types, and simple cost budgets/alerts.
- Cloud Cost Management: A crucial new skill in a consumption-based cloud world.
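One concrete guardrail for a consumption-based platform is a Snowflake resource monitor. A sketch, with hypothetical names and an illustrative quota:

```sql
-- Cap a dev warehouse at 100 credits per month (names and quota are illustrative)
CREATE RESOURCE MONITOR dev_monthly_budget
  WITH CREDIT_QUOTA = 100
       FREQUENCY = MONTHLY
       START_TIMESTAMP = IMMEDIATELY
  TRIGGERS ON 75  PERCENT DO NOTIFY    -- warn the team early
           ON 100 PERCENT DO SUSPEND;  -- stop new queries once the cap is hit

ALTER WAREHOUSE dev_wh SET RESOURCE_MONITOR = dev_monthly_budget;
```

Budgets like this turn cost management from a monthly surprise into a daily habit.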
Keep it teachable: one page per topic, one example, one exercise. A Snowflake-style quickstart tailored to your own use case is a good model.
Embedded and Agile Learning is Key
Embed training in every sprint; don't treat it as an afterthought. Rather than isolated, one-off sessions, skill development must be woven into the migration plan using agile methods. Success depends on:
- Adopting Agile: Implement a sprint-based method that involves data engineers, analysts, and key business users directly in planning and developing the MVP and following phases.
- Continuous Learning Cycles: Embed learning opportunities within the sprints. For instance, one sprint could focus on building dbt models, while another emphasizes setting up dbt tests or configuring a CI pipeline. Pair programming is especially useful in this setting.
- Incremental CI/CD: Develop the CI/CD pipeline gradually during the initial parallel phase. This approach delivers quick wins and encourages robust software engineering practices.
- Promoting New Norms: Actively encourage code reviews, establish shared coding standards, and enhance collaboration through Git.
Embrace Formal Change Management
This marks a major change in work dynamics. Developing a formal change management plan is crucial, involving clear communication of the reasons for the change, continuous support, celebrating milestones such as the MVP launch, and actively managing resistance to the new processes. Supporting your team throughout this transition is vital for securing long-term success.
A Week-by-Week 90-Day Enablement Plan
Adapt the plan to fit your organization: for teams under 5 people, collapse it to 60 days; for teams over 20, extend it to 120 days and add role-specific tracks. You can also rerun versions of this plan as each business unit moves to the new platform.
Weeks 1–2: Set the stage
- Create a one-page team rulebook that outlines naming conventions, folder structure, default dbt tests, and a PR checklist. Ship it as a template Git project so the best practices can be reused for every new repository.
- Stand up CI: run dbt tests on pull requests and block merges when tests fail.
- Pick a roadmap item as a learning slice that will ship in 2 weeks.
Weeks 3–6: Ship small, learn fast
- Two-week sprints, with each one delivering a small production update.
Rituals:
- Daily 10-minute standup: discuss updates and obstacles.
- Mid-sprint partner rotation hour.
- End-of-sprint demo and a decision log entry; keep it simple: one paragraph describing what changed and why.
- Begin a rotating on-call duty for data; the person on duty fixes red builds that day.
Weeks 7–12: Make it normal
- Include cost checks such as budgets and query alerts.
- Include two cross-team reviews, analytics ↔ data engineering, to distribute patterns more effectively.
- Display a basic scoreboard (see the suggested metrics below) on a wall or wiki.
Metrics that change behavior
- Lead time (from roadmap to prod)
- % of PRs with tests
- % of red builds fixed within 24h
- Duplicate logic removed
- Number of unique presenters in demos (shared ownership)
How AI Catalyzes Team Adoption
AI assistants can serve as a valuable resource in implementing this people-centric, agile learning approach. They function as immediate tutors and support the quick adoption of new norms.
- On-Demand Tutoring: During a learning-focused sprint, team members can use AI to get instant, personalized guidance. For example: “I need to write a dbt test to ensure my foreign keys are valid. Can you show me an example of a relationships test?”
- Code Review Assistants: To foster the development of new standards, you can use AI to pre-screen code. Developers might ask an AI to “Review my SQL code against our team’s style guide and suggest improvements” before submitting a pull request for human review.
- On-Demand Onboarding Content: AIs can instantly generate learning materials for your sprints. For example, they can create a short tutorial with a practical exercise on using Jinja to build a dynamic date filter in a dbt model. This supports continuous learning while reducing the workload for senior team members.
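As a sketch of the kind of exercise such a tutorial might produce, here is a dbt model that uses a Jinja variable to build a dynamic date filter (the model, column names, and default window are hypothetical):

```sql
-- models/recent_orders.sql (illustrative): look-back window driven by a Jinja variable
{% set lookback_days = var('lookback_days', 30) %}

select
    order_id,
    customer_id,
    order_date,
    order_total
from {{ ref('stg_orders') }}
where order_date >= dateadd('day', -{{ lookback_days }}, current_date)
```

Running `dbt run --vars '{lookback_days: 7}'` would then narrow the window without touching the code, which makes a good hands-on exercise in templating.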
Conclusion
Lessons 1–3 focused on showing value and clarifying the logic, while Lesson 4 highlights making the new approach sustainable. Successful implementation isn't just flipping switches; it's about building skills, creating simple weekly routines, and documenting straightforward rules for everyone. When teams actively work on the tasks, deliver gradually, and learn during the sprint instead of after, the platform you develop is more likely to be used as intended.
Adopt a teachable approach by dedicating one page per topic, including an example, and providing an exercise. Ensure visibility through brief demonstrations, a dynamic decision log, and a public scoreboard. Focus on humane methods: prefer pairing over policing, utilize checklists instead of lectures, and see AI as a helpful co-pilot rather than the pilot. This will help the desired habits become normal quickly.
Your next step is simple: select a small slice, run the 2-week cycle, and track the results, including lead time, the share of PRs with tests, and red builds fixed within a day. This evidence builds trust across the organization and lets you confidently retire the old methods.
Next: Lesson 5 — The Final Gate: proving trust with automated, cell-level validation.
I am Augusto Rosa, a Snowflake Data Superhero and Snowflake SME. I am also the Head of Data, Cloud, & Security Architecture at Archetype Consulting. You can follow me on LinkedIn.
Subscribe to my Medium blog https://blog.augustorosa.com for the most interesting Data Engineering and Snowflake news.