The Azure Data Fundamentals course is designed for data professionals of all levels who are moving their workloads from on-premises deployments to the Azure cloud. As a fundamentals course it covers a wide array of topics at varying levels of technical depth. Skim the official exam outline to get a sense of the scope.
I recently took, and passed, this exam, so as my content for this course rolls out on http://www.cbtnuggets.com I thought I'd provide some insight into how to prepare and what you'll need to know in order to pass it yourself!
Describe core data concepts
The first section of the exam outline is the most conceptual, but it's also the subject you're most likely to already have some comfort with before beginning your studies.
You'll need a basic understanding of what relational databases are and how they compare to non-relational or NoSQL databases. Expect questions about the technology used to query relational databases (SQL) as well as general questions about how to recognize a relational database.
This section also covers some basic data analysis topics, such as charts and graphing. Importantly, you'll want to have a good grasp of the differences between the five listed analysis techniques: descriptive, diagnostic, predictive, prescriptive, and cognitive.
On top of that, recognizing what ETL (Extract, Transform, and Load) is and how it's used in a data pipeline will come up during this section of the exam.
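If ETL is new to you, the shape of it is easy to see in a few lines of code. This is just a toy sketch in Python, nothing Azure-specific, and the column names and filter are invented for illustration; it only shows the three stages the exam refers to:

```python
# Illustrative only: a toy ETL pipeline showing the three stages.
# The CSV content and column names here are made up for the example.
import csv
import io

RAW_CSV = """order_id,amount,region
1,19.99,east
2,5.50,west
3,42.00,east
"""

def extract(raw):
    """Extract: pull rows out of the source system (a CSV here)."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: clean up types and filter to the rows we care about."""
    return [
        {"order_id": int(r["order_id"]), "amount": float(r["amount"])}
        for r in rows
        if r["region"] == "east"
    ]

def load(rows, target):
    """Load: write the shaped rows into the destination store."""
    target.extend(rows)

warehouse = []  # stand-in for a real analytical store
load(transform(extract(RAW_CSV)), warehouse)
print(warehouse)  # [{'order_id': 1, 'amount': 19.99}, {'order_id': 3, 'amount': 42.0}]
```

Real pipelines swap the CSV for an ingestion service and the list for a warehouse or data lake, but the extract, transform, load pattern is the same.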
Describe how to work with relational data on Azure
In this section you're expected to dive a little deeper into the details. Questions for this topic are all about relational databases and how to leverage them on the Microsoft Azure cloud platform.
Knowing the basic components of a relational database is important (tables, queries, views, indexes), as is a high-level understanding of SQL query syntax. Do you know what SELECT means? How about INSERT? If the terms DDL and DML don't mean anything to you, take some time to brush up on them.
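If those terms are fuzzy, a few lines make the distinction concrete. This is only an illustrative sketch: I'm using Python's built-in SQLite module as a stand-in for any relational engine, and the table and columns are invented.

```python
# A minimal refresher on DDL vs. DML, run against an in-memory SQLite
# database purely for illustration (any relational engine accepts the
# same kinds of statements; the table and columns are invented).
import sqlite3

conn = sqlite3.connect(":memory:")

# DDL (Data Definition Language) defines structure: CREATE, ALTER, DROP.
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT)")

# DML (Data Manipulation Language) works with the data itself: INSERT, UPDATE, DELETE.
conn.execute("INSERT INTO customers (id, name) VALUES (1, 'Ada')")
conn.execute("INSERT INTO customers (id, name) VALUES (2, 'Grace')")

# SELECT reads data back out.
for row in conn.execute("SELECT id, name FROM customers ORDER BY id"):
    print(row)  # (1, 'Ada') then (2, 'Grace')

conn.close()
```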
The exam will also spend some time testing your knowledge of the various Azure data services and how they are categorized and used. Know the difference between Azure's IaaS, PaaS, and SaaS database offerings, and understand the pros and cons of choosing one of those solutions over another for a given workload.
You'll be expected to answer questions about using the Azure CLI versus PowerShell or the GUI for database deployment. While there won't be specific questions about syntax or commands, you will need to be able to recognize the different methods. You will also need to understand which tools Azure offers for data management, which security mechanisms are in place, and how to troubleshoot basic connectivity issues.
Describe how to work with non-relational data on Azure
Similar to the topic above, this section of the exam takes a deeper dive into the non-relational or NoSQL services available to you on Azure.
The exam will ask you to recognize the traits of NoSQL data and how it differs from relational data. You'll be asked to identify types of data and sort them into NoSQL or relational buckets.
On top of that, you will need to know the various Azure data service offerings for non-relational data and be able to identify their use cases and differences. Types of non-relational structures will be addressed, such as key-value, document, column-family, and graph stores. You will also need to know about Azure Table storage, Blob storage, and Azure Files and what specific needs each addresses.
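To make the key-value idea less abstract, here's a rough Python sketch of what a key-value entity and a document look like. The values are invented for illustration; PartitionKey and RowKey are the system properties Azure Table storage uses to address an entity.

```python
# Rough illustrations of two non-relational shapes (the values are invented).

# Key-value / Azure Table style: every entity has a PartitionKey and RowKey,
# and beyond that each entity can carry its own arbitrary properties.
table_entity = {
    "PartitionKey": "orders-2023",
    "RowKey": "000123",
    "Amount": 42.00,
    "Region": "east",
}

# Document style: nested, self-describing, and not forced into a fixed
# table schema the way relational rows are.
order_document = {
    "id": "000123",
    "customer": {"name": "Ada", "tier": "gold"},
    "lines": [
        {"sku": "A-1", "qty": 2},
        {"sku": "B-7", "qty": 1},
    ],
}

print(table_entity["RowKey"], order_document["customer"]["name"])
```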
Finally, you'll also need to understand the same management and connectivity topics for Azure's non-relational stores as you did for the relational data services.
Describe an analytics workload on Azure
I found this final section on analytics to contain the most product-centric questions of the exam. While many of the data storage options are somewhat platform agnostic (MySQL can run on any number of cloud providers, for example), the analytics tools are very specific to Microsoft Azure.
You need to know the pipeline of data processing from ingestion through OLTP storage, ETL, and warehousing or data lake storage. Yes, you'll need to know what OLTP is and how it differs from OLAP. You'll also need to understand the roles of Azure Synapse Analytics, Azure Data Factory, Azure Data Lake, Azure HDInsight, and Azure Databricks. While these tools overlap in places, each serves its own specific purpose in a data pipeline.
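If the OLTP/OLAP distinction is new to you, the difference is really about the shape of the workload rather than the syntax. Here's a hedged sketch, again using SQLite purely as a stand-in with an invented table:

```python
# Illustrative only: the same (invented) table queried two ways, to show the
# difference in workload shape between OLTP and OLAP.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales (region, amount) VALUES (?, ?)",
    [("east", 19.99), ("west", 5.50), ("east", 42.00)],
)

# OLTP: small, targeted reads and writes against individual rows --
# the kind of traffic a live application generates.
print(conn.execute("SELECT amount FROM sales WHERE id = 2").fetchone())

# OLAP: wide aggregations over lots of history -- the kind of query a
# warehouse or analytics engine is built for.
print(conn.execute("SELECT region, SUM(amount) FROM sales GROUP BY region").fetchall())

conn.close()
```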
Finally, you'll be expected to have worked with Power BI and understand how it's used to provide charts, graphs, and dashboards for analytics. Know what a Power BI workflow looks like and how you can leverage it against the previously mentioned data stores.
Summary
The DP-900 exam is not a difficult test. It doesn't require in-depth technical knowledge, there are no fill-in-the-blank coding challenges, and you won't have to dive too deeply into any one technology. However, it is a broad exam that covers a wide variety of topics across the entire Azure data platform, so prepare yourself accordingly.
Good luck!