<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Marcus Wiens</title>
    <description>The latest articles on DEV Community by Marcus Wiens (@m_aireadycompass).</description>
    <link>https://dev.to/m_aireadycompass</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F3164997%2Fe5624dfd-9e52-4829-ae77-66f3df5fa70c.png</url>
      <title>DEV Community: Marcus Wiens</title>
      <link>https://dev.to/m_aireadycompass</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/m_aireadycompass"/>
    <language>en</language>
    <item>
      <title>A Brief History of Artificial Intelligence</title>
      <dc:creator>Marcus Wiens</dc:creator>
      <pubDate>Sun, 18 May 2025 05:04:58 +0000</pubDate>
      <link>https://dev.to/aireadycompass/a-brief-history-of-artificial-intelligence-1p47</link>
      <guid>https://dev.to/aireadycompass/a-brief-history-of-artificial-intelligence-1p47</guid>
      <description>&lt;p&gt;Happy Weekend Friends!&lt;/p&gt;

&lt;p&gt;Today's article covers a brief history of AI. It is an incredibly rich history of research and experimentation. The goal is to cover topics lightly without diving in too deeply and to avoid rambling.&lt;/p&gt;

&lt;h2&gt;
  
  
  Introduction
&lt;/h2&gt;

&lt;p&gt;My aim is to cover the birth of AI as a field of study, some key milestones in development and recent advancements in Natural Language Processing (NLP). If that tickles your fancy then buckle up. &lt;/p&gt;

&lt;p&gt;It is important to outline the two driving factors that led to the AI of 2025. The first is processing power. The invention of the transistor, central processing units (CPUs), and graphics processing units (GPUs) has completely changed the way we view computers and AI. &lt;/p&gt;

&lt;p&gt;The second is the amalgamation of data: not just its collection, but our ability to structure, label, and query that data in a way that is easier and faster for machines to digest at scale. Not long ago, simply having a large, clean dataset was unthinkable, let alone a trainable, labelled dataset of words and images. &lt;/p&gt;

&lt;h2&gt;
  
  
  Origin
&lt;/h2&gt;

&lt;blockquote&gt;
&lt;p&gt;"Can Machines Think?"&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt;It is important to understand that artificial intelligence is NOT NEW. That's right: the field of machines that imitate humans goes further back than the 1950s. For our sake we will start in 1950, with our British friend, Alan Turing, and his paper "Computing Machinery and Intelligence" (1), where he posed the question, "Can Machines Think?"&lt;/p&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh4pspe188x8znqpnzrc7.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fh4pspe188x8znqpnzrc7.png" alt="Image description" width="180" height="230"&gt;&lt;/a&gt;&lt;br&gt;
Alan Mathison Turing, Mathematician and Computer Scientist, 1951 (2)&lt;/p&gt;

&lt;p&gt;Turing moved beyond the hard-to-define concepts of the words "think" and "machine" by proposing "The Imitation Game," now known as the Turing Test. Rather than wrestle with vague definitions, he offers a scenario. Imagine two contestants, a machine and a human, who are both asked natural language questions and give their answers to a third person. The machine and the human are hidden from this third person. If the third person cannot tell the machine from the human based on the answers given, the machine has passed the Turing Test (3).&lt;/p&gt;

&lt;h2&gt;
  
  
  Etymology
&lt;/h2&gt;

&lt;p&gt;The term "Artificial Intelligence" was coined at Dartmouth Conference in 1956. It is attributed to John McCarthy (4), the founder of LISP. Combining experts in Neural Networks, the Theory of Computation, and Automata Theory to see which aspects of humans could be replicated by artificial intelligence.&lt;/p&gt;

&lt;h2&gt;
  
  
  Advancements in Computer Technology
&lt;/h2&gt;

&lt;p&gt;Truly a story of legends. The origin story of Silicon Valley is hardly rivaled in modern history for its scale of innovation, from William Shockley in 1956 to today's giants like Intel, Google, and many others. The inventions of the semiconductor, the transistor, and the microprocessor changed the face of the world we live in and set the stage for legendary tales and titans of industry. Although it is outside the scope of today's writing, I highly advise you to listen to the Acquired podcast (6). It is incredible how instrumental Bell Labs, the traitorous eight, and Fairchild Semiconductor were in shaping the world we live in today, including a direct line to the chatbots we are using in 2025.&lt;/p&gt;

&lt;p&gt;When it comes to the advancement of computational power, one law stands above them all: Moore's Law (7), which observes that the number of transistors on an integrated circuit doubles roughly every two years. Considering this observation has held true since 1975, it is a critical component of our journey towards AI plausibility.&lt;/p&gt;

&lt;h2&gt;
  
  
  Advancements from the 1990's and early 2000's
&lt;/h2&gt;

&lt;p&gt;It took decades for AI to become a mainstream talking point, a moment that arrived when IBM's Deep Blue defeated world chess champion Garry Kasparov (8). In the late 1990s and early 2000s, the amalgamation of data and a significant increase in computing power allowed major strides in AI research. By 2007, Sony's Smile Shutter technology was already identifying faces on camera screens (9). &lt;/p&gt;

&lt;p&gt;During this time the Stanford ImageNet project, started in 2006 by computer scientist Fei-Fei Li (10) (11), had been growing. The dataset has since grown to over 14 million labelled images for academic use in training AI models, a huge advancement for the "data" portion of the AI requirements.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Godfather of AI
&lt;/h2&gt;

&lt;p&gt;Geoffrey Everest Hinton was awarded the Nobel Prize in Physics in 2024 (12) for "foundational discoveries and inventions that enable machine learning with artificial neural networks." David Rumelhart, Geoffrey E. Hinton, and Ronald J. Williams published a paper on the backpropagation algorithm (13) (14) in 1986, contributing not only to the establishment of Neural Networks but also to the "fine-tuning" algorithms necessary to make them more useful to humans. Truly great accomplishments that are still critical for training and tuning models in 2025. &lt;/p&gt;

&lt;h2&gt;
  
  
  Advancements from 2012 to 2022
&lt;/h2&gt;

&lt;p&gt;The biggest advancements from 2012 to 2022 were the increase in computational power and available data. More effort was put into building labelled, structured datasets that were optimized for machines to learn from. For example, ImageNet hosted yearly competitions for training. The team that won in 2012, AlexNet from Toronto (15), needed two weeks to train their model on GPUs; that same model took only about five minutes to train in November of 2022. A monumental effort to tie computational abilities together meant that large amounts of data could now be processed (trained on) simultaneously. A massive breakthrough on two fronts: both computational power and organized data were primed to lead to an explosion of AI. &lt;/p&gt;

&lt;p&gt;A third metric was added alongside computational power and data availability: MLPerf (16), a set of benchmarks that help measure the pace of AI progress. If you think Moore's Law is impressive, the gains in AI models from 2022 to the present are just incomprehensible. Imagine what the future holds for these models. &lt;/p&gt;

&lt;p&gt;Cue OpenAI (17) and eventually ChatGPT (18), followed by Anthropic, Gemini, and beyond. I am extremely excited to see what the future holds and I hope you are too. For better or for worse we are all in this together. As the world changes I hope it's for the betterment of all. &lt;/p&gt;

&lt;p&gt;Thanks for reading, let me know what I should add.&lt;/p&gt;

&lt;p&gt;And one more addition on who invented backpropagation. (19)&lt;/p&gt;

&lt;p&gt;REFERENCES&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Turing, Alan M. (1950). Computing Machinery and Intelligence, &lt;em&gt;Mind&lt;/em&gt;, 59/433–460. &lt;a href="https://doi.org/10.1093/mind/LIX.236.433" rel="noopener noreferrer"&gt;https://doi.org/10.1093/mind/LIX.236.433&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.turing.org.uk/sources/archive.html" rel="noopener noreferrer"&gt;https://www.turing.org.uk/sources/archive.html&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://en.wikipedia.org/wiki/Turing_test" rel="noopener noreferrer"&gt;https://en.wikipedia.org/wiki/Turing_test&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.britannica.com/biography/John-McCarthy" rel="noopener noreferrer"&gt;https://www.britannica.com/biography/John-McCarthy&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.acquired.fm/episodes/adapting-episode-3-intel" rel="noopener noreferrer"&gt;https://www.acquired.fm/episodes/adapting-episode-3-intel&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.acquired.fm/episodes/adapting-episode-3-intel" rel="noopener noreferrer"&gt;https://www.acquired.fm/episodes/adapting-episode-3-intel&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://en.wikipedia.org/wiki/Moore%27s_law" rel="noopener noreferrer"&gt;https://en.wikipedia.org/wiki/Moore%27s_law&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://en.wikipedia.org/wiki/Deep_Blue_versus_Garry_Kasparov" rel="noopener noreferrer"&gt;https://en.wikipedia.org/wiki/Deep_Blue_versus_Garry_Kasparov&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.cnet.com/culture/say-cheese-sony-technology-focuses-on-smiles/" rel="noopener noreferrer"&gt;https://www.cnet.com/culture/say-cheese-sony-technology-focuses-on-smiles/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://image-net.org/about.php" rel="noopener noreferrer"&gt;https://image-net.org/about.php&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://www.historyofdatascience.com/imagenet-a-pioneering-vision-for-computers/#:%7E:text=Set%20up%20by%20data%20scientist,It%20all%20began%20in%201985" rel="noopener noreferrer"&gt;https://www.historyofdatascience.com/imagenet-a-pioneering-vision-for-computers/#:~:text=Set%20up%20by%20data%20scientist,It%20all%20began%20in%201985&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;a href="https://en.wikipedia.org/wiki/Geoffrey_Hinton#:%7E:text=Geoffrey%20Everest%20Hinton%20(born%201947,%22the%20Godfather%20of%20AI%22" rel="noopener noreferrer"&gt;https://en.wikipedia.org/wiki/Geoffrey_Hinton#:~:text=Geoffrey%20Everest%20Hinton%20(born%201947,%22the%20Godfather%20of%20AI%22&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;&lt;a href="https://en.wikipedia.org/wiki/Backpropagation" rel="noopener noreferrer"&gt;https://en.wikipedia.org/wiki/Backpropagation&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;Rumelhart, D., Hinton, G. &amp;amp; Williams, R. Learning representations by back-propagating errors. Nature 323, 533–536 (1986). &lt;a href="https://doi.org/10.1038/323533a0" rel="noopener noreferrer"&gt;https://doi.org/10.1038/323533a0&lt;/a&gt;
&lt;/li&gt;
&lt;li&gt;&lt;a href="https://www.pinecone.io/learn/series/image-search/imagenet/" rel="noopener noreferrer"&gt;https://www.pinecone.io/learn/series/image-search/imagenet/&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://spectrum.ieee.org/mlperf-rankings-2022" rel="noopener noreferrer"&gt;https://spectrum.ieee.org/mlperf-rankings-2022&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://en.wikipedia.org/wiki/OpenAI" rel="noopener noreferrer"&gt;https://en.wikipedia.org/wiki/OpenAI&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://en.wikipedia.org/wiki/ChatGPT" rel="noopener noreferrer"&gt;https://en.wikipedia.org/wiki/ChatGPT&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;a href="https://people.idsia.ch/%7Ejuergen/who-invented-backpropagation.html" rel="noopener noreferrer"&gt;https://people.idsia.ch/~juergen/who-invented-backpropagation.html&lt;/a&gt;&lt;/li&gt;
&lt;/ol&gt;

</description>
      <category>history</category>
      <category>ai</category>
      <category>discuss</category>
      <category>learning</category>
    </item>
    <item>
      <title>From Chaos To Clarity: Making Your Data AI Ready</title>
      <dc:creator>Marcus Wiens</dc:creator>
      <pubDate>Thu, 15 May 2025 05:52:20 +0000</pubDate>
      <link>https://dev.to/aireadycompass/from-chaos-to-clarity-making-your-data-ai-ready-31hg</link>
      <guid>https://dev.to/aireadycompass/from-chaos-to-clarity-making-your-data-ai-ready-31hg</guid>
      <description>&lt;p&gt;A gold rush causes people to miss their mark. I’ve often heard that when there is a gold rush happening, sell shovels, sifts, and maps. Today the equivalent is selling Large Language Models (LLMs) and AI-Agents. While these models and agents deliver on their promises, in order to work for any real world organization they need data, and not just any data, but clean, structured and well documented data.&lt;/p&gt;

&lt;p&gt;Many organizations are discovering an uncomfortable truth: their data isn't ready for the AI revolution. Not unlike inviting a master chef to your kitchen only to realize you've stocked the pantry with expired ingredients and unlabeled spices. No matter how sophisticated your AI tools are, they can't create gourmet insights from spoiled data.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Data Quality Crisis: Why Your AI Projects Are Struggling
&lt;/h2&gt;

&lt;p&gt;Picture this: Your organization has invested millions in cutting-edge AI solutions. The executive team is eagerly awaiting the transformative insights promised in the sales pitch. But months later, your AI initiatives are stalling, producing unreliable results, or failing to launch entirely.&lt;/p&gt;

&lt;p&gt;Sound familiar? You're not alone. &lt;/p&gt;

&lt;p&gt;According to industry research, approximately 73% of AI projects struggle or fail completely, and poor data quality is often the primary culprit.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Real Cost of Dirty Data
&lt;/h2&gt;

&lt;p&gt;Bad data isn't just an inconvenience: it is expensive, and not just in time and IT costs. Once you compile the lost opportunity costs, organizations lose an estimated 15-25% of their revenue to poor data quality, with further unquantifiable losses in opportunities and insights.&lt;/p&gt;

&lt;p&gt;Slow and steady. The best way to adopt any technology is with a clear head. By rushing technology onto data that isn’t ready, you risk damaging customer trust and losing your competitive edge. Make sure your organization understands the clear objectives and limitations of new technologies as you move forward.&lt;/p&gt;

&lt;h2&gt;
  
  
  The Journey to AI-Ready Data: A Roadmap
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Step 1: Conduct a Data Quality Audit
&lt;/h3&gt;

&lt;p&gt;Before diving into AI adoption, take stock of your current data landscape. Like a chef picking out quality ingredients during prep and before service, your data quality audit should examine:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Completeness&lt;/strong&gt;: Are there missing values in critical fields?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Accuracy&lt;/strong&gt;: Does your data reflect reality?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Consistency&lt;/strong&gt;: Does the same information appear differently across systems?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Timeliness&lt;/strong&gt;: How current is your data?&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Relevance&lt;/strong&gt;: Does your data actually connect to the business problems you're trying to solve?&lt;/li&gt;
&lt;/ul&gt;
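
&lt;p&gt;As a rough sketch of what such an audit can look like in practice, here is a minimal pass in plain Python over some hypothetical customer records (the field names, cutoff date, and rules are illustrative assumptions, not a prescribed tool):&lt;/p&gt;

```python
from datetime import date

# Hypothetical customer records; field names are illustrative only.
records = [
    {"email": "ana@example.com", "country": "CA", "updated": date(2025, 4, 1)},
    {"email": None, "country": "Canada", "updated": date(2023, 1, 15)},
    {"email": "bo@example.com", "country": "CA", "updated": date(2025, 5, 1)},
]

def audit(records, required=("email",), stale_after=date(2024, 1, 1)):
    """Count completeness, consistency, and timeliness issues in a batch."""
    # Completeness: required fields that are empty or missing.
    missing = sum(1 for r in records for f in required if not r.get(f))
    # Consistency: the same country appearing under different spellings.
    spellings = {r["country"] for r in records}
    # Timeliness: records not updated since the cutoff date.
    stale = sum(1 for r in records if stale_after > r["updated"])
    return {"missing_required": missing,
            "country_spellings": len(spellings),
            "stale_records": stale}

print(audit(records))  # flags 1 missing email, 2 country spellings, 1 stale record
```

&lt;p&gt;A real audit runs rules like these over every critical table, but the shape is the same: measurable checks, not gut feel.&lt;/p&gt;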

&lt;h3&gt;
  
  
  Step 2: Establish Data Governance (The Fun Way!)
&lt;/h3&gt;

&lt;p&gt;Think of it as establishing the "rules of the road" or “kitchen hierarchy” for your organization's most valuable asset. Effective governance includes:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Data Stewardship&lt;/strong&gt;: Appoint "Data Champions" across departments who advocate for quality,&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Clear Ownership&lt;/strong&gt;: Define who is responsible for what data (no more "that is not my problem" syndrome),&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Quality Standards&lt;/strong&gt;: Create specific, measurable criteria for "good data,"&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Documentation&lt;/strong&gt;: Build a data dictionary that even non-technical staff can understand.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Pro Tip&lt;/strong&gt;: Gamify your governance! Create friendly competition between departments for data quality improvements, with real rewards for teams that excel.&lt;/li&gt;
&lt;/ul&gt;

&lt;h3&gt;
  
  
  Step 3: Implement Automated Data Cleaning Processes
&lt;/h3&gt;

&lt;p&gt;Manual data cleaning is like trying to empty the ocean with a teaspoon—technically possible but wildly impractical. Automation is essential:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Data Validation Rules&lt;/strong&gt;: Set up guardrails that prevent bad data from entering your systems,&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Standardization Routines&lt;/strong&gt;: Automatically format addresses, phone numbers, and other common fields,&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Deduplication Tools&lt;/strong&gt;: Identify and merge duplicate records,&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Outlier Detection&lt;/strong&gt;: Flag statistically improbable values for human review,&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Data Enrichment&lt;/strong&gt;: Automatically supplement internal data with trusted external sources.&lt;/li&gt;
&lt;/ul&gt;
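
&lt;p&gt;To make the standardization and deduplication ideas concrete, here is a minimal sketch in plain Python (the normalization rules and field names are illustrative assumptions; production pipelines use dedicated tooling and fuzzier matching):&lt;/p&gt;

```python
import re

def normalize_phone(raw):
    """Standardize a phone number by stripping everything but digits."""
    return re.sub(r"\D", "", raw)

def dedupe(records):
    """Merge records that share a normalized (email, phone) key."""
    seen = {}
    for r in records:
        key = (r["email"].strip().lower(), normalize_phone(r["phone"]))
        # Keep the first record seen; a real pipeline would merge fields.
        seen.setdefault(key, r)
    return list(seen.values())

customers = [
    {"email": "Ana@Example.com", "phone": "(555) 123-4567"},
    {"email": "ana@example.com", "phone": "555.123.4567"},
    {"email": "bo@example.com", "phone": "555-999-0000"},
]
print(len(dedupe(customers)))  # the first two records collapse into one
```

&lt;p&gt;Note the validation angle: the same normalization functions can run at the point of entry, so bad variants never land in the database in the first place.&lt;/p&gt;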

&lt;h3&gt;
  
  
  Step 4: Create a Continuous Improvement Cycle
&lt;/h3&gt;

&lt;p&gt;Data quality is not a one-time project, but an ongoing discipline. Establish:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Regular Quality Metrics&lt;/strong&gt;: Monitor and share data quality scores across the organization,&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Feedback Loops&lt;/strong&gt;: Make it easy for users to report data issues,&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Root Cause Analysis&lt;/strong&gt;: Don't just fix errors—understand why they occurred,&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Regular Training&lt;/strong&gt;: Ensure everyone understands their role in maintaining data quality.&lt;/li&gt;
&lt;/ul&gt;
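
&lt;p&gt;A quality metric does not need to be fancy to be shareable. As one deliberately simple, illustrative example, the percentage of required fields that are actually populated can serve as a first score to track and publish over time:&lt;/p&gt;

```python
def quality_score(records, required):
    """Percent of required fields that are populated across a batch."""
    total = len(records) * len(required)
    if total == 0:
        return 100.0  # an empty batch has nothing to penalize
    filled = sum(1 for r in records for f in required if r.get(f))
    return round(100.0 * filled / total, 1)

batch = [
    {"email": "ana@example.com", "country": "CA"},
    {"email": "", "country": "CA"},
]
print(quality_score(batch, required=("email", "country")))  # → 75.0
```

&lt;p&gt;Run a score like this on a schedule, chart it per department, and the feedback loop and the gamification from Step 2 come for free.&lt;/p&gt;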

&lt;h2&gt;
  
  
  Real-World Success Stories: Data Cleaning Hero
&lt;/h2&gt;

&lt;h3&gt;
  
  
  Paying for Storage
&lt;/h3&gt;

&lt;p&gt;The most common problem we encounter is companies paying tens of thousands to hundreds of thousands of dollars a month to maintain and store data that they are not gaining anything from. By understanding what data your organization needs to collect versus what they are collecting you can start saving your company money. &lt;/p&gt;

&lt;p&gt;From these savings you can invest in data engineers and data scientists who can not only manage and organize data but also glean critical business insights that multiply earnings and profits. &lt;br&gt;
One of our early customers, a major retail chain, discovered they had over 30% duplicate customer records in their database. After implementing an enterprise-wide data cleaning initiative, they not only eliminated the duplicates but also discovered $230,000 in yearly cost savings. The duplicates were later diagnosed as having masked upselling opportunities, which surfaced once duplicate customer purchase histories were correctly merged.&lt;/p&gt;

&lt;h2&gt;
  
  
  Getting Started: Your 30-Day Data Readiness Plan
&lt;/h2&gt;

&lt;p&gt;Ready to prepare your data for AI success? Here is your first month's action plan:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;Take our AI Readiness Questionnaire at aireadycompass.com&lt;/strong&gt;&lt;br&gt;
&lt;strong&gt;Enter for your chance to win during our May 2025 draw!&lt;/strong&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion: Data Quality as Competitive Advantage
&lt;/h2&gt;

&lt;p&gt;In the AI era, clean, well-structured data is not just a technical requirement—it is a strategic asset that creates sustainable competitive advantage. Organizations that master data quality will see their AI investments pay dividends, while those that neglect it will continue to wonder why their competitors continue to gain market share.&lt;/p&gt;

&lt;p&gt;&lt;em&gt;Remember&lt;/em&gt;: The most sophisticated AI in the world can't overcome the limitations of poor-quality data. Make data readiness the foundation of your AI strategy, and you'll be positioned for success in the intelligence revolution.&lt;/p&gt;

&lt;p&gt;About the Author: Marcus is a data strategy consultant specializing in helping organizations prepare their data infrastructure for AI adoption.&lt;/p&gt;

</description>
      <category>ai</category>
      <category>dataengineering</category>
      <category>tutorial</category>
      <category>discuss</category>
    </item>
    <item>
      <title>5 Key Indicators your Business is AI Ready (and 3 warning signs it's not)</title>
      <dc:creator>Marcus Wiens</dc:creator>
      <pubDate>Thu, 15 May 2025 05:50:44 +0000</pubDate>
      <link>https://dev.to/aireadycompass/5-key-indicators-your-business-is-ai-ready-and-3-warning-signs-its-not-adc</link>
      <guid>https://dev.to/aireadycompass/5-key-indicators-your-business-is-ai-ready-and-3-warning-signs-its-not-adc</guid>
      <description>&lt;p&gt;Cut to the point. If you want to be the cutting edge champion at your organization here are the top FIVE key points to know:&lt;/p&gt;

&lt;ol&gt;
&lt;li&gt;Data Maturity &amp;amp; Data Governance &lt;/li&gt;
&lt;li&gt;Technical Infrastructure Flexibility&lt;/li&gt;
&lt;li&gt;Strategic Alignment with Clear Use Cases&lt;/li&gt;
&lt;li&gt;Skills &amp;amp; Talent Ecosystem&lt;/li&gt;
&lt;li&gt;Adaptive Culture &amp;amp; Change Readiness&lt;/li&gt;
&lt;/ol&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbg2ybwbkkpj0ts8ysc4f.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fbg2ybwbkkpj0ts8ysc4f.png" alt="Image description" width="800" height="999"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Data Maturity and Data Governance
&lt;/h2&gt;

&lt;p&gt;We have all heard the classic saying, "Garbage In, Garbage Out" when it comes to data science and statistics. &lt;/p&gt;

</description>
      <category>ai</category>
      <category>beginners</category>
      <category>productivity</category>
    </item>
  </channel>
</rss>
