<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:dc="http://purl.org/dc/elements/1.1/">
  <channel>
    <title>DEV Community: Jérôme Urbain</title>
    <description>The latest articles on DEV Community by Jérôme Urbain (@jerome_urbain).</description>
    <link>https://dev.to/jerome_urbain</link>
    <image>
      <url>https://media2.dev.to/dynamic/image/width=90,height=90,fit=cover,gravity=auto,format=auto/https:%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Fuser%2Fprofile_image%2F1028582%2F2bd296e1-b904-432b-890b-a040c8737828.png</url>
      <title>DEV Community: Jérôme Urbain</title>
      <link>https://dev.to/jerome_urbain</link>
    </image>
    <atom:link rel="self" type="application/rss+xml" href="https://dev.to/feed/jerome_urbain"/>
    <language>en</language>
    <item>
      <title>Comparison of Popular Enterprise Master Data Management Tools (MDMs)</title>
      <dc:creator>Jérôme Urbain</dc:creator>
      <pubDate>Mon, 27 Feb 2023 14:55:52 +0000</pubDate>
      <link>https://dev.to/jerome_urbain/comparison-of-popular-enterprise-master-data-management-tools-mdms-1eh6</link>
      <guid>https://dev.to/jerome_urbain/comparison-of-popular-enterprise-master-data-management-tools-mdms-1eh6</guid>
      <description>&lt;p&gt;You can think of Master Data Management (MDM) as a set of processes and technologies for  managing and maintaining a single, consistent version of the data used across an entire organization. These tools ensure that all the data used by various units within the organization is accurate, consistent, and up to date.&lt;/p&gt;

&lt;p&gt;MDM tools make it easy to access and analyze data from different sources efficiently while keeping things organized. They also allow for easy reporting to facilitate accurate decision-making and data-driven insights. This central source of master data enables organizations to easily integrate new data sources without losing data consistency and accuracy, thus enforcing data privacy, security, and compliance with data regulations.&lt;/p&gt;
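&lt;p&gt;To make that concrete, here’s a minimal sketch of the kind of consolidation an MDM tool automates: duplicate customer records from different source systems are matched and merged into a single “golden record.” The field names and survivorship rule below are purely illustrative; real MDM tools use far more sophisticated fuzzy matching across many attributes.&lt;/p&gt;

```javascript
// Illustrative golden-record consolidation (hypothetical field names).
// Records are grouped by a normalized match key, then merged so that the
// most recently updated non-empty value for each field survives.
function buildGoldenRecords(records) {
  const groups = new Map();
  for (const rec of records) {
    const key = rec.email.trim().toLowerCase(); // naive match key
    if (!groups.has(key)) groups.set(key, []);
    groups.get(key).push(rec);
  }
  const golden = [];
  for (const dupes of groups.values()) {
    // Newest record first, so its non-empty values take precedence.
    dupes.sort((a, b) => new Date(b.updatedAt) - new Date(a.updatedAt));
    const merged = {};
    for (const rec of dupes) {
      for (const [field, value] of Object.entries(rec)) {
        if (merged[field] === undefined || merged[field] === '') {
          merged[field] = value;
        }
      }
    }
    golden.push(merged);
  }
  return golden;
}
```

&lt;p&gt;Every tool below implements some version of this matching-and-survivorship loop, just at enterprise scale and with configurable rules.&lt;/p&gt;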

&lt;p&gt;That said, choosing an MDM tool is not an easy task. Let’s compare the most popular enterprise MDM tools on the market so you can make an informed decision based on each tool's features, pros, and cons.&lt;/p&gt;

&lt;p&gt;The factors we’ll consider to compare the tools are:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Data governance compliance&lt;/li&gt;
&lt;li&gt;Cloud support (public, private, hybrid)&lt;/li&gt;
&lt;li&gt;Ability to process big data&lt;/li&gt;
&lt;li&gt;AI and machine learning for data enrichment&lt;/li&gt;
&lt;li&gt;Data discovery and profiling&lt;/li&gt;
&lt;li&gt;Data quality management&lt;/li&gt;
&lt;li&gt;Integration&lt;/li&gt;
&lt;li&gt;Licensing and pricing&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Let's start with the best enterprise MDM tools available today.&lt;/p&gt;

&lt;h2&gt;IBM InfoSphere Master Data Management&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv5ulefcjben5cr4uki6t.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fv5ulefcjben5cr4uki6t.png" alt="IBM InfoSphere Master Data Management Homepage - Image Credit: IBM website" width="800" height="302"&gt;&lt;/a&gt;&lt;/p&gt;


&lt;p&gt;&lt;a href="https://www.ibm.com/products/ibm-infosphere-master-data-management" rel="noopener noreferrer"&gt;IBM InfoSphere Master Data Management&lt;/a&gt; is a comprehensive MDM software solution that helps organizations manage their critical data assets. It provides powerful tools for data governance, cloud support, big data processing, AI and machine learning, data discovery and profiling, data quality management, and integration.&lt;/p&gt;

&lt;p&gt;Some of the features and strengths of IBM InfoSphere Master Data Management include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Data governance compliance:&lt;/strong&gt; IBM InfoSphere Master Data Management has a comprehensive suite of data governance and security tools to ensure compliance with industry regulations. These include data access control, data masking and encryption, data audit and activity tracking, and data lineage and impact analysis.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cloud support:&lt;/strong&gt; InfoSphere MDM has &lt;a href="https://www.ibm.com/docs/en/imdm/12.0?topic=installing-infosphere-mdm-into-containers-accelerated-deployment" rel="noopener noreferrer"&gt;native support&lt;/a&gt; for deployment on IBM SoftLayer global cloud infrastructure, on-premise servers, Docker containers, Red Hat OpenShift Container Platform, and vanilla Kubernetes. The platform also supports multi-cloud data integration.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Ability to process big data:&lt;/strong&gt; Data management capabilities are integrated into the platform, allowing businesses to efficiently process and analyze large datasets.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AI and machine learning for data enrichment:&lt;/strong&gt; IBM InfoSphere MDM, in particular the &lt;a href="https://www.ibm.com/products/cloud-pak-for-data" rel="noopener noreferrer"&gt;Cloud Pak for Data edition&lt;/a&gt;, offers a range of machine learning-based tools for data enrichment, including natural language processing, sentiment analysis, and predictive analytics.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Data discovery and profiling:&lt;/strong&gt; With its data discovery and profiling capabilities, users can quickly uncover insights from their data and better understand patterns and trends.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Data quality management:&lt;/strong&gt; The platform provides tools for detecting and correcting errors and inconsistencies in data, as well as monitoring data quality over time.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Some drawbacks of IBM InfoSphere Master Data Management include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Integration:&lt;/strong&gt; Unsurprisingly, InfoSphere tends to integrate better with other products of the IBM ecosystem, such as WebSphere Application Server, Message Queue, IBM Cloud Pak for Data, and Db2 PureScale.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Licensing and pricing:&lt;/strong&gt; According to their &lt;a href="https://www.ibm.com/products/ibm-infosphere-master-data-management/pricing" rel="noopener noreferrer"&gt;website&lt;/a&gt;, “IBM InfoSphere Master Data Management is available on IBM Cloud Pak for Data, on-premises and as a cloud-managed offering in different sizes.” This licensing model can make it challenging to choose the right option. Furthermore, the deployment on IBM Cloud Pak for Data and on-premises has &lt;a href="https://www.ibm.com/products/ibm-infosphere-master-data-management/pricing#2789677" rel="noopener noreferrer"&gt;two editions&lt;/a&gt;, standard and advanced. As for Master Data Management on Cloud, Managed, there are &lt;a href="https://www.ibm.com/products/ibm-infosphere-master-data-management/pricing#2789555" rel="noopener noreferrer"&gt;three price tiers available&lt;/a&gt;: managed small, managed medium, and managed large. All in all, both the pricing and licensing models are clearly aimed at corporations and large businesses rather than small and medium-sized businesses.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Overall, InfoSphere is an excellent choice for large enterprises and corporations looking for an MDM solution with tight integration into the rest of the IBM ecosystem, offering robust data governance, AI and machine learning, data discovery and profiling, and data quality management. However, IBM InfoSphere Master Data Management can be expensive for small and medium-sized businesses, and it may not be the ideal solution for organizations that avoid vendor lock-in at all costs.&lt;/p&gt;

&lt;h2&gt;Informatica Master Data Management&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fylue2kkp188y5tkhfm6o.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fylue2kkp188y5tkhfm6o.jpg" alt="Informatica Master Data Management Website" width="800" height="295"&gt;&lt;/a&gt;&lt;/p&gt;


&lt;p&gt;&lt;a href="https://www.informatica.com/products/master-data-management.html" rel="noopener noreferrer"&gt;Informatica Master Data Management&lt;/a&gt; is a powerful MDM solution that allows organizations to manage, integrate, and govern their data centrally. It enables users to accurately identify, cleanse, and link related data across multiple systems, making operations more efficient and enabling better decision-making.&lt;/p&gt;

&lt;p&gt;Some of the features and strengths of Informatica Multidomain MDM include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Data governance compliance:&lt;/strong&gt; Informatica Multidomain MDM is designed to meet regulatory requirements around &lt;a href="https://www.informatica.com/products/data-governance.html" rel="noopener noreferrer"&gt;data governance&lt;/a&gt; and provides strong data quality and integration capabilities that ensure compliance with data governance standards.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Ability to process big data:&lt;/strong&gt; With Informatica, businesses can build automated data pipelines for AI and advanced analytics thanks to the &lt;a href="https://www.informatica.com/about-us/claire.html" rel="noopener noreferrer"&gt;CLAIRE engine&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AI and machine learning for data enrichment:&lt;/strong&gt; Informatica Multidomain MDM can use AI and machine learning for data enrichment and to help identify areas of potential data quality issues.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Data discovery and profiling:&lt;/strong&gt; Informatica Multidomain MDM has powerful data discovery and profiling tools that can help identify potential data issues and provide insight into data quality.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Data quality management:&lt;/strong&gt; The platform offers robust data quality management capabilities that can help identify and address potential data quality issues, as well as improve the overall quality of data.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Integration:&lt;/strong&gt; Thanks to Informatica’s extensive &lt;a href="https://www.informatica.com/products/cloud-integration/connectivity/connectors.html#t=WWW&amp;amp;sort=%40title%20ascending&amp;amp;numberOfResults=24" rel="noopener noreferrer"&gt;Cloud Connectivity library&lt;/a&gt;, your business can get real-time insights from cloud apps like Salesforce, Marketo, Snowflake, and more.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Licensing and pricing:&lt;/strong&gt; Informatica uses convenient consumption-based pricing called &lt;a href="https://www.informatica.com/products/cloud-integration/pricing.html" rel="noopener noreferrer"&gt;Informatica Processing Unit (IPU) pricing&lt;/a&gt;. This pricing model allows organizations to dynamically scale up/down Informatica's Intelligent Cloud Services depending on their needs and thus optimize their IT budget.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Some drawbacks of Informatica Multidomain MDM include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Cloud support:&lt;/strong&gt; While Informatica MDM supports data from public, private, and hybrid clouds, the platform itself is a &lt;a href="https://www.informatica.com/products/master-data-management.html" rel="noopener noreferrer"&gt;cloud-native SaaS solution&lt;/a&gt;. That’s a drawback for organizations looking to deploy the MDM on-premises or in their own cloud.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;On the other hand, its flexible pricing model and its comprehensive data governance, data processing, and integration capabilities make Informatica an excellent MDM solution for small to large businesses looking for an all-in-one SaaS solution.&lt;/p&gt;

&lt;h2&gt;SAP Master Data Governance&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frtla1z6edk0v4vvh0net.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Frtla1z6edk0v4vvh0net.png" alt="SAP Master Data Governance Website" width="800" height="326"&gt;&lt;/a&gt;&lt;/p&gt;


&lt;p&gt;&lt;a href="https://www.sap.com/products/technology-platform/master-data-governance.html" rel="noopener noreferrer"&gt;SAP Master Data Governance&lt;/a&gt; is an excellent solution for managing corporate master data securely. It allows organizations to easily monitor and manage their data structures in the cloud and on-premises. The platform provides robust data governance features, such as compliance with GDPR and other global data regulations, data discovery and profiling, data quality management, and AI and machine learning for data enrichment. SAP MDG supports on-premises deployments as well as public, private, and hybrid clouds for data integration and sharing.&lt;/p&gt;

&lt;p&gt;Some of the features and strengths of SAP Master Data Governance include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Data governance compliance:&lt;/strong&gt; SAP Master Data Governance &lt;a href="https://www.sap.com/products/technology-platform/master-data-governance/features.html" rel="noopener noreferrer"&gt;excels at data governance compliance&lt;/a&gt; by providing a framework to ensure data is appropriately managed and monitored to meet regulatory requirements. It also offers automated workflows and tasks to ensure that data needed for regulatory compliance is accurately entered within the system.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cloud support:&lt;/strong&gt; SAP Master Data Governance supports data from public, private, and hybrid cloud sources. It also offers a variety of &lt;a href="https://www.sap.com/documents/2018/08/3e8b093f-167d-0010-87a3-c30de2ffd8ff.html" rel="noopener noreferrer"&gt;deployment options&lt;/a&gt; for data sharing and data integration, including on-premise and cloud. This allows customers to choose the best deployment option for their specific needs.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AI and machine learning for data enrichment:&lt;/strong&gt; SAP Master Data Governance provides advanced AI and machine learning capabilities for enriching data. It can use natural language processing and machine learning algorithms to identify patterns and relationships in data, enabling users to gain deeper insights into their data.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Data quality management:&lt;/strong&gt; SAP Master Data Governance offers comprehensive data quality management capabilities. It provides automated checks and validation to ensure that data is accurate, complete, and up to date.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Some drawbacks of SAP Master Data Governance include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Ability to process big data:&lt;/strong&gt; Even though SAP MDG can apply changes to large volumes of data in real time, it’s not optimized for big data processing.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Data discovery and profiling:&lt;/strong&gt; SAP Master Data Governance’s data discovery and profiling capabilities are not as robust as those of some of its enterprise-grade competitors (such as IBM InfoSphere Master Data Management).&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Integration:&lt;/strong&gt; SAP Master Data Governance does not offer comprehensive third-party integration capabilities, leaving customers to build their own custom integration solutions. That said, the platform integrates seamlessly with other SAP applications, such as SAP ECC, S/4HANA, and Business Warehouse. This lets customers quickly and easily synchronize and integrate their master data across other SAP enterprise apps.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Licensing and pricing:&lt;/strong&gt; The platform has &lt;a href="https://www.sap.com/products/technology-platform/master-data-governance/technical-information.html" rel="noopener noreferrer"&gt;two editions&lt;/a&gt;: SAP Master Data Governance on SAP S/4HANA (Private Cloud Edition or on-premise) aimed at large companies and corporations looking for a centralized enterprise-wide MDM solution; and SAP Master Data Governance, cloud edition which offers a somewhat lower entry barrier thanks to a modular MDM approach. However, &lt;a href="https://community.sap.com/topics/master-data-governance/faq" rel="noopener noreferrer"&gt;only the first edition&lt;/a&gt; offers all the data domains, data models, and features, and it’s not an option for companies with a low budget.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It’s pretty clear that SAP Master Data Governance is a product aimed at large companies that prefer a centralized platform focused on data governance and based on the robust SAP ecosystem.&lt;/p&gt;

&lt;h2&gt;Pimcore Open Source Master Data Management&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fttq705brhgvfhex51op2.jpg" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fttq705brhgvfhex51op2.jpg" alt="Pimcore Open Source Master Data Management Website" width="800" height="394"&gt;&lt;/a&gt;&lt;/p&gt;


&lt;p&gt;&lt;a href="https://pimcore.com/en/master-data-management" rel="noopener noreferrer"&gt;Pimcore Master Data Management&lt;/a&gt; is an impressive open-source platform that offers a comprehensive suite of data management capabilities. It’s a great tool for organizations that need to manage large volumes of data, as it enables them to ensure data governance compliance, integrate with multiple systems, and manage data quality. &lt;/p&gt;

&lt;p&gt;As far as compliance goes, Pimcore allows users to create custom data governance policies and audit trails, ensuring that all data is properly managed and secured.&lt;/p&gt;

&lt;p&gt;Some of the features and strengths of Pimcore Master Data Management include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Cloud support:&lt;/strong&gt; Pimcore supports public, private, and hybrid cloud deployments, allowing for greater flexibility and scalability. It also has APIs for integration with cloud storage solutions. Furthermore, thanks to a partnership with Bitnami, your team can install Pimcore using public cloud images for AWS, Google Cloud, Microsoft Azure, or Oracle Cloud Platform.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AI and machine learning for data enrichment:&lt;/strong&gt; Pimcore comes with a flexible and powerful AI and machine learning platform, allowing users to create sophisticated models and algorithms for data enrichment.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Data quality management:&lt;/strong&gt; Pimcore provides a comprehensive set of tools for &lt;a href="https://pimcore.com/en/master-data-management/features/data-quality-and-semantics" rel="noopener noreferrer"&gt;data quality management&lt;/a&gt;, including data profiling, data cleansing, data versioning, and data auditing.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Integration:&lt;/strong&gt; Pimcore supports integration with a wide range of data sources, including databases, messaging systems, and file systems. It also has an &lt;a href="https://pimcore.com/en/master-data-management/features/data-integration-and-delivery" rel="noopener noreferrer"&gt;open API&lt;/a&gt; for integration with third-party applications.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Licensing and pricing:&lt;/strong&gt; Pimcore offers three editions, the free and open-source &lt;a href="https://pimcore.com/en/platform/community-edition" rel="noopener noreferrer"&gt;Community Edition&lt;/a&gt;, the subscription-based software-as-a-service &lt;a href="https://pimcore.com/cloud" rel="noopener noreferrer"&gt;Cloud Edition&lt;/a&gt;, and the on-premises &lt;a href="https://pimcore.com/en/platform/enterprise-edition" rel="noopener noreferrer"&gt;Enterprise Edition&lt;/a&gt;, making it the winner of this showdown in terms of licensing and pricing flexibility.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Some drawbacks of Pimcore Master Data Management include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Data governance compliance:&lt;/strong&gt; While Pimcore supports almost any kind of data, be it unstructured or structured, one of its major drawbacks is its limited out-of-the-box support for &lt;a href="https://pimcore.com/docs/pimcore/current/User_Documentation/Administration_of_Pimcore/Data_Protection_and_GDPR.html" rel="noopener noreferrer"&gt;data privacy regulations and data governance compliance&lt;/a&gt;.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Ability to process big data:&lt;/strong&gt; While Pimcore supports data ingestion from a wide range of sources, it doesn't currently support big data processing natively.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;In short, Pimcore is an ideal choice for small and medium-sized businesses looking for an affordable open-source MDM solution that doesn't lock them into a specific vendor and offers the convenience of doubling as a Product Information Management (PIM), Digital Asset Management (DAM), or digital commerce platform.&lt;/p&gt;

&lt;h2&gt;Ataccama ONE&lt;/h2&gt;

&lt;p&gt;&lt;a href="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhbrxln0t8yvebg69yh9u.png" class="article-body-image-wrapper"&gt;&lt;img src="https://media2.dev.to/dynamic/image/width=800%2Cheight=%2Cfit=scale-down%2Cgravity=auto%2Cformat=auto/https%3A%2F%2Fdev-to-uploads.s3.amazonaws.com%2Fuploads%2Farticles%2Fhbrxln0t8yvebg69yh9u.png" alt="Ataccama Website" width="800" height="337"&gt;&lt;/a&gt;&lt;/p&gt;


&lt;p&gt;&lt;a href="https://www.ataccama.com/" rel="noopener noreferrer"&gt;Ataccama ONE&lt;/a&gt; is a powerful data analytics platform that offers an impressive range of features and capabilities. It provides an excellent level of data governance compliance, with robust security measures to ensure that data is protected and organizations are meeting all their data governance requirements. Ataccama ONE offers support for public, private, and hybrid cloud storage, making it an ideal solution for organizations of all sizes.&lt;/p&gt;

&lt;p&gt;Some of the features and strengths of Ataccama ONE include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Data governance compliance:&lt;/strong&gt; The platform provides &lt;a href="https://www.ataccama.com/solutions/protect-personal-customer-data" rel="noopener noreferrer"&gt;robust tools for data governance and privacy&lt;/a&gt;, including a data catalog, data lineage, data profiling, metadata management, and data stewardship. These capabilities make it easy to comply with data governance regulations and best practices.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Cloud support:&lt;/strong&gt; Ataccama ONE supports two &lt;a href="https://www.ataccama.com/deployment" rel="noopener noreferrer"&gt;deployment options&lt;/a&gt;: Platform as a Service (PaaS), and on-premise/hybrid. The PaaS deployment runs on Amazon AWS and Microsoft Azure, while the on-premise and hybrid deployment runs on Azure, AWS, Google Cloud, and Linux. Organizations can also manage data from multiple cloud sources, whether public, private, or hybrid.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Big data processing:&lt;/strong&gt; ONE offers built-in capabilities for &lt;a href="https://www.ataccama.com/focus/big-data-management" rel="noopener noreferrer"&gt;harnessing big data&lt;/a&gt;. It features an integrated data lake, a scalable data processing engine, and a data integration framework.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;AI and machine learning:&lt;/strong&gt; Ataccama ONE provides comprehensive machine learning and &lt;a href="https://www.ataccama.com/platform/self-driving" rel="noopener noreferrer"&gt;AI capabilities&lt;/a&gt;, including a self-service data enrichment platform, a machine learning library, and an AI-driven data discovery engine.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Data discovery and profiling:&lt;/strong&gt; Ataccama ONE features a strong &lt;a href="https://www.ataccama.com/platform/data-profiling" rel="noopener noreferrer"&gt;data discovery and profiling engine&lt;/a&gt; that can quickly find, profile, and analyze data from multiple sources.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Data quality management:&lt;/strong&gt; &lt;a href="https://www.ataccama.com/platform/data-quality" rel="noopener noreferrer"&gt;Data quality management capabilities&lt;/a&gt; include AI-driven anomaly detection, data cleansing, data validation, data standardization, and data enrichment tools.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;Some drawbacks of Ataccama ONE include:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Integration:&lt;/strong&gt; Ataccama ONE offers a comprehensive &lt;a href="https://www.ataccama.com/platform/data-integration" rel="noopener noreferrer"&gt;data integration framework&lt;/a&gt; that supports data ingestion and replication from multiple sources. However, you should keep in mind that ONE is a fully integrated platform, so if your organization avoids vendor lock-in, it may not be the ideal solution.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Licensing and pricing:&lt;/strong&gt; ONE’s licensing model can be complex and difficult to understand due to its modular nature. This, in turn, makes it difficult to compare its price with that of the competition, since factors such as the type of deployment and the contracted modules come into play.&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;All in all, Ataccama ONE is an excellent choice for organizations with a strong focus on advanced data analytics and big data processing. In fact, the platform markets itself as being specifically tailored for data analysts, data engineers, data scientists, and data stewards.&lt;/p&gt;

&lt;h2&gt;Conclusion&lt;/h2&gt;

&lt;p&gt;The most popular Master Data Management tools all have their own methods of handling data governance compliance, cloud support, big data, AI and machine learning, and so on.&lt;/p&gt;

&lt;div class="table-wrapper-paragraph"&gt;&lt;table&gt;
&lt;thead&gt;
&lt;tr&gt;
&lt;th&gt;MDM Tool&lt;/th&gt;
&lt;th&gt;Strengths&lt;/th&gt;
&lt;th&gt;Drawbacks&lt;/th&gt;
&lt;/tr&gt;
&lt;/thead&gt;
&lt;tbody&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;IBM InfoSphere&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Comprehensive data compliance&lt;/td&gt;
&lt;td&gt;Works best with other IBM products&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Informatica&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Extensive library for third-party integrations&lt;/td&gt;
&lt;td&gt;SaaS, so not possible to deploy on-premises&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;SAP Master Data Governance&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;Excels at data compliance&lt;/td&gt;
&lt;td&gt;Not an option for a low budget&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Pimcore Open Source&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;The clear winner in pricing flexibility&lt;/td&gt;
&lt;td&gt;Limited out-of-the-box compliance support&lt;/td&gt;
&lt;/tr&gt;
&lt;tr&gt;
&lt;td&gt;&lt;strong&gt;Ataccama ONE&lt;/strong&gt;&lt;/td&gt;
&lt;td&gt;A fully integrated platform designed especially for data scientists and data engineers&lt;/td&gt;
&lt;td&gt;Not for those avoiding vendor lock-in&lt;/td&gt;
&lt;/tr&gt;
&lt;/tbody&gt;
&lt;/table&gt;&lt;/div&gt;

&lt;p&gt;But obviously, each is tailored to certain needs; IBM InfoSphere Master Data Management and SAP Master Data Governance have proven more suitable for large companies, while Informatica Multidomain MDM, Pimcore Open Source Master Data Management, and Ataccama ONE are excellent alternatives for small to medium-sized companies looking for more affordable solutions.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://eu1.hubs.ly/H02_VzV0" rel="noopener noreferrer"&gt;GeoPostcodes&lt;/a&gt; is an example of one of the data sources you’ll probably use to improve your master data quality. A common data type that most companies use to inform their logistics, sales, and marketing endeavors is postal data—think zip codes, international postcodes, streets and boundaries. GeoPostcodes offers the most comprehensive enterprise postal code database in the world. Implement it easily with your systems of choice thanks to its &lt;a href="https://eu1.hubs.ly/H02_VBx0" rel="noopener noreferrer"&gt;ERP connector&lt;/a&gt;.&lt;/p&gt;

</description>
      <category>gratitude</category>
    </item>
    <item>
      <title>The Quickest Way to Build an Address Autocomplete API</title>
      <dc:creator>Jérôme Urbain</dc:creator>
      <pubDate>Mon, 27 Feb 2023 14:21:30 +0000</pubDate>
      <link>https://dev.to/jerome_urbain/the-quickest-way-to-build-an-address-autocomplete-api-3o1k</link>
      <guid>https://dev.to/jerome_urbain/the-quickest-way-to-build-an-address-autocomplete-api-3o1k</guid>
      <description>&lt;p&gt;An &lt;em&gt;address autocomplete&lt;/em&gt; is an address form feature that’s kind of just what it says on the tin: it automatically suggests addresses to users as they type into an online form. It’s an elegant way to ensure accuracy as a user inputs all the components of a complete address, like country, city, postal code, and street.&lt;/p&gt;

&lt;p&gt;But building a form to capture an address with auto-complete and validation can be quite a challenge. What should the form ask for first: the postal code or the region? Which data input should use auto-complete? Which reliable data source should be used? How to optimize UI responsiveness?&lt;/p&gt;

&lt;p&gt;In this article, you’ll learn everything required to build a backend API for address completion.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;The technicalities of building an API backend and frontend are beyond the scope of this article. They’re too dependent on your chosen programming language. As a personal recommendation, I favor React with &lt;a href="https://mui.com/"&gt;Material UI&lt;/a&gt; for the frontend and Node.js with &lt;a href="https://expressjs.com/"&gt;Express&lt;/a&gt; for the backend. But you will need some solid technical knowledge about building an API backend and frontend before you read on.&lt;/p&gt;
&lt;/blockquote&gt;
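&lt;p&gt;Before getting into the details, here’s the core computation such a backend performs: given the user’s partial input, return matching address suggestions. The sketch below is plain Node.js with a tiny hypothetical in-memory dataset; a real implementation would query your address database.&lt;/p&gt;

```javascript
// Core of an address autocomplete endpoint: given what the user has typed
// so far, return matching suggestions. The in-memory dataset and field
// names are hypothetical; a real backend would query an address database.
const addresses = [
  { street: 'Rue de la Loi', city: 'Brussels', postalCode: '1000', country: 'BE' },
  { street: 'Rue de la Paix', city: 'Paris', postalCode: '75002', country: 'FR' },
  { street: 'Main Street', city: 'Springfield', postalCode: '62701', country: 'US' },
];

function suggest(query, limit = 5) {
  const q = query.trim().toLowerCase();
  // Require at least two characters to avoid returning huge result sets.
  if (q.length > 1) {
    return addresses
      .filter(a =>
        [a.street, a.city, a.postalCode].some(field =>
          field.toLowerCase().startsWith(q)
        )
      )
      .slice(0, limit);
  }
  return [];
}
```

&lt;p&gt;In an Express backend like the one recommended above, this function would sit behind a route such as &lt;code&gt;GET /autocomplete?q=rue&lt;/code&gt;, with the frontend calling it on each (debounced) keystroke.&lt;/p&gt;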

&lt;h2&gt;What Does an Address Autocomplete Feature Actually Look Like?&lt;/h2&gt;

&lt;p&gt;To expound on my earlier definition, an address autocomplete feature generates suggestions based on previously entered information as the user enters data into the form. The user can easily choose from a list of options, ensuring the accuracy of the input address. This is especially useful when someone needs to enter a long or complex address or when the user isn’t familiar with a specific address format.&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--eQmmyydf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jb9yy8b0mnf9c9b1lv3o.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--eQmmyydf--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/jb9yy8b0mnf9c9b1lv3o.png" alt="Address autocomplete API" width="880" height="876"&gt;&lt;/a&gt;&lt;br&gt;
&lt;br&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Why Is Address Autocomplete Important?
&lt;/h2&gt;

&lt;p&gt;An efficient address autocompletion form on your website prevents you from ingesting inaccurate data, like typos or nonexistent addresses. If you don’t allow free entry, you could also lose sign-ups or orders; but allowing free entry worsens data quality even further.&lt;/p&gt;

&lt;p&gt;Ensuring data quality right from the start is critical for proper data processing. Without it, you risk having incorrect shipping addresses, which can cause delays, extra costs, warnings in your billing system, or incorrect recommendations from a store locator feature—all bad for your business and bad for user experience.&lt;/p&gt;

&lt;p&gt;But the feature isn’t just about preventing negative events. Address autocompletion can have quite a positive impact simply by making things easier on your users, like when someone has a time constraint and needs to complete a purchase quickly. It will also increase the overall quality of all the addresses you collect in your database and therefore decrease your operational costs.&lt;br&gt;
&lt;br&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Building and Implementing an Address Autocomplete API
&lt;/h2&gt;

&lt;p&gt;For the purposes of this article, I’m going to assume that you’re using a datasource with a structure similar to this:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Countries:&lt;/strong&gt; Lists all countries and their corresponding ISO code. This table should also include metadata describing which data is available, such as whether streets are missing or whether a country has no postal codes at all.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Places:&lt;/strong&gt; Linked to a country, lists the cities and towns.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Postal codes:&lt;/strong&gt; Linked to a place, lists the related postal codes. Note that one postal code can be included in more than one place.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;Streets:&lt;/strong&gt; Linked to a place and a postal code, lists all related streets. Note that the same street can cross multiple places or postal codes.&lt;/li&gt;
&lt;/ul&gt;
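&lt;p&gt;To make the structure concrete, here’s a tiny, entirely hypothetical sample of those four linked tables as plain JavaScript objects. Field names like &lt;code&gt;hasPostcodes&lt;/code&gt; and &lt;code&gt;placeId&lt;/code&gt; are my own illustration, not a prescribed schema:&lt;/p&gt;

```javascript
// Hypothetical sample rows illustrating the four linked tables.
// Field names (iso, hasPostcodes, placeId, ...) are illustrative only.
const countries = [
  { iso: 'BE', name: 'Belgium', hasPostcodes: true, hasStreets: true },
  { iso: 'IE', name: 'Ireland', hasPostcodes: true, hasStreets: false },
];

const places = [
  { placeId: 1, iso: 'BE', name: 'Brussels' },
  { placeId: 2, iso: 'BE', name: 'Antwerp' },
];

// One postal code can be linked to more than one place.
const postcodes = [
  { postcodeId: 10, placeId: 1, code: '1000' },
  { postcodeId: 11, placeId: 2, code: '2000' },
];

// The same street can cross multiple places or postal codes.
const streets = [
  { streetId: 100, placeId: 1, postcodeId: 10, name: 'Rue de la Loi' },
];

// Walking the links: resolve a street back to its country.
function countryOfStreet(street) {
  const place = places.find((p) => p.placeId === street.placeId);
  return countries.find((c) => c.iso === place.iso);
}
```

&lt;p&gt;In a real backend these would of course be database tables with foreign keys, but the relationships are the same.&lt;/p&gt;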

&lt;p&gt; &lt;/p&gt;

&lt;h3&gt;
  
  
  Keep a Few Tips in Mind
&lt;/h3&gt;

&lt;p&gt;For building and apartment numbers, I work on the assumption that the datasource doesn’t provide accurate information. It’s truly difficult to have a complete and accurate source of house numbers, so I advise you to leave that as a free input. Otherwise your captured data quality will probably be impaired.&lt;/p&gt;

&lt;p&gt;Note that some countries don’t have postal codes or your datasource might not have accurate data for that country. In that case, just show a free input for the fields that are missing data.&lt;/p&gt;

&lt;p&gt;The form built in this tutorial calls a backend API that’s linked to a referential database. The API sends back and validates the data from the source, and provides autocomplete where necessary. It’s important to remember that the quality of the result depends a lot on the quality of the reference datasource. You can read the section Ensuring the Quality of Your Datasource to learn more about this.&lt;br&gt;
&lt;br&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Choose Your Field Order
&lt;/h3&gt;

&lt;p&gt;There are two possible ways to approach building this form, depending on the order in which you have users fill the fields:&lt;/p&gt;

&lt;p&gt;&lt;strong&gt;1. Country -&amp;gt; Postal code -&amp;gt; Locality -&amp;gt; Street.&lt;/strong&gt; This approach is easier technically, but some users might prefer to see a filtered list of postal codes.&lt;br&gt;
&lt;strong&gt;2. Country -&amp;gt; Locality -&amp;gt; Postal code -&amp;gt; Street.&lt;/strong&gt; This is a more traditional approach.&lt;/p&gt;

&lt;p&gt;I actually recommend combining the two approaches in one form and letting the user fill in whichever field they want first. The country should still be enforced as the first field because it’s information all users can provide quickly and correctly. It also narrows down the possible choices and lets you hide or show certain fields. As mentioned earlier, some countries don’t even have postal codes, so you wouldn’t need to show a postal code field for them.&lt;/p&gt;

&lt;p&gt;The final result for Approach 1 should look like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Z7Pm5-Ja--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/rKUabna.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Z7Pm5-Ja--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/rKUabna.png" alt="Autofilling data with postal code first - Approach 1" width="636" height="768"&gt;&lt;/a&gt;&lt;br&gt;
&lt;br&gt;&lt;br&gt;
The final result for Approach 2 should look like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--Y37fKQXB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/Sh1hh81.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--Y37fKQXB--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/Sh1hh81.png" alt="Autofilling data with places first - Approach 2" width="650" height="796"&gt;&lt;/a&gt;&lt;br&gt;
&lt;br&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Display the Data in the Form
&lt;/h3&gt;

&lt;p&gt;For some endpoints, the data is returned based on what the user is currently typing. To avoid hitting your API endpoint too often, I recommend a short debounce delay of around 200 ms. In that time, the user might continue typing their query; by introducing a delay, you avoid sending useless queries to the backend.&lt;/p&gt;

&lt;p&gt;A convenient way to implement that functionality is to use the &lt;a href="https://www.learnrxjs.io/learn-rxjs/operators/filtering/debouncetime"&gt;RxJS debounceTime feature&lt;/a&gt;.&lt;br&gt;
&lt;br&gt;&lt;/p&gt;
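&lt;p&gt;If you’re not using RxJS, the same behavior can be sketched in a few lines of plain JavaScript. This is a minimal sketch; &lt;code&gt;fetchSuggestions&lt;/code&gt; is a stand-in for your real API call:&lt;/p&gt;

```javascript
// Minimal debounce: wait until `delayMs` ms pass without a new call,
// then invoke `fn` with the latest arguments only.
function debounce(fn, delayMs) {
  let timer = null;
  return (...args) => {
    clearTimeout(timer);
    timer = setTimeout(() => fn(...args), delayMs);
  };
}

// `fetchSuggestions` is a placeholder for the real backend call.
const fetchSuggestions = (text) => console.log('querying backend for:', text);

// Only the last value typed within the delay window triggers a query.
const onType = debounce(fetchSuggestions, 200);
onType('B');
onType('Br');
onType('Bru'); // only this one reaches the backend
```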

&lt;h3&gt;
  
  
  Build the API
&lt;/h3&gt;

&lt;p&gt;Here I’ll describe all the endpoints you need to serve to have a complete API. But first, let me share a few quick insights that I’ve found helpful:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;If a path contains a colon (&lt;code&gt;:&lt;/code&gt;), that indicates a parameter that will be reused in the query.&lt;/li&gt;
&lt;li&gt;For POST requests, the content’s structure is described as the &lt;em&gt;payload&lt;/em&gt; in a JSON format.&lt;/li&gt;
&lt;li&gt;For all the API endpoints doing an autocomplete, don’t return anything if the user hasn’t started typing. It would make little sense in terms of user experience.&lt;/li&gt;
&lt;li&gt;When filtering on a user’s input, always do a case-insensitive search with wildcards before and after.&lt;/li&gt;
&lt;li&gt;Queries that might return a lot of data should be limited to, say, 100 results to improve both query performance and response time.&lt;/li&gt;
&lt;li&gt;Consider that important entities like postal code and place will have a unique identifier. Depending on your datasource, the structure might slightly differ.&lt;/li&gt;
&lt;li&gt;Depending on your datasource’s license, you might want to implement a per-user throttling and quota mechanism to prevent all your data from being scraped.&lt;/li&gt;
&lt;/ul&gt;

&lt;h4&gt;
  
  
  Approaches 1 and 2 — Country
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Endpoint:&lt;/strong&gt; &lt;code&gt;GET /countries&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;This returns the list of countries without any filtering. The list is short, so it can be loaded entirely on the client side.&lt;/p&gt;

&lt;p&gt;Again, remember that some countries don’t have postal codes and your datasource might not have street data for all the countries. This is a good place to return that information and simply show a free text field for the missing data.&lt;br&gt;
&lt;br&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Approach 1 — Postal codes from Country
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Endpoint:&lt;/strong&gt; &lt;code&gt;GET /countries/:iso/postcodes/:postcode&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;This returns postal code suggestions for a given country, filtered on the partial postal code the user has typed.&lt;br&gt;
&lt;br&gt;&lt;/p&gt;
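&lt;p&gt;Putting the earlier tips together (no results for an empty query, case-insensitive matching with wildcards on both sides, a cap of 100 rows), the core filtering behind an endpoint like this might be sketched as follows. The &lt;code&gt;postcodesByCountry&lt;/code&gt; object is a hypothetical stand-in for your real data access layer:&lt;/p&gt;

```javascript
// Generic autocomplete filter: case-insensitive substring match,
// empty input returns nothing, results capped at 100 by default.
function autocomplete(items, query, limit = 100) {
  if (!query) return []; // don't suggest anything until the user types
  const needle = query.toLowerCase();
  return items
    .filter((item) => item.toLowerCase().includes(needle))
    .slice(0, limit);
}

// Hypothetical data access: postal codes keyed by country ISO code.
const postcodesByCountry = {
  BE: ['1000', '1030', '2000', '9000'],
};

// GET /countries/:iso/postcodes/:postcode then boils down to:
function suggestPostcodes(iso, partial) {
  return autocomplete(postcodesByCountry[iso] ?? [], partial);
}
```

&lt;p&gt;The same filter can back the other autocomplete endpoints, only the underlying dataset changes.&lt;/p&gt;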

&lt;h4&gt;
  
  
  Approach 1 — Places from Postcode
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Endpoint:&lt;/strong&gt; &lt;code&gt;GET /postcodes/:postcodeId/places&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Given a postal code, the API will return all the associated places.&lt;br&gt;
&lt;br&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Approach 2 — Places from Country
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Endpoint:&lt;/strong&gt; &lt;code&gt;POST /countries/:iso/places&lt;/code&gt;&lt;br&gt;
&lt;strong&gt;Payload:&lt;/strong&gt; &lt;code&gt;{ place }&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;This returns the places in a country, filtered on the partial place name entered by the user.&lt;br&gt;
&lt;br&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Approach 2 — Postal codes from Place
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Endpoint:&lt;/strong&gt; &lt;code&gt;GET /places/:placeId/postcodes/:postcode&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;If the user chose a place from a country, this endpoint must be used to autocomplete the postal code.&lt;br&gt;
&lt;br&gt;&lt;/p&gt;

&lt;h4&gt;
  
  
  Approaches 1 and 2 — Street from Place
&lt;/h4&gt;

&lt;p&gt;&lt;strong&gt;Endpoint:&lt;/strong&gt; &lt;code&gt;POST /places/:placeId/streets&lt;/code&gt;&lt;br&gt;
&lt;strong&gt;Payload:&lt;/strong&gt; &lt;code&gt;{ postcodeId, street }&lt;/code&gt;&lt;/p&gt;

&lt;p&gt;Note that both the place and the postal code are needed to correctly filter the streets. This of course depends on your datasource, but combining both usually offers the most accurate discriminator.&lt;br&gt;
&lt;br&gt;&lt;/p&gt;
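&lt;p&gt;A minimal sketch of that double filter, using hypothetical rows and field names (&lt;code&gt;placeId&lt;/code&gt;, &lt;code&gt;postcodeId&lt;/code&gt;):&lt;/p&gt;

```javascript
// Streets linked to both a place and a postal code (hypothetical rows).
const streets = [
  { placeId: 1, postcodeId: 10, name: 'Rue de la Loi' },
  { placeId: 1, postcodeId: 11, name: 'Rue Neuve' },
  { placeId: 2, postcodeId: 10, name: 'Rue de la Loi' }, // same street, other place
];

// POST /places/:placeId/streets with payload { postcodeId, street }:
// filter on both keys, then autocomplete on the partial street name.
function suggestStreets(placeId, postcodeId, partial, limit = 100) {
  if (!partial) return [];
  const needle = partial.toLowerCase();
  return streets
    .filter((s) => s.placeId === placeId && s.postcodeId === postcodeId)
    .filter((s) => s.name.toLowerCase().includes(needle))
    .slice(0, limit)
    .map((s) => s.name);
}
```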

&lt;h2&gt;
  
  
  Improving Your Address Autocomplete API
&lt;/h2&gt;

&lt;p&gt;The autocomplete mechanism presented in this article only works on perfect matches. That’s rarely good enough, so you’ll most likely want to improve on what we’ve already got here.&lt;br&gt;
 &lt;br&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Implement Full Text Search
&lt;/h3&gt;

&lt;p&gt;Users are going to make spelling mistakes. To alleviate that in simple projects, you can use trigrams or the full text search capability of your DBMS. &lt;/p&gt;
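&lt;p&gt;Normally you’d rely on your DBMS for this (PostgreSQL’s &lt;code&gt;pg_trgm&lt;/code&gt; extension, for instance), but the underlying idea is simple enough to sketch in JavaScript: split each string into three-character windows and score the overlap, so small typos still match reasonably well:&lt;/p&gt;

```javascript
// Build the set of trigrams (3-character windows) of a string,
// padded with spaces roughly the way pg_trgm does.
function trigrams(s) {
  const padded = `  ${s.toLowerCase()} `;
  const grams = new Set();
  for (let i = 0; i <= padded.length - 3; i++) grams.add(padded.slice(i, i + 3));
  return grams;
}

// Jaccard-style similarity: shared trigrams over total distinct trigrams.
function similarity(a, b) {
  const ta = trigrams(a);
  const tb = trigrams(b);
  let shared = 0;
  for (const g of ta) if (tb.has(g)) shared++;
  return shared / (ta.size + tb.size - shared);
}
```

&lt;p&gt;Ranking candidates by this score instead of requiring an exact substring match makes the autocomplete tolerant of misspellings.&lt;/p&gt;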

&lt;p&gt;For more complex solutions you can use a dedicated search engine such as &lt;a href="https://www.elastic.co/elasticsearch/"&gt;Elasticsearch&lt;/a&gt;, &lt;a href="https://opensearch.org/"&gt;OpenSearch&lt;/a&gt;, or &lt;a href="https://typesense.org/"&gt;Typesense&lt;/a&gt;.&lt;br&gt;
&lt;br&gt;&lt;/p&gt;

&lt;h3&gt;
  
  
  Use Additional Datasets
&lt;/h3&gt;

&lt;p&gt;Additional data can significantly enhance the user experience.&lt;/p&gt;

&lt;p&gt;For example, consider how &lt;em&gt;weighted data&lt;/em&gt; can improve the autocompletion. The most important places get the heaviest weights, which sorts the results in an order that makes sense to the user while aiding disambiguation. With correctly weighted data, &lt;code&gt;Boston (Massachusetts)&lt;/code&gt; would appear first in a list of the many US cities named Boston.&lt;/p&gt;
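&lt;p&gt;A sketch of how such weights could drive sorting; the weights below are made up for illustration:&lt;/p&gt;

```javascript
// Hypothetical weighted places: heavier weight = more important place.
const bostons = [
  { name: 'Boston', region: 'Georgia', weight: 3 },
  { name: 'Boston', region: 'Massachusetts', weight: 95 },
  { name: 'Boston', region: 'Kentucky', weight: 1 },
];

// Sort matches by descending weight so the best-known place comes first.
function rankByWeight(places) {
  return [...places].sort((a, b) => b.weight - a.weight);
}
```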

&lt;p&gt;Another interesting improvement would be grouping places by the name people expect to find. For example, London doesn’t actually exist as a place within the fine-grained English administrative divisions, but is rather a group of boroughs. Your users will probably still be looking for &lt;code&gt;London&lt;/code&gt;, though.&lt;br&gt;
&lt;br&gt;&lt;/p&gt;

&lt;h2&gt;
  
  
  Ensuring the Quality of Your Datasource
&lt;/h2&gt;

&lt;p&gt;The quality of the datasource used to build your API is critical to the success of your autocomplete feature. Otherwise you’ll gather incomplete information at best and inaccurate information at worst. If a user enters data not present in your dataset, and you’ve made the use of your data mandatory, you could even lose the sign-up or the order, decreasing your overall conversion rates. This can be mitigated by allowing free text to be entered but will worsen the captured data quality.&lt;/p&gt;

&lt;p&gt;The datasource you work with needs to include an up-to-date, exhaustive list of valid postal codes and places for all your target countries. Streets can be left optional, but having a complete list of streets definitely helps. As it’s difficult to always have an up-to-date list of valid addresses, it’s best to combine both approaches: try autocomplete first, then allow free text if nothing is found.&lt;/p&gt;

&lt;p&gt;You’ve got two main options for your sources.&lt;/p&gt;

&lt;p&gt;If you’d like to use free or open-source data providers, we recommend using a combination of &lt;a href="https://www.geonames.org/"&gt;GeoNames&lt;/a&gt; and &lt;a href="https://www.openstreetmap.org"&gt;OpenStreetMap&lt;/a&gt;. GeoNames links places to postal codes for 96 countries, while OpenStreetMap offers street coverage for most countries.&lt;/p&gt;

&lt;p&gt;If you’re willing to invest in a paid solution, I’d recommend our &lt;a href="https://eu1.hubs.ly/H02_gy00"&gt;GeoPostcodes&lt;/a&gt; database. It contains places for all countries in the world, postal codes for all of the countries using them, and streets for a third of the countries. It’s updated on a weekly basis, relying on more than 1,500 sources. You would benefit from the fact that postal codes and streets are already linked in a unified structure. Based on our experience, it would increase the quality of your API and its speed of implementation. &lt;a href="https://eu1.hubs.ly/H02_gyb0"&gt;Browse our datasets&lt;/a&gt; yourself and access our map explorer for free.&lt;/p&gt;

&lt;p&gt;The API described in this tutorial would benefit from GeoPostcodes products like:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;
&lt;strong&gt;Our &lt;a href="https://eu1.hubs.ly/H02_gy90"&gt;postal normalized street database&lt;/a&gt;.&lt;/strong&gt; It includes alternative place names in foreign languages.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://eu1.hubs.ly/H02_7vL0"&gt;Town weights&lt;/a&gt;.&lt;/strong&gt; This improves results sorting.&lt;/li&gt;
&lt;li&gt;
&lt;strong&gt;&lt;a href="https://eu1.hubs.ly/H02_gXg0"&gt;City tags&lt;/a&gt;.&lt;/strong&gt; You can natively incorporate large cities and metropoles in the search.

&lt;/li&gt;
&lt;/ul&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;Clearly, having an autocomplete address form is crucial to a lot of business processes, like logistics or online sales. It vastly improves data quality and ensures correct data processing. With all the information covered here—what it looks like to start with postal code or place for your form, and why you’ve got to have a reliable, updated datasource—you should be well on your way to creating the perfect form to capture accurate addresses.&lt;/p&gt;

</description>
    </item>
    <item>
      <title>How to Build a Zip Code to Time Zone Database in Five Steps</title>
      <dc:creator>Jérôme Urbain</dc:creator>
      <pubDate>Fri, 17 Feb 2023 09:43:37 +0000</pubDate>
      <link>https://dev.to/jerome_urbain/how-to-build-a-zip-code-to-time-zone-database-in-five-steps-2ij8</link>
      <guid>https://dev.to/jerome_urbain/how-to-build-a-zip-code-to-time-zone-database-in-five-steps-2ij8</guid>
      <description>&lt;p&gt;Timing matters. Whether you want to remind customers about reservations, run a survey, or send marketing messages, contacting people at the optimal time is a key to business success. In the case of call centers, timing can even be critical, when you really need live interaction with the recipient. &lt;/p&gt;

&lt;p&gt;But knowing the proper time to call can be tricky when you’re working with a large user base. Your recipients may be spread over a range of time zones, and it’s not always straightforward to gather that data from standard user information. Some large countries (e.g., USA, Canada, Mexico, Brazil, Russia) span numerous time zones; you’ll need more than just a country of residence to determine the time zone of their residents.&lt;/p&gt;

&lt;p&gt;In this article, we’ll see how you can infer time zones from international postcodes. Zip codes (also known as postal codes or postcodes) are an element of standard address formatting. They’re also frequently included in user information, making zip codes a good candidate for determining time zones.&lt;/p&gt;

&lt;p&gt; &lt;/p&gt;

&lt;h2&gt;
  
  
  What Is Zip Code to Time Zone Mapping?
&lt;/h2&gt;

&lt;p&gt;Zip code to time zone mapping is the process of associating each postal code with its correct time zone. The end goal is to make it easy to reach out to any contact at a suitable time.&lt;/p&gt;

&lt;p&gt;This requires access to:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Zip codes with additional information (administrative divisions, coordinates, etc.)&lt;/li&gt;
&lt;li&gt;Time zone rules&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;When mapping zip codes to time zones across a single country, it’s usually feasible to work with a set of time zone rules (e.g., California is in time zone &lt;code&gt;America/Los_Angeles&lt;/code&gt;, while New York is in time zone &lt;code&gt;America/New_York&lt;/code&gt;). However, this can already require some multi-level processing in order to build accurate mapping. For instance, states and even counties in the USA can include more than one time zone. As a result, you would need to find a source that maps zip codes to states/counties, assign time zones to those states/counties, and then encode the exceptions.&lt;/p&gt;

&lt;p&gt;Of course, rules get more complicated when you target international data, which is the purpose of this article. So, we will not work with time zone rules, but use a more convenient method for international data, focusing on geographical matching between postcodes and time zones.&lt;/p&gt;

&lt;p&gt; &lt;/p&gt;

&lt;h2&gt;
  
  
  Assigning Time Zones to Zip Codes
&lt;/h2&gt;

&lt;p&gt;Let’s dive into the practicalities of assigning time zones to zip codes. In this section, we’ll talk about where to find the base data (the postcodes and time zones we’ll work with), how to upload the data to a common framework, and finally how to use both datasets to assign a time zone to every postcode.&lt;/p&gt;

&lt;p&gt; &lt;/p&gt;

&lt;h3&gt;
  
  
  Sourcing Postcode Data
&lt;/h3&gt;

&lt;p&gt;You first need to identify a good source of international postcodes, with geographical information. Free candidates include &lt;a href="https://www.geonames.org/"&gt;GeoNames&lt;/a&gt; and &lt;a href="https://www.openstreetmap.org/#map=4/38.01/-95.84"&gt;OpenStreetMap&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;From those sources, you can access postal codes for a number of countries (and even more if you combine them) along with their latitude and longitude. This tutorial will use GeoNames.&lt;/p&gt;

&lt;p&gt; &lt;/p&gt;

&lt;h3&gt;
  
  
  Sourcing Time Zone Data
&lt;/h3&gt;

&lt;p&gt;The most widely used time zone data is published by the &lt;a href="https://www.iana.org/"&gt;Internet Assigned Numbers Authority&lt;/a&gt;. It includes a set of rules and code that anyone can process, not only to capture the different time zones, but also to access the past and future time switches from Standard Time (sometimes referred to as “winter time”) to Daylight Saving Time (“summer time”) in relevant time zones.&lt;/p&gt;

&lt;p&gt;For the sake of our geographical mapping, we’ll use a map of the time zones as maintained by the &lt;a href="https://github.com/evansiroky/timezone-boundary-builder"&gt;Timezone Boundary Builder&lt;/a&gt; project, which releases high-quality maps of the official IANA time zones. You can download the &lt;a href="https://github.com/evansiroky/timezone-boundary-builder/releases"&gt;&lt;code&gt;geojson&lt;/code&gt; file from the Releases page&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt; &lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Take the file with oceans if you want to cover the whole earth. Take the one without if you don’t want to assign a time zone to points that would fall into oceans (indicating coordinates are likely wrong).&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt; &lt;/p&gt;

&lt;p&gt;The world time zones map, according to the &lt;a href="https://github.com/evansiroky/timezone-boundary-builder/releases/tag/2022g"&gt;IANA 2022g release&lt;/a&gt;, looks like this:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--hrZx9OOF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/13te5507jn89we7bgt35.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--hrZx9OOF--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://dev-to-uploads.s3.amazonaws.com/uploads/articles/13te5507jn89we7bgt35.png" alt="Image description" width="880" height="537"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt; &lt;/p&gt;

&lt;p&gt;However, the map only includes the zone name. In order to use the zones, you’ll need more information (such as the Standard and Daylight Saving Times of each time zone, and so on). To obtain additional details like that, you can process the official IANA time zone files. If you don’t want to run that, or extract data from Wikipedia (scraping or extracting &lt;a href="https://en.wikipedia.org/wiki/List_of_tz_database_time_zones"&gt;this table&lt;/a&gt;), you can get a consolidated &lt;a href="https://eu1.hubs.ly/H02WKj10"&gt;CSV file from GeoPostcodes&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt; &lt;/p&gt;

&lt;h3&gt;
  
  
  Uploading the Data to a Common Database
&lt;/h3&gt;

&lt;p&gt;To combine the data sources, you’ll benefit from uploading them to a common framework. While it’s perfectly possible to handle the processes with widespread programming languages and libraries (e.g. Python and &lt;a href="https://geopandas.org/en/stable/"&gt;GeoPandas&lt;/a&gt;), we’ll use a database to store and query our data in this tutorial.&lt;/p&gt;

&lt;p&gt;Our preference goes to PostgreSQL with the PostGIS extension, a leading open-source database engine with state-of-the-art geographical data processing features. It offers all the functionalities you need in this case:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Uploading the different data sources&lt;/li&gt;
&lt;li&gt;Matching them using geographical attributes or other keys&lt;/li&gt;
&lt;li&gt;Storing, querying, and exporting the results&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;It also natively handles time zones, including the IANA zone names, making it a great choice if you want to manage your time data in the same framework as your postcodes data.&lt;/p&gt;

&lt;p&gt;First, set up a PostgreSQL/PostGIS database. There are numerous tutorials available for that, but &lt;a href="https://freegistutorial.com/how-to-install-postgis-on-ubuntu-22-04/"&gt;we suggest this one&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;Once you have the database set up, you can upload your different sources to it. We’re assuming in this tutorial that you’ll upload everything to a schema named &lt;code&gt;timezones&lt;/code&gt;. &lt;/p&gt;

&lt;p&gt;All the commands we show here can be run from any SQL client (like &lt;a href="https://www.pgadmin.org/"&gt;pgAdmin&lt;/a&gt; or &lt;a href="https://dbeaver.io/"&gt;DBeaver&lt;/a&gt;), but also using the &lt;a href="https://www.postgresql.org/docs/current/app-psql.html"&gt;psql&lt;/a&gt; interactive terminal.&lt;/p&gt;

&lt;p&gt;The general psql command syntax looks like this:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;psql -tA -U #username -d #database -h #host -c "#sql_command"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;It’s useful to &lt;a href="https://www.postgresql.org/docs/current/libpq-pgpass.html"&gt;set up a password file&lt;/a&gt; to avoid being prompted for your password every time.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt; &lt;/p&gt;

&lt;p&gt;Now, it’s time to upload your files. You’ll need to unzip all the files to a common directory (&lt;code&gt;/home/timezones&lt;/code&gt;, in this tutorial). Then, you can upload their data to your PostgreSQL database with the following commands.&lt;/p&gt;

&lt;p&gt; &lt;/p&gt;

&lt;h4&gt;
  
  
  Geonames.csv
&lt;/h4&gt;

&lt;p&gt;First, create a table to host the data:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;CREATE TABLE timezones.geonames_postcodes(
iso char(2), 
postcode varchar(20), 
place varchar(180), 
adm1_name varchar(80), 
adm1_code varchar(20), 
adm2_name varchar(80), 
adm2_code varchar(20), 
adm3_name varchar(80), 
adm3_code varchar(20), 
lat float, 
lng float, 
geo_accuracy int);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt; &lt;/p&gt;

&lt;p&gt;Now we can upload the data, here using the &lt;code&gt;\copy&lt;/code&gt; command from psql:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;psql -tA -U #username -d #database -h #host -c "\copy timezones.geonames_postcodes FROM /home/timezones/allCountries.txt CSV HEADER DELIMITER AS E'\t' ;"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt;Alternatively, you can also use the &lt;code&gt;COPY&lt;/code&gt; from a &lt;a href="https://www.postgresql.org/docs/current/sql-copy.html"&gt;SQL client&lt;/a&gt;, and most clients offer functionalities to upload a CSV directly from the graphical interface.&lt;/p&gt;

&lt;p&gt; &lt;/p&gt;

&lt;h4&gt;
  
  
  IANA geojson
&lt;/h4&gt;

&lt;p&gt;Leverage the &lt;code&gt;ogr2ogr&lt;/code&gt; GDAL command to directly upload the &lt;code&gt;geojson&lt;/code&gt; file with time zone polygons to our database:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ogr2ogr -f "PostgreSQL" -lco SCHEMA=timezones PG:"dbname=postgres user=#user" combined.json -nln timezone_polygons
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt; &lt;/p&gt;

&lt;h4&gt;
  
  
  Time Zone Information
&lt;/h4&gt;

&lt;p&gt;You can upload the file available from GeoPostcodes’s website, &lt;code&gt;GPC-TIMEZONES.csv&lt;/code&gt;, with the following commands. First create a table, then upload the CSV:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;CREATE TABLE timezones.timezones_info(
iso char(2), 
country text, 
timezone text, 
std_offset char(6), 
dst_offset char(6), 
zone_abbrevation char(10), 
zone_alt_name text);
psql -tA -U #username -d #database -h #host -c "\copy timezones.timezones_info FROM /home/timezones/GPC-TIMEZONES.csv CSV HEADER DELIMITER AS ';' ;"
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt; &lt;/p&gt;

&lt;h3&gt;
  
  
  Preparing Geographical Data
&lt;/h3&gt;

&lt;p&gt;Once you’ve uploaded the necessary data, it’s a good idea to add a spatial index on the polygon database to speed up the spatial joins:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;CREATE INDEX timezones_geom_idx on timezones.timezone_polygons USING SPGIST(wkb_geometry);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt; &lt;/p&gt;

&lt;p&gt;You’ll also benefit from creating a geometry column in the Geonames table, which you can then invoke for the spatial joins:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;ALTER TABLE timezones.geonames_postcodes ADD COLUMN geom geometry(point,4326);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt; &lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;UPDATE timezones.geonames_postcodes SET geom=ST_SetSRID(ST_MakePoint(lng,lat),4326);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt; &lt;/p&gt;

&lt;h3&gt;
  
  
  Linking Postcodes and Time Zones with Geographical Matching
&lt;/h3&gt;

&lt;p&gt;Now that all your data is stored in the PostgreSQL database, leverage the geographical attributes in the &lt;code&gt;geonames_postcodes&lt;/code&gt; and &lt;code&gt;timezone_polygons&lt;/code&gt; tables to join them. Use the coordinates of every postcode and check which time zone polygon they fall in.&lt;/p&gt;

&lt;p&gt;You can perform this with the following query, creating a new table to store your results:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;CREATE TABLE timezones.postcodes_timezones AS SELECT g.iso, g.postcode, t.tzid FROM timezones.geonames_postcodes g JOIN timezones.timezone_polygons t on ST_Intersects(g.geom,t.wkb_geometry);
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt; &lt;/p&gt;

&lt;p&gt;After this process, some postcodes might be mapped to several time zones. This can happen either because time zone polygons overlap (it’s infrequent but it happens) or because the postcode is associated with a locality that falls in more than one time zone.&lt;/p&gt;

&lt;p&gt;To properly match the time zone most associated to each postal code, run:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight sql"&gt;&lt;code&gt;&lt;span class="k"&gt;CREATE&lt;/span&gt; &lt;span class="k"&gt;TABLE&lt;/span&gt; &lt;span class="n"&gt;timezones&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;postcodes_tz&lt;/span&gt; &lt;span class="k"&gt;AS&lt;/span&gt; &lt;span class="k"&gt;SELECT&lt;/span&gt; &lt;span class="n"&gt;iso&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;postcode&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="k"&gt;mode&lt;/span&gt;&lt;span class="p"&gt;()&lt;/span&gt; &lt;span class="n"&gt;WITHIN&lt;/span&gt; &lt;span class="k"&gt;GROUP&lt;/span&gt;&lt;span class="p"&gt;(&lt;/span&gt;&lt;span class="k"&gt;ORDER&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;tzid&lt;/span&gt;&lt;span class="p"&gt;)&lt;/span&gt; &lt;span class="k"&gt;as&lt;/span&gt; &lt;span class="n"&gt;timezone&lt;/span&gt; &lt;span class="k"&gt;FROM&lt;/span&gt; &lt;span class="n"&gt;timezones&lt;/span&gt;&lt;span class="p"&gt;.&lt;/span&gt;&lt;span class="n"&gt;postcodes_timezones&lt;/span&gt; &lt;span class="k"&gt;GROUP&lt;/span&gt; &lt;span class="k"&gt;BY&lt;/span&gt; &lt;span class="n"&gt;iso&lt;/span&gt;&lt;span class="p"&gt;,&lt;/span&gt; &lt;span class="n"&gt;postcode&lt;/span&gt;&lt;span class="p"&gt;;&lt;/span&gt;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;blockquote&gt;
&lt;p&gt;Note that in case of a tie, &lt;code&gt;mode&lt;/code&gt; will arbitrarily select one candidate.&lt;/p&gt;
&lt;/blockquote&gt;

&lt;p&gt; &lt;/p&gt;

&lt;p&gt;Once the time zone has been identified, consider gathering extra information about it, like its Standard and Daylight Saving Times. If you want to pull in the information contained in the &lt;code&gt;timezones_info&lt;/code&gt; table, extend the query to:&lt;br&gt;
&lt;/p&gt;

&lt;div class="highlight js-code-highlight"&gt;
&lt;pre class="highlight plaintext"&gt;&lt;code&gt;SELECT pt.iso, pt.postcode, mode() WITHIN GROUP(ORDER BY pt.tzid) AS timezone, ti.std_offset, ti.dst_offset, ti.zone_abbrevation, ti.zone_alt_name FROM timezones.postcodes_timezones pt LEFT JOIN timezones.timezones_info ti on ti.timezone=pt.tzid GROUP BY 1, 2, 4, 5, 6, 7;
&lt;/code&gt;&lt;/pre&gt;

&lt;/div&gt;



&lt;p&gt; &lt;/p&gt;

&lt;p&gt;The following screenshot shows the result, where you can see every postcode mapped to a single time zone, with high level information about the time zone:&lt;/p&gt;

&lt;p&gt;&lt;a href="https://res.cloudinary.com/practicaldev/image/fetch/s--9p_z6t6W--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/iIzUYuF.png" class="article-body-image-wrapper"&gt;&lt;img src="https://res.cloudinary.com/practicaldev/image/fetch/s--9p_z6t6W--/c_limit%2Cf_auto%2Cfl_progressive%2Cq_auto%2Cw_880/https://i.imgur.com/iIzUYuF.png" alt="Every postcode mapped to a single time zone" width="880" height="306"&gt;&lt;/a&gt;&lt;/p&gt;

&lt;p&gt; &lt;/p&gt;

&lt;h2&gt;
  
  
  Known Issues with a Postcode to Time Zone Database
&lt;/h2&gt;

&lt;p&gt;The IANA time zone polygons are released shortly after any updates to the IANA time zone rules, so the main issues you might encounter will relate to postal data.&lt;/p&gt;

&lt;p&gt;Aside from the issue with retrieving a single time zone per postcode in the case of a tie, there are three other areas for potential hiccups to consider:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Country coverage&lt;/li&gt;
&lt;li&gt;Coordinate accuracy&lt;/li&gt;
&lt;li&gt;Out-of-date postcode data&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt; &lt;/p&gt;

&lt;h3&gt;
  
  
  Country Coverage
&lt;/h3&gt;

&lt;p&gt;GeoNames and OpenStreetMap provide postcodes for the vast majority of countries that span multiple time zones (the main exception being Indonesia for GeoNames). The remaining countries are each covered by a single time zone, so they can be mapped with country-wide rules taken from the &lt;code&gt;timezones.timezones_info&lt;/code&gt; table.&lt;/p&gt;
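&lt;p&gt;For those single-time-zone countries, a simple fallback is to serve the country-level rule whenever a postcode lookup finds nothing. The sketch below assumes a &lt;code&gt;country_iso&lt;/code&gt; column in &lt;code&gt;timezones.timezones_info&lt;/code&gt;, which is hypothetical here; adapt the column name to your actual schema:&lt;/p&gt;

```sql
-- Country-level fallback sketch (country_iso is an assumed column name).
-- Lists countries that have no postcode rows at all, together with the
-- country-wide time zone rules to use for them.
SELECT ti.country_iso, ti.timezone, ti.std_offset, ti.dst_offset
FROM timezones.timezones_info ti
WHERE ti.country_iso NOT IN (
    SELECT DISTINCT iso FROM timezones.postcodes_tz
);
```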

&lt;p&gt; &lt;/p&gt;

&lt;h3&gt;
  
  
  Coordinate Accuracy
&lt;/h3&gt;

&lt;p&gt;OpenStreetMap and GeoNames deliver generally reliable coordinates. However, a known issue with GeoNames is that coordinates are rounded and consequently &lt;a href="http://www.informatik.uni-oldenburg.de/~there/ahlers2013girgeonames.pdf"&gt;mapped to a grid in some areas&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;These approximate coordinates will generally place you in the correct time zone, or at least a neighboring one, so you’ll end up within two hours of the actual local time (allowing for differences in Daylight Saving Time rules).&lt;/p&gt;

&lt;p&gt; &lt;/p&gt;

&lt;h3&gt;
  
  
  Out-of-date Postcode Data
&lt;/h3&gt;

&lt;p&gt;Neither OpenStreetMap nor GeoNames always has the most complete and up-to-date postcode data, so you may find postcodes missing. Some OpenStreetMap postcodes should simply be discarded altogether, as &lt;a href="https://nominatim.org/2022/06/26/state-of-postcodes.html"&gt;they don’t respect the country’s postcode format&lt;/a&gt;.&lt;/p&gt;

&lt;p&gt;If you can accommodate small errors (in other words, assign a time zone that’s close to correct), you can look for the closest postcode in your available data. Postcodes usually follow a hierarchical structure, so look for a postcode sharing the same leading characters. For instance, postcode &lt;em&gt;6999&lt;/em&gt; should be considered closer to &lt;em&gt;6990&lt;/em&gt; than &lt;em&gt;7000&lt;/em&gt;, and &lt;em&gt;T2H 0K7&lt;/em&gt; is closer to &lt;em&gt;T2H 0Y0&lt;/em&gt; than &lt;em&gt;T2H 1K7&lt;/em&gt;.&lt;/p&gt;

&lt;blockquote&gt;
&lt;p&gt;Note that approximations can compound in cases like this: if you link a postcode to an alphabetically close neighbor, that neighbor’s coordinates may themselves be approximate. &lt;/p&gt;
&lt;/blockquote&gt;
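&lt;p&gt;The closest-postcode idea can be sketched directly in SQL by ranking known postcodes on the length of the prefix they share with the unknown one. The example below looks up a hypothetical missing postcode &lt;em&gt;6999&lt;/em&gt; in the &lt;code&gt;timezones.postcodes_tz&lt;/code&gt; table built earlier (the country code &lt;code&gt;'BE'&lt;/code&gt; is just an illustrative value):&lt;/p&gt;

```sql
-- Find the known postcode sharing the longest leading prefix with '6999'.
-- The LATERAL subquery counts matching prefix positions: once the two
-- strings diverge at position i, longer prefixes stop matching too, so
-- the count equals the shared prefix length.
SELECT pt.postcode, pt.timezone
FROM timezones.postcodes_tz pt
CROSS JOIN LATERAL (
    SELECT count(*) AS shared_prefix
    FROM generate_series(1, length('6999')) AS i
    WHERE substr(pt.postcode, 1, i) = substr('6999', 1, i)
) p
WHERE pt.iso = 'BE'
ORDER BY p.shared_prefix DESC
LIMIT 1;
```

&lt;p&gt;The same prefix ranking applies unchanged to alphanumeric postcodes such as &lt;em&gt;T2H 0K7&lt;/em&gt;.&lt;/p&gt;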

&lt;p&gt; &lt;/p&gt;

&lt;h2&gt;
  
  
  Conclusion
&lt;/h2&gt;

&lt;p&gt;While you can derive time zone information from postcodes using freely available data and software, the quality of the result is highly dependent on the data sources you use. &lt;/p&gt;

&lt;p&gt;We’ve highlighted some common issues and explained some workarounds, but you might want to consider a commercial option if you need:&lt;/p&gt;

&lt;ul&gt;
&lt;li&gt;Higher accuracy than the ±2 hours achieved in this article&lt;/li&gt;
&lt;li&gt;Extra information, such as the results of switching to/from Daylight Saving Time&lt;/li&gt;
&lt;li&gt;Data that’s always up to date&lt;/li&gt;
&lt;/ul&gt;

&lt;p&gt;&lt;a href="https://eu1.hubs.ly/H02WJLS0"&gt;GeoPostcodes&lt;/a&gt; maintains a worldwide database of postcodes, including up-to-date time zone information for every postcode-locality combination. When a postcode is linked to several time zones, you can access all of them and use locality to filter.&lt;/p&gt;

&lt;p&gt;You can have a look at the &lt;a href="https://eu1.hubs.ly/H031bh50"&gt;Timezone product sheet&lt;/a&gt; or even &lt;a href="https://eu1.hubs.ly/H02WJXz0"&gt;browse the data for yourself&lt;/a&gt;. But don’t hesitate to reach out to us if you want to know more! &lt;/p&gt;

</description>
    </item>
  </channel>
</rss>
