Rajesh Deshpande

Securing third-party artifacts

Third-party and open-source components form the basic building blocks of modern cloud-native applications. They help organizations improve quality, accelerate time-to-market, and reduce risk. But because applications depend so heavily on these components, the overall quality and security of an application are directly tied to the quality and security of its third-party dependencies.
Here are five steps to help ensure the security of third-party artifacts:

1. Verify third-party artifacts and open-source libraries

Before any third-party artifact or open-source library is used in an application, it should be thoroughly verified. Verification is done by validating its cryptographic signature and checksum.

  • Validate Signature: Signature validation ensures that artifacts come only from the intended source. Many tools are available to help with this validation. For example, Docker provides Docker Content Trust (DCT) for image signature validation.

  • Validate Checksum: Checksum validation ensures that the downloaded artifact is identical to the one that was published and has not been tampered with between upload and download. Docker verifies image digests during every pull, and Maven provides the --strict-checksums flag to validate the checksums of Maven dependencies.
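The checksum half of this verification can be sketched in a few lines of Python. This is a generic illustration, not tied to Docker or Maven; the artifact bytes and the published digest below are placeholders for a real download and its supplier-published SHA-256 value.

```python
import hashlib

def verify_checksum(data: bytes, expected_sha256: str) -> bool:
    """Return True only if the artifact bytes match the published SHA-256 digest."""
    return hashlib.sha256(data).hexdigest() == expected_sha256

# Placeholder artifact; in practice the digest comes from the supplier's release page.
artifact = b"example artifact contents"
published_digest = hashlib.sha256(artifact).hexdigest()

print(verify_checksum(artifact, published_digest))              # True
print(verify_checksum(b"tampered contents", published_digest))  # False
```

Any mismatch means the bytes changed somewhere between publication and download, and the artifact should be rejected.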

2. Understand the composition of artifacts

As consumers of external artifacts, we should always know the details of their dependencies and versions. These details reveal the composition of an external artifact along with its vulnerabilities.

  • Software Bill of Materials (SBOM): An artifact's SBOM provides composition details, including its list of dependencies, versions, license types, known vulnerabilities, etc. We should always get the SBOM file from the supplier.

  • Software Composition Analysis (SCA): If an artifact's SBOM is unavailable, perform an SCA to generate those same details: the list of dependencies, versions, license types, vulnerabilities, and so on. Tools like Syft and Snyk can perform SCA on Docker images.
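To make the idea of "consuming an SBOM" concrete, the snippet below reads a minimal CycloneDX-style document and lists each component with its version and license. The inline SBOM is a hand-written example for illustration only, not a real artifact's bill of materials.

```python
import json

# Minimal, hand-written CycloneDX-style SBOM for illustration only.
sbom_json = """
{
  "bomFormat": "CycloneDX",
  "components": [
    {"name": "jackson-databind", "version": "2.13.4",
     "licenses": [{"license": {"id": "Apache-2.0"}}]},
    {"name": "snakeyaml", "version": "1.33",
     "licenses": [{"license": {"id": "Apache-2.0"}}]}
  ]
}
"""

sbom = json.loads(sbom_json)

# Flatten each component into (name, version, license) for an inventory view.
inventory = [
    (c["name"], c["version"],
     c.get("licenses", [{}])[0].get("license", {}).get("id", "unknown"))
    for c in sbom["components"]
]

for name, version, license_id in inventory:
    print(f"{name} {version} ({license_id})")
```

Real SBOMs carry much more (hashes, suppliers, pedigree), but even this flattened view is enough to feed a component inventory.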

3. Track and maintain artifacts within organizations

All third-party artifacts used within an organization should be tracked and stored privately. Also ensure that all application builds use only these vetted artifacts, not copies pulled directly from the public internet.

  • Component Inventory: A component inventory of a project's open-source components, dependencies, and vulnerabilities should be maintained. It helps identify the software vendors, suppliers, and sources used across an organization. A component inventory can be created using tools like OWASP Dependency-Track and Nexus IQ.

  • Component Repository: Organizations should host private repositories and restrict build machines to pull only from those repositories. The artifacts stored there are kept secure through regular scanning and upgrades. An artifact repository can be created using JFrog Artifactory or Nexus Repository.
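For Maven builds, routing all downloads through a private repository can be done with a mirror entry in settings.xml. This is a minimal sketch; the repository URL and id below are placeholders for an organization's own Artifactory or Nexus instance.

```xml
<!-- settings.xml: route ALL dependency downloads through the private repository -->
<settings>
  <mirrors>
    <mirror>
      <id>internal-repo</id>
      <name>Organization's private repository</name>
      <url>https://repo.example.com/maven-proxy</url>
      <mirrorOf>*</mirrorOf>
    </mirror>
  </mirrors>
</settings>
```

With `mirrorOf` set to `*`, builds never reach Maven Central directly; the private repository proxies and caches everything the builds need.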

4. Perform a security scan

Third-party dependencies should be scanned and evaluated to ensure they are free of security issues, or that any issues found fall within the risk tolerance of the organization's assurance level.

  • Scan for vulnerabilities: Each third-party dependency is scanned to detect known security vulnerabilities. Every finding is carefully evaluated before deciding whether the dependency may be used. For vulnerability scanning, tools like Snyk, Twistlock, and Black Duck are available.

  • Scan for license implications: Like vulnerabilities, the license types of artifacts and their dependencies create obligations. This scan ensures artifacts meet legal and regulatory compliance requirements. Snyk, Twistlock, and Black Duck can also be used to understand license types.
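A policy gate over scan results can be sketched as a small check: hypothetical scan findings are compared against an assumed organizational CVSS threshold and license allowlist. Real scanners such as Snyk report far richer data; the thresholds, dependency names, and finding structure below are illustrative assumptions, not any tool's actual output.

```python
# Assumed organizational policy; these values are illustrative, not a standard.
MAX_CVSS = 7.0
ALLOWED_LICENSES = {"Apache-2.0", "MIT", "BSD-3-Clause"}

# Hypothetical scan findings for two made-up dependencies.
findings = [
    {"name": "libfoo", "cvss": 5.3, "license": "MIT"},
    {"name": "libbar", "cvss": 9.8, "license": "GPL-3.0-only"},
]

def policy_violations(findings):
    """Return names of dependencies that break the vulnerability or license policy."""
    return [
        f["name"] for f in findings
        if f["cvss"] >= MAX_CVSS or f["license"] not in ALLOWED_LICENSES
    ]

print(policy_violations(findings))  # ['libbar']
```

A dependency that trips either rule is flagged for review rather than silently admitted into the build.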

5. Automate as much as possible

All the above steps should be automated as far as possible to reduce human error. Steps like artifact scanning can be implemented in the CI/CD pipeline, while verification and composition analysis can run in separate automation pipelines. Automation also reduces the time these steps would otherwise take to perform manually.
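As one possible shape for this automation, a CI job can run the scan on every build and fail when findings exceed the threshold. The workflow below is a hypothetical GitHub Actions sketch: the image name and severity threshold are placeholders, and it assumes the Snyk CLI is already available on the runner.

```yaml
# Hypothetical CI stage: block the build on high-severity findings.
jobs:
  dependency-scan:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Scan container image for vulnerabilities
        run: snyk container test registry.example.com/myapp:latest --severity-threshold=high
```

Because the scan runs on every build, a newly disclosed vulnerability in an existing dependency surfaces automatically instead of waiting for a manual review.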
