In my previous article, I introduced the Resume Protocol: a system designed to make professional reputation verifiable, soulbound, and owned by you.
But a protocol is only as useful as the tools we build to interact with it.
To bridge the gap between complex smart contracts and everyday utility, I built the Resume Integrator. This isn't just a script; it is a reference implementation designed to demonstrate reliability and excellence in Web3 engineering.
Whether you are building a freelance marketplace or a university certification portal, the challenge remains the same: linking rich off-chain evidence (PDFs, images) with on-chain truth (immutable ledger entries).
In this guide, I will walk you through the thoughtful architectural decisions behind this integration.
The Engineering Challenge
Integrating a blockchain protocol requires us to reconcile two different realities:
- The Evidence (Off-chain): The detailed descriptions, design portfolios, and certificates. These are heavy, and storing them on-chain is inefficient.
- The Truth (On-chain): The cryptographic proof of ownership and endorsement.
My goal with the Resume Integrator was to stitch these together seamlessly, creating a system that is robust and user-centric.
The Architecture
I believe that good code tells a story. Here is the story of how an endorsement travels from your local environment to the blockchain, step by step.
Step 1: Structuring Data with Intent (The Metadata)
Clarity is the first step toward reliability. If we upload unstructured data, we create noise. To ensure our endorsements are interoperable with the wider Ethereum ecosystem (wallets, explorers, and marketplaces), we strictly adhere to the ERC-721 Metadata Standard.
I enforced this using strict TypeScript interfaces. We don't guess the shape of our data; we define it.
```typescript
// From src/types.ts
export interface Attribute {
  trait_type: string;
  // ERC-721 attribute values should serialize to JSON as strings or
  // numbers; Date objects do not round-trip cleanly through JSON.
  value: string | number;
}

/**
 * Standard ERC-721 Metadata Schema
 * We use strict typing to ensure every credential we mint
 * is readable by standard wallets.
 */
export interface CredentialMetadata {
  name: string;
  description: string;
  image: string;
  attributes: Attribute[];
}
```
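To make the shape concrete, here is a hypothetical credential that satisfies these interfaces. The names and values are illustrative, not from the actual codebase; the types are restated locally so the snippet stands alone:

```typescript
// Local mirrors of the src/types.ts interfaces (illustrative)
interface Attribute {
  trait_type: string;
  value: string | number;
}

interface CredentialMetadata {
  name: string;
  description: string;
  image: string;
  attributes: Attribute[];
}

// A hypothetical credential, shaped exactly as wallets expect
const example: CredentialMetadata = {
  name: "Solidity Auditing",
  description: "Endorsed after a joint security review.",
  image: "ipfs://bafy.../credential.png",
  attributes: [
    { trait_type: "Recipient", value: "Ada" },
    { trait_type: "Endorser", value: "Grace" },
    { trait_type: "Date", value: "2024-05-01" },
  ],
};

// Serializing shows the plain JSON that explorers will read
console.log(JSON.stringify(example.attributes));
```

Because every value is a plain string or number, the JSON that lands on IPFS is exactly the JSON a wallet parses back out.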
Step 2: The Storage Layer (Pinata SDK)
For our "Evidence" layer, we need permanence. If I rely on a centralized server to host my resume data, my reputation is rented, not owned. That is a risk I am not willing to take.
We use IPFS (InterPlanetary File System) via the Pinata SDK. I chose Pinata because it offers the reliability of a managed service without compromising the decentralized nature of content addressing.
Here is the "Two-Step" pattern I implemented to ensure data integrity:
- Upload the visual proof first.
- Embed that proof's URI into the metadata.
```typescript
// From src/storage.ts

/**
 * Creates NFT-compatible metadata for a credential
 * and uploads it to IPFS via Pinata.
 *
 * This function optionally uploads an image first,
 * then embeds its IPFS URL into the metadata JSON.
 *
 * @param input Credential metadata fields
 * @returns A public IPFS gateway URL pointing to the metadata JSON
 */
export async function createCredentialMetadata(
  input: CredentialMetadataInput
): Promise<string> {
  console.log("Authenticating with Pinata...");
  await pinata.testAuthentication();
  console.log("Pinata authentication successful");

  // Will store the IPFS URL of the uploaded image (if any)
  let image = "";

  // If an image path is provided, upload the image to IPFS first
  if (input.imagePath && fs.existsSync(input.imagePath)) {
    console.log(`Uploading image: ${input.imagePath}`);

    // Read the image file from disk into a buffer
    const buffer = fs.readFileSync(input.imagePath);

    // Convert the buffer into a File object (Node 18+ compatible)
    const file = new File([buffer], "credential.png", {
      type: "image/png",
    });

    // Upload the image to Pinata's public IPFS network
    const upload = await pinata.upload.public.file(file);

    // Construct a gateway-accessible URL using the returned CID
    image = `https://${CONFIG.PINATA_GATEWAY}/ipfs/${upload.cid}`;
    console.log(`  Image URL: ${image}`);
  } else if (input.imagePath) {
    console.warn(
      `Warning: Image path provided but file not found: ${input.imagePath}`
    );
  }

  // Construct ERC-721 compatible metadata JSON
  // This structure is widely supported by NFT platforms
  const metadata: CredentialMetadata = {
    name: input.skillName,
    description: input.description,
    image,
    attributes: [
      { trait_type: "Recipient", value: input.recipientName },
      { trait_type: "Endorser", value: input.issuerName },
      {
        trait_type: "Date",
        // Keep the date as a plain YYYY-MM-DD string so it
        // serializes predictably in JSON
        value: input.endorsementDate.toISOString().split("T")[0]!,
      },
      { trait_type: "Token Standard", value: "Soulbound (SBT)" },
    ],
  };

  // Upload the metadata JSON to IPFS
  console.log("Uploading metadata JSON...");
  const result = await pinata.upload.public.json(metadata);

  // Return a public gateway URL pointing to the metadata
  // This URL can be used directly as a tokenURI on-chain
  return `https://${CONFIG.PINATA_GATEWAY}/ipfs/${result.cid}`;
}
```
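The gateway URL construction appears twice in the function above, so it is a natural candidate for a small pure helper. A sketch (`toGatewayUrl` is not part of the actual codebase; it is shown to illustrate the pattern, and being pure, it is trivially unit-testable):

```typescript
// Illustrative helper: build a public gateway URL from a CID.
// Normalizes the configured gateway so it works whether the
// config stores "gateway.pinata.cloud" or a full https:// URL.
function toGatewayUrl(gateway: string, cid: string): string {
  const host = gateway
    .replace(/^https?:\/\//, "") // drop any protocol prefix
    .replace(/\/+$/, ""); // drop trailing slashes
  return `https://${host}/ipfs/${cid}`;
}

console.log(toGatewayUrl("gateway.pinata.cloud", "bafybeigdexample"));
```

Centralizing this logic means a future gateway migration touches one function instead of every template literal.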
Step 3: The Issuance Layer (Viem)
With our data secured, we move to the "Truth" layer. We need to instruct the smart contract to mint a Soulbound Token that points to our metadata.
I chose Viem for this task. It is lightweight, type-safe, and aligns with my preference for precision over bloat.
The most critical engineering decision here is Waiting for Confirmation. In blockchain systems, broadcasting a transaction is not enough; we must ensure it is finalized. This prevents UI glitches and ensures the user knows exactly when their reputation is secured.
```typescript
// From src/contract.ts

/**
 * Mint a new endorsement on-chain
 */
export async function mintEndorsement(
  recipient: string,
  skill: string,
  dataURI: string
) {
  if (!CONFIG.CONTRACT_ADDRESS)
    throw new Error("Contract Address not set in .env");

  console.log(`Minting endorsement for ${skill}...`);

  const hash = await walletClient.writeContract({
    address: CONFIG.CONTRACT_ADDRESS,
    abi: CONTRACT_ABI,
    functionName: "endorsePeer",
    args: [recipient, skill, dataURI],
  });
  console.log(`  Tx Sent: ${hash}`);

  // Wait for confirmation before reporting success
  const receipt = await publicClient.waitForTransactionReceipt({ hash });
  console.log(`Confirmed in block ${receipt.blockNumber}`);

  return hash;
}
```
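Steps 2 and 3 compose into a single write path: upload the evidence, then anchor the truth. The wiring can be sketched with the dependencies injected, so the flow is testable without touching Pinata or a chain. Everything below is illustrative; `issueCredential` and the stubs are not part of the actual codebase:

```typescript
// Orchestration sketch: the upload and mint steps are passed in,
// so this function only expresses the ordering of the pipeline.
async function issueCredential(
  recipient: string,
  skill: string,
  uploadMetadata: () => Promise<string>,
  mint: (recipient: string, skill: string, uri: string) => Promise<string>
): Promise<string> {
  const uri = await uploadMetadata(); // Step 2: evidence onto IPFS
  return mint(recipient, skill, uri); // Step 3: truth onto the chain
}

// Hypothetical stand-ins for demonstration only
const fakeUpload = async () => "https://gateway.example/ipfs/bafy123";
const fakeMint = async (_r: string, _s: string, uri: string) =>
  `0xdeadbeef:${uri}`;

issueCredential("0xAda", "Solidity", fakeUpload, fakeMint).then(console.log);
```

The ordering matters: the metadata URI must exist before the mint, because the contract stores it permanently at issuance time.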
Step 4: Verification (The Read)
A protocol is useless if we cannot retrieve the data efficiently.
Querying a blockchain state variable by variable is slow and expensive. Instead, we use Event Logs. By listening to the EndorsementMinted event, we can reconstruct a user's entire professional history in a single, efficient query. This is thoughtful engineering that respects both the network and the user's time.
```typescript
// From src/contract.ts

/**
 * Find all endorsements for a specific user
 */
export async function getEndorsementsFor(userAddress: string) {
  if (!CONFIG.CONTRACT_ADDRESS)
    throw new Error("Contract Address not set in .env");

  console.log(`Querying endorsements for ${userAddress}...`);

  const logs = await publicClient.getLogs({
    address: CONFIG.CONTRACT_ADDRESS,
    event: parseAbiItem(
      "event EndorsementMinted(uint256 indexed tokenId, address indexed issuer, address indexed recipient, bytes32 skillId, string skill, uint8 status)"
    ),
    args: {
      recipient: userAddress as Hex,
    },
    fromBlock: "earliest",
  });

  return logs.map((log) => ({
    tokenId: log.args.tokenId,
    skill: log.args.skill,
    issuer: log.args.issuer,
    status: log.args.status === 1 ? "Active" : "Pending",
  }));
}
```
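The mapping at the end of that function is pure, so it can be pulled out and exercised without a node connection. A sketch (`decodeEndorsement` and its input type are illustrative, not part of src/contract.ts):

```typescript
// Illustrative: decode one EndorsementMinted log's args into the
// shape the client returns.
interface EndorsementArgs {
  tokenId: bigint;
  skill: string;
  issuer: string;
  status: number; // 1 = Active, anything else = Pending (per the client)
}

function decodeEndorsement(args: EndorsementArgs) {
  return {
    tokenId: args.tokenId,
    skill: args.skill,
    issuer: args.issuer,
    status: args.status === 1 ? "Active" : "Pending",
  };
}

console.log(
  decodeEndorsement({ tokenId: 1n, skill: "Rust", issuer: "0xA", status: 1 })
);
```

Keeping the decode step separate from the `getLogs` call means the status semantics can be verified in a plain unit test, with the RPC boundary mocked or skipped entirely.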
Conclusion
The Resume Integrator is more than a codebase. It is a blueprint for building with purpose.
By separating concerns (IPFS for heavy data, the blockchain for trust), we create a system that is efficient, immutable, and scalable. By enforcing strict types and waiting for confirmations, we ensure reliability for our users.
The Resume Protocol is the foundation. This Integrator is the bridge. Now, it is up to you to build the interface.
Repositories:
- The Protocol (Smart Contracts): github.com/obinnafranklinduru/nft-resume-protocol
- The Integrator (Sample Client): github.com/obinnafranklinduru/resume-integrator
Let's build something you can trust with clarity, purpose, and excellence.