Abhi Dadhaniya

How I Automate a Social Media Page Using Web Scraping, Google Sheets, and Figma

Recently, I was exploring AI tools from online resources, and I also came across a web-scraping Chrome extension. I thought: what if I could scrape this data into beautiful social media posts?

Here are the most powerful web-scraping Chrome extensions I found:

  1. Bardeen - Automate manual work

  2. Axiom Browser Automation

What is Web Scraping?

Web scraping is a powerful tool for collecting data from websites quickly and efficiently. For marketers, web scraping can be especially useful for collecting data on potential leads, target audiences, or competitors. However, manually collecting data from websites can be time-consuming and tedious, particularly when dealing with websites that have large amounts of data.

The Challenge of Data Collection

Collecting data from a large website with over 300 AI tools can be a daunting task. Manually copying and pasting the data for each tool would take hours or even days. This is where web scraping comes in handy. With a web scraping tool, it's possible to automatically collect data from a website, even if there are hundreds or thousands of pages to scrape.
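To give a rough idea of what automated collection looks like in practice, even a single Google Sheets formula can pull simple fields straight from a page. The URL and XPath below are placeholders for illustration only, not the actual directory I scraped.

=IMPORTXML("https://example.com/ai-tools", "//h2")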

How to scrape data from any website

Here's a YouTube channel to learn how to use Bardeen to scrape data from any website.

After collecting the data in table form using the Bardeen extension, I transferred the table data into Google Sheets by connecting my Google account.

Organizing Data with Google Sheets

Once the data was collected, I imported it into Google Sheets to organize it. Google Sheets is a powerful tool for data organization, allowing users to sort, filter, and analyze data in a variety of ways. I created separate columns for each data point and used filters to remove any unnecessary data.
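For example, a quick way to drop incomplete rows is the FILTER() function. The sketch below assumes the scraped table sits in columns A to D with the tool name in column A; adjust the ranges to match your own sheet.

=FILTER(A2:D, A2:A<>"")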

Data extracted from each AI tool using web scraping:

  • Name

  • Description

  • Website link

  • Icon link

Here's an important phase where I applied some Google Sheets functions to create the necessary columns:

  • Extracted the domain name from the absolute URL path

  • Used the IMPORTXML() function to get the meta image from the extracted URL of each AI tool

We can also import the meta description of a website using the IMPORTXML() function, but in my case, I already had the description content for each AI tool.
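If you do need the meta description, a formula along these lines should work, assuming the website URL is in A1 (some sites expose it via the og:description property instead of the name attribute):

=IMPORTXML(A1,"//meta[@name='description']/@content")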

Google Sheets functions I used

Get a domain from an absolute URL (https://abhidadhaniya.com to abhidadhaniya.com)

=REGEXREPLACE(A1,"http\:\/\/|https\:\/\/|\/.*|\?.*|\#.*","")

Import the meta image using the og:image property from the page metadata.

=IMPORTXML(A1,"//meta[@property='og:image']/@content")

After successfully fetching all the data, let's sync it into the Figma post.

Sync data in Figma Post

I created common components in Figma so that all posts share the same structure.

Figma Common Components

I also created variants for the profile image and account info.

Variants

If I want to create a new theme or post layout, I can now easily change the design of the components and it will be reflected in all posts. This is the real power of Figma 🔥

Sync data using the Google Sheets sync Figma plugin

I installed a Figma plugin called Google Sheets Sync to fetch data from a shared Google Sheet by inputting the sheet's URL.

After that, the plugin pops up a table that we can connect to the respective elements.

Google sheets sync figma plugin

Connect an element by clicking on the respective column, and the element's name will be replaced by #ColumnName.

Now, copy and paste the first post by dragging it next to the original, click the Sync button in the Google Sheets Sync plugin pop-up, and that's it, we're done...!

I choose a lazy person to do a hard job. Because a lazy person will find an easy way to do it.

- Bill Gates

Conclusion

Web scraping and Google Sheets are powerful tools for streamlining data collection and organization. By using a web scraping tool and Google Sheets, I was able to quickly and efficiently collect and organize data from a large website. By syncing the data with Figma, I was able to create social media posts quickly and easily.

Thanks for reading this article. The last thing I want to mention is that we're running a freelance agency called Gigaweb, which offers a range of digital services to help clients. With years of experience in web design and development, we provide top-quality services to businesses and individuals.

Visit my portfolio to check out my project work and read more blogs. To stay updated on my latest projects and industry insights, follow me on my social media accounts: LinkedIn, Twitter, and GitHub.
