casually surfs twitter...
finds a 🔥 thread...
it's too long to be read now...
~ THE END ~
Whenever I found a piece that I was interested in, I would switch away from Twitter to read it.
After gaining a fair amount of knowledge on the topic, I would discard the links/resources and jump back on Twitter.
I found this approach draining due to the sheer frequency of context switches. Twitter doesn't help much with its lightning-fast content feed.
I was in search of a practice that isolates the knowledge collection and acquisition phases.
I decided to go ahead with the following approach —
"Schedule the knowledge collection step"
...and go consume them at a later point in time.
"I would collect interesting & new material during the week and learn it leisurely during the weekends"
Twitter has this nice bookmarking feature.
But there lies a catch:
...if we were to save more than a couple of threads, we'd have to doom-scroll our way to the right tweet.
One way to look at this is:
"What if we were to go through all the bookmarked tweets in order and subsequently get rid of them after knowledge acquisition"?
🤨 That's not how I intend to learn things.
What could I possibly do?
Twitter has a premium subscription service called 'Twitter Blue' where the user can create bookmark folders, much like one does in a browser.
This is light years ahead of the normal bookmarks on Twitter, and not to mention it's also inexpensive... about $3 per month (+/-).
It also provides other features like 'thread-reader' which I would definitely benefit from.
Before I jumped into Twitter Blue, I had this thought.
"I use Notion as my knowledge management tool"
I could just export my bookmarks into a CSV/Excel file and import it into my Notion workspace.
"Notion databases have this amazing feature of filtering records by attributes (columns)"
If I were to import tweets into my Notion workspace, I would have to manually add author and context details.
That's when I got this 💡 moment.
✅ Notion API
✅ Twitter API
I have prior experience with both APIs and they are not that hard to get started with.
I would read tweets from Twitter's servers and save them into my Notion database in an instant... all using code.
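A minimal sketch of the "save into Notion" half, assuming the Notion API's create-page request shape. The database id and the property names (`Name`, `Author`, `Context`, `URL`) are placeholders; a real database's schema would differ.

```python
# Sketch: shape a tweet into the body of a Notion API "create page" call.
# "db-123" and the property names are hypothetical, not from the actual app.

def notion_page_payload(database_id, tweet_text, author, context, url):
    """Build the JSON body for saving one tweet as a Notion database row."""
    return {
        "parent": {"database_id": database_id},
        "properties": {
            "Name": {"title": [{"text": {"content": tweet_text[:80]}}]},
            "Author": {"rich_text": [{"text": {"content": author}}]},
            "Context": {"multi_select": [{"name": context}]},
            "URL": {"url": url},
        },
    }

payload = notion_page_payload(
    "db-123",
    "A thread on zero-knowledge proofs...",
    "@someone",
    "web3",
    "https://twitter.com/someone/status/1",
)
```

In the real app this payload would be POSTed to the Notion API with an integration token; filtering by the `Context` attribute then works out of the box in Notion's database views.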
Author information can be read from the tweet metadata.
Context is my input; a #web3 hashtag, for example, indicates that the context is web3. Hashtags are a supported metadata field in Twitter API responses.
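Pulling those two pieces of metadata out of a tweet could look like this. The sample payload below is a hand-written illustration of the Twitter API v2 response shape (author expanded via `includes.users`, hashtags under `entities`), not a real API response.

```python
# Sketch: extract author username and hashtags from a Twitter API v2
# tweet response. The sample dict mimics the v2 shape for illustration.

def extract_metadata(response):
    tweet = response["data"]
    users = {u["id"]: u for u in response.get("includes", {}).get("users", [])}
    author = users.get(tweet["author_id"], {}).get("username", "unknown")
    hashtags = [h["tag"] for h in tweet.get("entities", {}).get("hashtags", [])]
    return author, hashtags

sample = {
    "data": {
        "id": "1",
        "author_id": "42",
        "text": "Great thread on rollups #web3",
        "entities": {"hashtags": [{"tag": "web3"}]},
    },
    "includes": {"users": [{"id": "42", "username": "someone"}]},
}

author, tags = extract_metadata(sample)  # → ("someone", ["web3"])
```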
What if my code scraped (searched) for a trigger condition and saved the matching tweets/threads into the Notion database?
I would reply to a tweet with a command; for example:
@tnvmadhav save=Notion #web3
and the code would search for commands of this type from selected users (a whitelist) and save the tweet into their respective Notion databases.
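The command-plus-whitelist step above can be sketched like this. The exact command grammar and the whitelist contents here are assumptions for illustration, not the app's actual rules.

```python
import re

# Sketch: detect a "@tnvmadhav save=<Target> #<context>" command in a reply
# and honor it only for whitelisted users. Grammar and names are hypothetical.

SAVE_CMD = re.compile(r"@tnvmadhav\s+save=(?P<target>\w+)(?:\s+#(?P<context>\w+))?")
WHITELIST = {"someone", "tnvmadhav"}  # hypothetical allowed usernames

def parse_command(reply_text, username):
    """Return (target, context) for a valid command from a whitelisted
    user, or None otherwise."""
    if username not in WHITELIST:
        return None
    match = SAVE_CMD.search(reply_text)
    if not match:
        return None
    return match.group("target"), match.group("context")

parse_command("@tnvmadhav save=Notion #web3", "someone")   # → ("Notion", "web3")
parse_command("@tnvmadhav save=Notion #web3", "stranger")  # → None
```

The whitelist check running before the regex is what makes the attack in the comments below a non-issue: a stranger replying with the command is simply ignored.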
Well, I have built just that and have been using it for a month now.
Ever since, I have saved so much time and boosted my productivity.
I also had this thought of productising this into a tiny SaaS for other users as well.
The product is currently MVP complete and can be used by others here 👇🏻
Top comments (3)
Great idea. But there's a catch. What if someone uses this command -
@tnvmadhav save=Notion #web3 instead of you?
I maintain a whitelist. The tweet will be saved only for those in it.
Very fresh and interesting idea for an MVP, all the best 🎉