The crazy idea I had today: oh, what fun would it be to write, teach, and make videos about whatever I build every day.

So here we are.

Today, I built a script that automatically picks up local Zoom recording files and uploads them, with shareable links, to a Notion DB.

While doing so, I used quite a few application interfaces.

The Windows file system, the Notion API, the YouTube (Google) API, and the input data coming from Zoom saving its recordings to a local folder.
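The pickup step can be sketched in a few lines. This is a minimal illustration, not the actual script: the folder path and the state-file name are assumptions (Zoom on Windows typically saves under `Documents/Zoom/`), and a real version would also wait for Zoom to finish writing the file.

```python
from pathlib import Path
import json

# Hypothetical locations; adjust for your machine.
ZOOM_DIR = Path.home() / "Documents" / "Zoom"
SEEN_FILE = Path("processed_recordings.json")

def find_new_recordings(zoom_dir: Path, seen_file: Path) -> list[Path]:
    """Return Zoom .mp4 files we haven't processed yet."""
    seen = set(json.loads(seen_file.read_text())) if seen_file.exists() else set()
    return [p for p in sorted(zoom_dir.rglob("*.mp4")) if str(p) not in seen]

def mark_processed(paths: list[Path], seen_file: Path) -> None:
    """Remember which files were handled so reruns skip them."""
    seen = set(json.loads(seen_file.read_text())) if seen_file.exists() else set()
    seen.update(str(p) for p in paths)
    seen_file.write_text(json.dumps(sorted(seen)))
```

Running this on a schedule (Task Scheduler, cron, whatever) is enough to turn a dumb local folder into an event source.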

Amidst all of this, I used OpenAI in one step to bring all the data together and push it to Notion.
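The "push it to Notion" part boils down to one POST against the Notion pages endpoint. A hedged sketch, assuming the database has properties named "Name" and "Recording" (those names must match your schema exactly, and the token/database IDs are placeholders):

```python
import json
import urllib.request

NOTION_VERSION = "2022-06-28"  # Notion API version header

def build_notion_page(database_id: str, title: str, video_url: str) -> dict:
    """Build the request body for one row in the recordings database.
    Property names ("Name", "Recording") are assumptions about the schema."""
    return {
        "parent": {"database_id": database_id},
        "properties": {
            "Name": {"title": [{"text": {"content": title}}]},
            "Recording": {"url": video_url},
        },
    }

def create_page(token: str, body: dict) -> None:
    """POST the page to Notion's public API; raises on HTTP errors."""
    req = urllib.request.Request(
        "https://api.notion.com/v1/pages",
        data=json.dumps(body).encode(),
        headers={
            "Authorization": f"Bearer {token}",
            "Notion-Version": NOTION_VERSION,
            "Content-Type": "application/json",
        },
        method="POST",
    )
    urllib.request.urlopen(req)
```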

Why this decision? Because LLMs are perfect for turning the non-deterministic into the deterministic. They're the tool for when you have a bunch of information sources that can't really talk to each other.

Or when it's just too much work to figure out all the edge cases and if-then-else flows. When writing and maintaining code for a mini task costs so much that it makes the whole project not worth pursuing → that's where the GPT APIs work best.

I used it as the link between the knowledge of which tables and databases we use and the Zoom file names.

This way, when we add a new client, or when anything changes about where and how we manage Zoom recordings and how they're shared, we don't have to change and rework a bunch of code. We can simply tell the OpenAI system prompt, just as one would tell a colleague.
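Concretely, the "tell it to the system prompt" idea looks something like this sketch. The client names and database IDs are hypothetical examples; the point is that onboarding a new client means editing the prompt text, while the surrounding parsing code keeps the pipeline deterministic:

```python
import json

# Hypothetical routing prompt: edit this text, not the code,
# when a client is added or a destination changes.
SYSTEM_PROMPT = """\
You route Zoom recordings to Notion databases.
Known clients and their database IDs (example entries):
- Acme Corp -> db_acme_123
- Beta Ltd  -> db_beta_456
Given a recording file name, reply ONLY with JSON:
{"client": "<client name or 'unknown'>", "database_id": "<id or null>", "title": "<clean human title>"}
"""

def build_messages(file_name: str) -> list[dict]:
    """Assemble the chat messages for one routing decision."""
    return [
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": f"File name: {file_name}"},
    ]

def parse_routing(reply: str) -> dict:
    """Validate the model's JSON so downstream code stays deterministic."""
    data = json.loads(reply)
    for key in ("client", "database_id", "title"):
        if key not in data:
            raise ValueError(f"missing key: {key}")
    return data
```

The actual model call would go through the OpenAI SDK (something like `client.chat.completions.create(...)` with a JSON response format), with `parse_routing` guarding whatever comes back.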

As Nvidia's cofounder Jensen Huang says, one day all code will be natural language. I truly believe him.

Just as easy as it is now to make art, to paint, to make a paper greeting card, or to write with a pen, that's how easy it will be to make digital products that manipulate information.

At least for smaller-scale projects, I believe the natural-language-as-code future isn't that far off.

In a way, if you look at it, programming languages have always been evolving to be more like natural language. And now we've finally taught computers how to better talk to each other.

This is a new form of leverage we’re unlocking, and I’m excited to see where this goes!

Slowly, I'm using such GPT functions in many workflows to make non-deterministic, incompatible data talk to each other, reason over it, and output in a deterministic way.

Let's see which of these solutions ends up having an asymmetric impact.