I have worked with many different system architectures over my life as a programmer, from single-file C64 programs to fleets of serverless functions running on distributed systems.
If I zoom out and look at the systems I’ve helped design, a distinctive pattern emerges: they continuously break down into smaller pieces, creating new foundational layers.
One file becomes two; two files become a library. An application that once ran on a single server becomes a group of applications running on multiple servers, and that group becomes a set of services.
We went from macro-architecture (a single-file program) to micro-architecture (single-function services).
What’s the next logical step in this evolution? Following the trend and listening to engineers’ current pain points, you can easily identify what’s next.
Coordinating thousands of functions and dealing with cloud deployments is a nightmare. So I believe two things will happen:
1) Cloud environments will merge with local environments
There won’t be any distinction between the “cloud” and the local environment.
2) Functions will be broken down into instructions
There is a huge corpus of open-source functions that AI assistants are mining. Unless you are working on a particular algorithm, you won’t need to write a function; you will write an instruction (e.g. “>upload *.csv, compress, aggregate and print”).
If this happens alongside next-gen databases, we will see developers writing sets of instructions without knowing anything about the functions, or the servers, that those instructions trigger.
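To make the idea concrete, here is a minimal sketch of how an instruction like the one above might be parsed into a pipeline of pre-existing functions the developer never writes. Everything here is hypothetical: the parsing rules, the handler names, and the registry are illustrative assumptions, not a real system.

```python
# Hypothetical sketch: an "instruction" is a one-line pipeline of verbs
# that dispatches to functions the developer never sees or writes.

def parse_instruction(instruction: str):
    """Split an instruction like '>upload *.csv, compress, aggregate and print'
    into (verb, argument) steps. The grammar here is invented for illustration:
    commas separate steps, and 'and' joins the final two."""
    body = instruction.lstrip(">").strip()
    parts = [p.strip() for chunk in body.split(",") for p in chunk.split(" and ")]
    steps = []
    for part in parts:
        verb, _, arg = part.partition(" ")
        steps.append((verb, arg or None))
    return steps

# Stub handlers standing in for mined open-source functions; in the scenario
# described above, these would resolve to remote functions transparently.
HANDLERS = {
    "upload":    lambda data, arg: [f"row from {arg}"],   # pretend file fetch
    "compress":  lambda data, arg: data,                  # placeholder no-op
    "aggregate": lambda data, arg: len(data),             # toy aggregation
    "print":     lambda data, arg: data,                  # return final value
}

def run(instruction: str):
    """Thread data through each step of the parsed pipeline."""
    data = None
    for verb, arg in parse_instruction(instruction):
        data = HANDLERS[verb](data, arg)
    return data
```

The point of the sketch is the shape of the developer experience: one line of intent, with function selection and placement handled entirely by the platform.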