Highlights of Flowforge
My experience building Flowforge, a service request pipeline builder
Written on: 20 Jul 2024
I'm happy to announce the public release of Flowforge. Flowforge is a service request pipeline builder that allows teams with lean developer outfits to create modular pipelines with ease. More information can be found in my Github repository. In this post, I'll share some of the highlights and challenges I faced while building Flowforge.
Flowforge was born out of a need to quickly create automated pipelines to fulfil service requests. At my current workplace, we are a small developer team managing the AWS Cloud Infrastructure for all the Ministry of Defence projects. On a daily basis, we receive numerous service requests from the various project teams. These requests range from simple resetting of passwords to more complex requests like setting up a brand new environment for a new project.
Working with the cloud meant that a lot of these requests could easily be fulfilled with a few calls to the relevant APIs. However, we didn't want to be spending time writing a brand new script for each request. If we could somehow have a pre-built "API caller" of sorts that could be dynamically and easily configured to call various APIs, we could save a lot of time. Hence, Flowforge was born: a modular service request pipeline builder with pre-built steps like "Approval" and "Make API Call" that could be easily configured and reused. Every pipeline can be easily defined in a JSON schema. Customising existing steps or creating new ones will still involve writing some code, but the process is a lot simpler than writing a brand new script.
Over the course of the project, I worked with 2 other developers and we encountered many challenging, but rewarding, moments and developments. I was the main lead for the project and was responsible for the overall architecture and design of the system. I worked mainly on the backend system, which was written in Go, and occasionally helped out with the frontend, which was written in TypeScript and Next.js. Here are some of my highlights:
The primary goal of Flowforge was to allow users, particularly non-developers, to easily create pipelines. To achieve this, I opted to use a JSON schema to define the pipeline, similar to how one would define Github Actions or AWS SSM Automation documents in JSON/YAML. These 2 examples were more than enough for us to get started and I created a similar schema for our pipelines.
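To give a flavour of what such a definition looks like, here is a hypothetical pipeline (the step types, field names, and URL below are illustrative examples of the idea, not Flowforge's actual schema):

```json
{
  "name": "reset-user-password",
  "steps": [
    {
      "type": "APPROVAL",
      "name": "lead_approval"
    },
    {
      "type": "API",
      "name": "reset_password",
      "parameters": {
        "method": "POST",
        "url": "https://iam.internal.example.com/reset",
        "body": { "username": "${username}" }
      }
    }
  ]
}
```

The appeal of this shape is that a non-developer only edits data, never code: swapping the endpoint or adding an approval step is a JSON change.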
Parsing it was a lot more difficult. As I wanted each step to be customisable, e.g. changing the API endpoint or the request body, I had to introduce string parameters using `${}` delimiters. Once introduced, I had to write a custom parser to extract these string parameters and replace them with the actual values provided by the users. Since JSON supports arrays and nested objects, I had to write a recursive parser and relied on Go's `reflect` package to handle the different types of JSON objects. Below is one of the core functions that I wrote to replace placeholders in a JSON object:
```go
func ReplacePlaceholders(input any, values map[string]any) (any, error) {
	switch reflect.TypeOf(input).Kind() {
	case reflect.String:
		return ReplacePlaceholdersInString(input.(string), values)
	case reflect.Slice:
		// If the input is a slice, iterate over each element and replace placeholders
		output := make([]any, 0)
		for _, elem := range input.(bson.A) {
			replaced, err := ReplacePlaceholders(elem, values)
			if err != nil {
				return nil, err
			}
			output = append(output, replaced)
		}
		return output, nil
	case reflect.Map:
		// If the input is a map, iterate over each key and value and replace placeholders
		output := make(map[string]any)
		for key, value := range input.(map[string]any) {
			replacedKey, err := ReplacePlaceholdersInString(key, values)
			if err != nil {
				return nil, err
			}
			replacedValue, err := ReplacePlaceholders(value, values)
			if err != nil {
				return nil, err
			}
			output[replacedKey] = replacedValue
		}
		return output, nil
	default:
		return nil, ErrInvalidTypeForPlaceholderReplacement
	}
}
```
Though not perfect, writing this function gave me the opportunity to learn and understand more about reflection – a challenging but rewarding experience.
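The string-level helper that the function above calls isn't shown here. A minimal sketch of how such a `ReplacePlaceholdersInString` might work (my simplified illustration using a regular expression, not Flowforge's actual implementation) could look like this:

```go
package main

import (
	"fmt"
	"regexp"
)

// placeholderRe matches ${name}-style placeholders.
var placeholderRe = regexp.MustCompile(`\$\{([^}]+)\}`)

// ReplacePlaceholdersInString substitutes every ${name} in s with the
// corresponding entry from values, erroring on unknown names.
// Simplified sketch: a real version would also preserve value types
// when the whole string is a single placeholder.
func ReplacePlaceholdersInString(s string, values map[string]any) (string, error) {
	var firstErr error
	out := placeholderRe.ReplaceAllStringFunc(s, func(m string) string {
		// Extract the name between ${ and }.
		name := placeholderRe.FindStringSubmatch(m)[1]
		v, ok := values[name]
		if !ok {
			if firstErr == nil {
				firstErr = fmt.Errorf("no value provided for placeholder %q", name)
			}
			return m // leave the placeholder untouched on error
		}
		return fmt.Sprint(v)
	})
	return out, firstErr
}

func main() {
	out, err := ReplacePlaceholdersInString(
		"https://api.example.com/projects/${projectId}/envs/${env}",
		map[string]any{"projectId": "p-123", "env": "staging"},
	)
	if err != nil {
		panic(err)
	}
	fmt.Println(out) // https://api.example.com/projects/p-123/envs/staging
}
```

The recursive `ReplacePlaceholders` then only has to worry about JSON structure, while all the string matching lives in one place.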
Pipeline execution was really hard to design. One of my main design decisions was to de-couple the main backend API server from the pipeline executor as much as possible. I did not want the backend server to be negatively affected just because the pipeline execution was faulty or slow. To achieve this, I decided to use an event-driven architecture through a lightweight and simple event library https://github.com/gookit/event. To start the execution of a pipeline or to resume it upon approval, the backend API server would emit the relevant event which would subsequently be handled by the executor.
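The decoupling itself is simple to sketch. Below is a minimal, self-contained toy bus that stands in for the pattern (Flowforge uses gookit/event; this sketch only illustrates the emit/handle relationship, and the event name and payload fields are made up):

```go
package main

import (
	"fmt"
	"sync"
)

// Bus is a minimal synchronous event bus: handlers subscribe to event
// names, and Fire invokes every handler registered for that name.
type Bus struct {
	mu       sync.RWMutex
	handlers map[string][]func(payload map[string]any)
}

func NewBus() *Bus {
	return &Bus{handlers: make(map[string][]func(map[string]any))}
}

// On registers a handler for the named event.
func (b *Bus) On(name string, h func(payload map[string]any)) {
	b.mu.Lock()
	defer b.mu.Unlock()
	b.handlers[name] = append(b.handlers[name], h)
}

// Fire emits the named event to every registered handler.
func (b *Bus) Fire(name string, payload map[string]any) {
	b.mu.RLock()
	hs := b.handlers[name]
	b.mu.RUnlock()
	for _, h := range hs {
		h(payload)
	}
}

func main() {
	bus := NewBus()
	// The executor subscribes to pipeline events...
	bus.On("pipeline.start", func(p map[string]any) {
		fmt.Println("executor starting pipeline", p["pipelineId"])
	})
	// ...while the API server only emits events and never calls the
	// executor directly, so a slow executor cannot block a request handler.
	bus.Fire("pipeline.start", map[string]any{"pipelineId": "pl-42"})
}
```

In a real setup the handler would run the pipeline asynchronously; the point is that the API server's only dependency on the executor is the event name.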
Apart from decoupling the API server from the executor, it was a challenge to figure out how to pass data between steps while keeping functions as simple as possible and making each do only one thing (e.g., one function for preparing execution, another for the actual execution, and another for post-execution). Designing the pipeline steps themselves was also a great difficulty, especially the "Make API Call" step, which could be customised by the pipeline creator.
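One way to sketch that three-phase, single-responsibility split is a step interface plus a shared context that carries data between steps (the type and method names here are my illustration, not Flowforge's actual code):

```go
package main

import "fmt"

// StepContext carries data between steps: each step reads earlier
// outputs and records its own.
type StepContext struct {
	Outputs map[string]any
}

// Step splits a step's lifecycle into three single-purpose phases.
type Step interface {
	Prepare(ctx *StepContext) error     // resolve parameters, validate inputs
	Execute(ctx *StepContext) error     // do the actual work
	PostExecute(ctx *StepContext) error // record results, clean up
}

// EchoStep is a trivial step that copies one output key to another.
type EchoStep struct{ From, To string }

func (s EchoStep) Prepare(ctx *StepContext) error {
	if _, ok := ctx.Outputs[s.From]; !ok {
		return fmt.Errorf("missing input %q", s.From)
	}
	return nil
}
func (s EchoStep) Execute(ctx *StepContext) error {
	ctx.Outputs[s.To] = ctx.Outputs[s.From]
	return nil
}
func (s EchoStep) PostExecute(ctx *StepContext) error { return nil }

// RunPipeline drives each step through its three phases in order,
// stopping at the first error.
func RunPipeline(ctx *StepContext, steps []Step) error {
	for _, s := range steps {
		for _, phase := range []func(*StepContext) error{s.Prepare, s.Execute, s.PostExecute} {
			if err := phase(ctx); err != nil {
				return err
			}
		}
	}
	return nil
}

func main() {
	ctx := &StepContext{Outputs: map[string]any{"in": "hello"}}
	if err := RunPipeline(ctx, []Step{EchoStep{From: "in", To: "out"}}); err != nil {
		panic(err)
	}
	fmt.Println(ctx.Outputs["out"]) // hello
}
```

Keeping each phase tiny makes steps like "Make API Call" easier to customise: the configurable parts live in `Prepare`, while `Execute` stays generic.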
Some things I would like to improve on if I had more time:

- Using Go's `context` package better. I was using it mainly as a way of passing data about the step and the entire pipeline across the various functions. However, I feel that it could have been used more effectively and in line with its original design purpose.

Having read the book Designing Data-Intensive Applications by Martin Kleppmann, I was inspired to create a log-based table that tracks the execution of every step in a pipeline. Every status change, every error, and every success is recorded in this append-only table along with the relevant metadata, which allows pipeline executions to be tracked and audited. It might seem like a simple, or obvious, design decision, but implementing it was a great and fun learning experience. Designing the table and writing the SQL queries to retrieve the relevant data was enjoyable.
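As a rough illustration, such an append-only execution log might look like the following (the table and column names are my guesses at the shape of the idea, not Flowforge's actual schema):

```sql
CREATE TABLE step_execution_log (
    id          BIGSERIAL   PRIMARY KEY,
    pipeline_id TEXT        NOT NULL,
    step_name   TEXT        NOT NULL,
    status      TEXT        NOT NULL, -- e.g. STARTED, SUCCESS, FAILED
    detail      JSONB,                -- error message or step output metadata
    recorded_at TIMESTAMPTZ NOT NULL DEFAULT now()
);
-- Rows are only ever inserted, never updated, so the table doubles
-- as an audit trail: the full history of a run is a simple
-- SELECT ... WHERE pipeline_id = $1 ORDER BY recorded_at.
```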
As a team, we tried our best to use Github's native project management tools to manage our tasks and milestones. Throughout this project, I have come to appreciate the difficulties of being a project manager. It was tiring enough to be coding after work and being the main architect of the entire project, but to also have to manage the team and meet our self-set deadlines added another layer of complexity and work. Constant check-ins, trying my best to create small and well-defined Github issues for my team mates to work on, and ensuring everyone was on the same page were some of the challenges I faced. I'm glad to have had this wonderful learning experience.
Building Flowforge was a great and challenging experience. I learned a lot about Go, TypeScript, and Next.js, and also about project management. I'm glad to have had the opportunity to work on this project and I hope that perhaps one day, someone will find it useful and contribute to it. If you have any questions or feedback, feel free to reach out to me at joshua.tyf.career@gmail.com. I'm always happy to chat more about tech and projects.
– Josh