Building API Integrations, Part 1: Using API Clients
As the number of integrations you have to build grows, consuming all those third-party APIs at scale becomes a technical challenge.
Integrating one or two APIs is not (always) complicated, but when it comes to building dozens of integrations, each providing business-critical functionality to your application, things get more complex (and interesting!). From then on, you realize that every API is a single point of failure that needs to be managed accordingly; dealing with technical debt and security becomes paramount.
In this series of blog posts, we are going to trace the typical developer’s journey, mapping out the decisions required to build and maintain API integrations.
The unspoken choice: using API clients
The journey with an API starts with some play and discovery, usually through the API’s developer portal. Once you grasp a general idea of the possibilities, the fun starts: coding!
What we’ve learned from many industry discussions is that the time allocated to building an integration is almost never enough. That is usually due to a mix of misunderstandings with product or business decision-makers about the time required, combined with overconfidence on the engineer’s part.
How difficult can it be to consume an API, after all? It’s not like it’s something new, right? Sadly, it is difficult, and it’s at that precise moment of realization that an unspoken, yet critical, decision is made:
“We’ve found an API client for X API, let’s use it to deliver the integration faster.”
Using a dedicated API client is one of the best ways to build up technical and security debt over time. Most of the time, it is just a thin wrapper over REST calls. Ideally, the chosen API client is provided and maintained by the API vendor, but more often it comes from a third party or the community.
For this reason, among others, we strongly advocate against using dedicated API clients.
Imagine yourself consuming dozens of APIs (not an unusual situation): it means relying on dozens of dependencies, plus their own dependencies in turn. That translates into numerous updates and security patches over time. Add to that the fact that every API call will look different in your codebase, and you quickly realize this may not be the ideal way.
Would you query your database using a different client for every table? Different methods? Just because data comes from an API doesn’t mean you should suddenly forget all your best practices.
From that single unspoken decision, the impact over time is going to be dramatic.
So, what is the solution?
First, the keep-it-simple-stupid (KISS) approach would tell us to query the API with a regular HTTP client (axios, HTTParty, etc.). Yes, it is going to be more verbose, but is that really a bad thing?
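As a minimal sketch of that approach, here is a direct call with no dedicated client involved. The `api.example.com` host and the invoices endpoint are hypothetical placeholders, and `fetchImpl` is injectable so the function can be exercised without a real network call; in practice you would pass the built-in `fetch` (Node 18+) or an equivalent like axios.

```javascript
// Query a hypothetical third-party API directly with a plain HTTP call.
// No per-vendor client library, just a URL, headers, and error handling.
async function getInvoices(apiToken, fetchImpl = fetch) {
  const response = await fetchImpl("https://api.example.com/v1/invoices", {
    headers: {
      Authorization: `Bearer ${apiToken}`, // most REST APIs use bearer tokens
      Accept: "application/json",
    },
  });
  if (!response.ok) {
    // Surface HTTP-level failures instead of silently returning bad data.
    throw new Error(`Request failed with status ${response.status}`);
  }
  return response.json();
}
```

More verbose than `client.invoices.list()`, certainly, but every detail of the request is visible in your own code, with zero extra dependencies to patch.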
The ideal solution is to build a dedicated service responsible for querying all your third-party APIs. This way, every integration relies on the same patterns, structure, and dependencies; you can improve this service over time and, by design, improve all your integrations with it. Depending on your preferred architecture, you can think of it as a microservice or as a module.
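A rough sketch of what such a service could look like, under some assumptions of ours: provider names, base URLs, and tokens below are illustrative, and the transport is injectable for testing. The point is that every integration funnels through one `request` method, so logging, retries, and error handling can later be added in a single place.

```javascript
// One service fronting all third-party APIs, configured per provider.
class ApiService {
  // providers: { name: { baseUrl, token } }; fetchImpl defaults to the
  // built-in fetch (Node 18+) but can be swapped out in tests.
  constructor(providers, fetchImpl = fetch) {
    this.providers = providers;
    this.fetchImpl = fetchImpl;
  }

  // Every integration calls this single method, so cross-cutting
  // concerns (auth, retries, logging) live in exactly one place.
  async request(provider, path, { method = "GET", body } = {}) {
    const { baseUrl, token } = this.providers[provider];
    const url = new URL(path, baseUrl).toString();
    const response = await this.fetchImpl(url, {
      method,
      headers: {
        Authorization: `Bearer ${token}`,
        "Content-Type": "application/json",
      },
      body: body ? JSON.stringify(body) : undefined,
    });
    if (!response.ok) {
      throw new Error(`${provider} responded with ${response.status}`);
    }
    return response.json();
  }
}
```

Usage then looks the same for every integration, e.g. `api.request("billing", "/v1/invoices")`, regardless of which vendor sits behind the name.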
And finally, you can also give our universal API client a try if you don’t want to reinvent the wheel 😅.
As we will discover in Part 2, this kind of solution opens up many new options for managing your integrations that wouldn’t have been possible before.
Discuss this article on Twitter and ping us @BearerSH