What is a web service proxy? Well, a few years back in the heyday of SOAP, software development tooling included tools you could point at a WSDL file to generate a file or set of files containing the code needed to call the SOAP endpoint. These files were typically included in the application calling the SOAP endpoint. This was convenient as it avoided having to handcraft code to build SOAP XML envelopes, pass in the correct parameters, and create the data types that were part of the signature or interface of the remote API being called. In those days it was normal for each application to generate its own proxy and include it as a 'client' in the application.
What is a web service proxy these days? Well, SOAP is not as popular as it once was, and now REST is all the rage. JSON has largely replaced XML as a platform- and language-agnostic way of passing data and data structures around between distributed applications, servers, and clients. It's taken a few years, but the REST analog of WSDL has finally evolved/arrived: software tooling now supports "pointing" at an OpenAPI Specification compliant JSON file, which, much like WSDL did, provides the metadata about a REST service (its endpoints/methods, data types, etc.), and once again generates a file or files that can be included in the client or calling application to call the REST endpoint.
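To make that concrete, here is a minimal sketch of the kind of code such a generator might emit from an OpenAPI spec. All names here (Order, OrdersApi, the /orders/{id} path) are illustrative, not from any real spec; the point is that the typed model, the method name, and the URL construction all come from the spec's metadata rather than being handwritten.

```typescript
// Hypothetical excerpt of generator output; names are illustrative only.

interface Order {
  id: string;
  total: number;
}

// The generator derives the method name, path, and return type from the
// spec, so none of this has to be handcrafted by the calling application.
class OrdersApi {
  constructor(
    private baseUrl: string,
    // The transport is injectable so callers (and tests) can supply their own.
    private fetchFn: typeof fetch = fetch,
  ) {}

  async getOrder(id: string): Promise<Order> {
    const res = await this.fetchFn(
      `${this.baseUrl}/orders/${encodeURIComponent(id)}`,
    );
    if (!res.ok) throw new Error(`HTTP ${res.status}`);
    return (await res.json()) as Order;
  }
}
```

In practice you would not write this class at all; a tool regenerates it whenever the spec changes, which is exactly why (as argued below) the output costs next to nothing.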
What's a shared library? A shared library is a mechanism for creating re-usable software components. Re-use, while challenging, is important. It provides a means for a business to gain ROI across the organization rather than writing every application from scratch, which creates a number of problems, of which I'll mention just two. The first is that not re-using code creates a very high cost of doing business. The second is that duplicating code also creates a very high cost of doing business. Code is constantly changing, and changes include bugs. Maintaining multiple copies of code, updating them, and fixing bugs in all of them costs more than re-use. Using existing things lowers cost in the long run. Now, there are times, albeit few and far between, when duplication does make sense. That's fodder for a whole different article/post.
Should a proxy be a shared library? Should it be re-used? The answer is no. Well, why "no"? Aren't we duplicating code? Yes, we're duplicating code, but we didn't pay any cost to write that code. It was generated at zero or near-zero cost. Since it cost next to nothing to generate, re-use is not the goal here. There's no ROI.
Even so, why can't multiple applications make use of the same copy of a proxy? Here's why. Unless you version a shared proxy library the way you version other reusable components, sharing it creates a false dependency that forces the business to pay an unnecessary cost. What unnecessary cost? The cost of re-integrating and regression testing applications that have no interest in the new version of the proxy that gets generated when new endpoints appear or the features/semantics of existing endpoints change, which is inevitable. Code rarely stops changing and growing its capabilities. Only the applications that need those new capabilities need the new proxy. When an API changes, it should change in a backwards-compatible way, so existing applications continue to work with no code changes. No one would want to use an API that requires constant client code changes every time it changes. Those clients should not be forced to adopt a new version of a generated proxy at the moment that new proxy is generated, which would create, at a minimum, the cost of recompiling and redeploying every other application that consumes the same API and shared proxy. At worst, they may need regression testing. Now, changes to an API naturally require regression testing on the server side, but they should not require regression testing of the client applications' logic!
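A quick sketch of why a backwards-compatible (purely additive) API change leaves old clients untouched: a client written against "v1" of a response shape keeps working when the server starts returning extra fields, because JSON deserialization simply ignores what the client never asked for. The field names here (CustomerV1, loyaltyTier) are invented for illustration.

```typescript
// A client's view of the API response, written against "v1" of the API.
// All names are hypothetical.
interface CustomerV1 {
  id: string;
  name: string;
}

// Later, the server adds a field (say "loyaltyTier"). A payload carrying the
// extra field still deserializes fine for the old client, which copies only
// the fields it knows about and ignores the rest.
function parseCustomer(json: string): CustomerV1 {
  const data = JSON.parse(json);
  return { id: data.id, name: data.name };
}

const v2Payload = '{"id":"c42","name":"Acme","loyaltyTier":"gold"}';
const customer = parseCustomer(v2Payload);
```

This is the sense in which existing applications need no code change, no new proxy, and no recompile when the API grows additively.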
Should businesses pay this unnecessary cost? No. If something is unnecessary, why pay a cost for it, even a low one? Cost is cost. Even in these days of DevOps, with automated builds and deployments, it still requires scheduling and managing those builds and deployments. Just as when some shared library is upgraded, whether ours or a third party's, we don't want to be forced to update, recompile, redeploy, and possibly regression test. If we don't want NuGet to force that on us, why would we force it on ourselves by trying to re-use machine-generated code?
This doesn't even touch on the likelihood that a generated proxy to an API with a fat interface violates Robert C. Martin's Interface Segregation Principle. Hopefully your dev team is smart enough to put the generated proxy behind a thin interface rather than creating a false dependency on the fat interface of the API and otherwise coupling the client application to one specific API implementation. But again, that's another topic.
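The thin-interface approach can be sketched as follows. Every name here is hypothetical; GeneratedBillingClient just stands in for whatever the tooling emitted, and the adapter is the one app-owned seam between your code and it.

```typescript
// What the generator might emit: a fat client exposing every endpoint.
// (Hypothetical stand-in for real generated output.)
class GeneratedBillingClient {
  getInvoice(id: string): string {
    return `invoice:${id}`;
  }
  listInvoices(): string[] {
    return [];
  }
  createInvoice(total: number): string {
    return `new:${total}`;
  }
  // ...dozens more endpoints this application never calls
}

// The application depends only on what it actually needs
// (interface segregation), not on the generated fat interface.
interface InvoiceReader {
  fetchInvoice(id: string): string;
}

// A thin adapter binds the narrow interface to the generated proxy.
// Regenerating the proxy, or swapping API implementations, means
// touching only this one class.
class BillingProxyAdapter implements InvoiceReader {
  constructor(private client: GeneratedBillingClient) {}

  fetchInvoice(id: string): string {
    return this.client.getInvoice(id);
  }
}
```

The rest of the application holds only an InvoiceReader, so a regenerated proxy never ripples past the adapter.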