With an increasing number of enterprises adopting an API-centric approach to digital transformation, it is worthwhile to take a step back and look at how APIs evolved, along with the present and future outlook on what APIs are destined for.


This blog post was originally published for Rakuten RapidAPI

Software has evolved from specific subroutines to complex Internet-scale applications. Back in the days of mainframe computers, software was merely a collection of small subroutines, each performing a specific task. Fast forward to today, and software models real-world operations and interacts with people over the Internet. In the mainframe world, during the pre-Internet era, the interface for interacting with software was a set of commands fired from a terminal. In the Internet age, especially after the proliferation of the World Wide Web, this interface has morphed into what we commonly refer to as an API.

In this blog post, we cover a brief historical perspective on the evolution of APIs, followed by the current technological trends that are spearheading the revolution around APIs and the API economy. Finally, we address the future of APIs and the upcoming trends in the API economy that developers and enterprises can leverage to achieve broader business goals.

In case you need a more fundamental grasp of APIs, read our introductory post on what an API is to get your basics right.

Brief History of APIs

The abstract concept behind an API, as a programmable access point to invoke another application, has been in existence since the early days of computer science. However, the ways and means of implementing them have gone through many transformations over the last few decades. 

The history of web-based APIs, as we know them today, can be traced back to the late 1990s when Salesforce launched its web-based sales automation tool. This application marked the beginning of the SaaS (Software as a Service) revolution. The World Wide Web powered the underlying infrastructure enabling this newfound way of delivering software.

Before the World Wide Web and the Internet, APIs did exist, albeit mostly in the form of proprietary protocols supporting small distributed computing networks that spanned a limited area. 

Whether you look at APIs in the pre- or post-Internet era, the purpose remains the same. An API enables an API provider to expose its services, such that external systems can call the API provider to consume those services. At an Internet scale, where developers from across the world build applications and host them over the World Wide Web, APIs are the future of the distributed computing paradigm.

Evolution of APIs Alongside the World Wide Web

At the dawn of the Internet era, the potential of the World Wide Web to deliver services over this new medium was well understood. As a result, many companies sprang up to offer services by transforming the brick-and-mortar model of delivering products or services into a virtual store.

In those early days, SOA (Service Oriented Architecture) was the preferred model for building distributed systems. Traditional SOA deployments were limited to organizational boundaries. However, a few tech pioneers realized the importance of syndicating data over the WWW, crossing the boundaries of organizational silos, and utilizing data from multiple sources to offer valuable insights to customers.

The First Iteration of API Evolution

Perhaps Salesforce was one of the first companies to realize this data-led transformation. Their web-based sales automation tool was, in effect, the first iteration of an API service. This application was based on XML, as XML, at that time, was touted to be the de facto format for exchanging data over the Internet.

The XML-based data interchange format was later standardized as SOAP. SOAP provided a specification for exchanging structured data in XML format, including message formats and encoding rules for processing API requests and responses. It was a much-needed innovation, because XML had evolved as a free-form markup language, and there was a need to standardize XML-based data exchange for better interoperability of APIs over the web.

APIs in Web 2.0 World

With Web 2.0, web-based applications leapfrogged into complete suites of software products hosted on the web. This was the time when software no longer played the role of an assistant to humans; it modeled a real-world operation.

Consider CRM software, which is supposed to assist you in managing your customer lifecycle. In the Web 1.0 era, a web-based CRM was at best capable of providing a form for entering data and reports to track progress and KPIs. With Web 2.0, the software played an active role in analyzing the customer lifecycle process, the operations, and the outliers, to provide more proactive inputs. In this way, it modeled a real-world agent analyzing customer journeys across different channels of interaction and alleviated the chores of a human operator to a great extent.

Evolution of APIs from Web 1.0 to Web 2.0

Much of this transformation was due to the emergence of REST (REpresentational State Transfer). The REST architecture is a set of guidelines and recommendations for defining API interfaces that model real-world objects. This is, in a way, similar to how software architectures have evolved from procedural to object-oriented paradigms, with individual software components representing an object. Similarly, REST envisages that API endpoints should represent a virtual resource that models a real-world object. For instance, if you build an API service to manage a todo list, here is how you would define the API endpoints in REST, compared to a non-REST architecture.
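To make the contrast concrete, here is a minimal sketch of such a todo-list service written in Python with Flask; the routes, the in-memory store, and the service itself are illustrative assumptions rather than a prescribed implementation. The REST-style endpoints expose todo items as a resource manipulated through HTTP verbs, while the RPC-style endpoints use a separate verb-named URL for each operation.

```python
# A minimal, illustrative sketch (Flask, in-memory storage) contrasting REST-style
# endpoints with RPC-style endpoints for a hypothetical todo-list service.
from flask import Flask, jsonify, request

app = Flask(__name__)
todos = {}      # hypothetical in-memory store: id -> {"id": ..., "task": ...}
next_id = 1

# REST style: a single /todos resource; the HTTP verb selects the operation.
@app.route("/todos", methods=["GET"])
def list_todos():
    return jsonify(list(todos.values()))

@app.route("/todos", methods=["POST"])
def create_todo():
    global next_id
    todo = {"id": next_id, "task": request.json["task"]}
    todos[next_id] = todo
    next_id += 1
    return jsonify(todo), 201

@app.route("/todos/<int:todo_id>", methods=["DELETE"])
def delete_todo(todo_id):
    todos.pop(todo_id, None)
    return "", 204

# Non-REST (RPC) style: one verb-named endpoint per operation, typically all POST.
@app.route("/getTodoList", methods=["POST"])
def get_todo_list():
    return jsonify(list(todos.values()))

@app.route("/createTodo", methods=["POST"])
def create_todo_rpc():
    return create_todo()

if __name__ == "__main__":
    app.run()
```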

REST Architecture

Roy Fielding originally proposed REST as part of his Ph.D. dissertation in 2000, which envisaged a new style of software architecture driven by network interfaces identified by API endpoints. Although it took some time for the wider world to take notice, the developer community realized the benefits of this architecture and adopted it wholeheartedly. In the following years, this led to the concept of Open APIs, with developers publicly hosting APIs for others to invoke via a programmable interface, following the guidelines defined under REST.

Emerging Trends in APIs with Web 3.0

The Web is progressing fast, and we are currently witnessing the third wave of evolution with Web 3.0. Web 3.0 is all about more engaging web experiences, with seamless integration of humans and machines, powered by IoT, AI, and ML. Web 3.0 is also about a distributed web architecture powered by Blockchain, which is more trustworthy and gives power to the masses instead of a few greedy corporations.

What is the impact of APIs in Web 3.0?

There is a need to re-imagine APIs in the Web 3.0 scenario. Traditionally, APIs have always followed the request-response paradigm: an application sends requests to an API, and the API sends a response back to the application.

However, in a hyper-interactive web application, with humans and machines collaborating in real time, a request-response paradigm limits the capabilities. What we need in this case is an event-driven paradigm, where APIs themselves can initiate the interaction.

Request Response vs Event Driven

The Rise of API Economy

Somewhere along this evolutionary phase of the web, someone said: “Data is the new oil.” The phrase is credited to mathematician Clive Humby. We all know the connotation of this phrase, as we experience it every day while living our virtual lives.

If data is indeed equivalent to oil, then APIs are the pipelines, and API backends are the refineries.

API Economy

With Web 3.0, online services have evolved from a software-based delivery model to a thing-based model. This is the transformation of SaaS into XaaS (Everything-as-a-Service), wherein any tangible object can be offered as a service. The most obvious example of this is Airbnb, where you offer a spare room in your house as a short-term rental service. This is also known as the sharing economy, which encourages people to utilize shared resources instead of owning them. Uber is another example of XaaS, specifically Transportation as a Service, leveraging the sharing economy for cars.

The emergence of XaaS gives rise to the concept of digital twins. Now the software is not only modeling a real-world operation; it is representing a real-world thing. Therefore, it also needs APIs to query the state and well-being of the thing. In this way, the API economy has morphed into a full-fledged business case around a service offered by the thing.

Just like the evolution of the WWW and the corresponding iterative improvements in APIs, the business models driven by the API economy have also gone through a few phases, which can be summarized as follows.

1. APIs that feed and analyze data: This was the first iteration of the API-centric business model. APIs were the gatekeepers of data aggregated from different sources. This data was analyzed and then offered as a service to any API consumer interested in the data. These APIs are mostly some form of data API that provides access to public information about different industries.

APIs that feed and analyze data

2. APIs as an extension to the business process: This is an enhancement of the first iteration, where the APIs provide a more robust backend operation. They perform heavy-duty processing and integrate into the workflow of the API consumer.

APIs as an extension to the business process

3. APIs that manage the entire operational and service aspects of a business: These are highly intelligent APIs that are capable of making decisions on behalf of the user. They also interact with the end user, via some form of event or notification mechanism, giving rise to a new dimension of user/developer experience driven by APIs.

Impact of the Underlying Protocols on APIs

Any discussion on APIs is incomplete without a mention of the technological underpinnings that support the delivery of API services. A lot happens behind the scenes to transfer the bits and bytes that carry API data. Just like a telecom service provider needs to think about the underlying wireless technology (4G or 5G) for delivering mobile services, an API provider also needs to think about the underlying technology delivering the API service.

Unlike the telcos, API providers have not had many choices. HTTP has been the de facto protocol driving the web since the early days of the Internet. It continues to evolve and mature to support newer forms of web applications. With the mandate to bring in HTTP version 3 to support a better Internet experience on wireless networks, APIs are slated to gain a lot.

Drawing parallels with the evolution of the web and APIs, HTTP has also undergone similar phases of advancement.

1. HTTP 1.0 & HTTP 1.1: These were the early versions of HTTP that ruled the web during the 1990s. HTTP was a text-based protocol with support for headers, content negotiation, and authentication. However, the protocol had a lot of inefficiencies, such as repetitive headers, TCP slow start, and head-of-line blocking.

2. HTTP 2: HTTP/2 took a long time in the making. Standardized in 2015, it introduced some much-needed enhancements to the protocol in line with the burgeoning growth of the web: binary framing, streams, and multiplexing with prioritization support. It also introduced support for server push, a much-needed feature for event-driven APIs, although other alternatives have also cropped up to support this capability. For more details, take a look at this in-depth guide on HTTP/2.

3. HTTP 3: HTTP/3 is currently an IETF draft specification, but will be standardized soon. HTTP/3 runs over a new UDP-based transport protocol called QUIC, which performs better under high latency and intermittent network conditions, similar to what we experience in poor mobile wireless coverage areas. From a syntax and semantics point of view, it is similar to HTTP/2; however, there are changes in protocol framing to adapt to the QUIC transport instead of TCP. If you are interested in exploring more, here is a concept guide on HTTP 3.

While HTTP has served its purpose well, there is one aspect of it that continues to hamper the evolution of event-driven APIs. Being a protocol based on the request-response paradigm, HTTP always requires clients to initiate a connection with the server. This means that event-driven APIs, which need to send notifications from servers to clients, cannot rely on HTTP alone. This issue has been a bane for web application developers for many years. The standards bodies have taken proactive steps to introduce additional protocols to address it.

WebSockets – WebSocket is an IETF-standardized protocol that supports bi-directional communication over the web. With the help of WebSockets, a browser can open a dedicated connection with the server so that the server can push data to the browser without the browser having to continually poll the server. WebSocket is standardized as RFC 6455. For a more in-depth understanding of WebSockets, you can check out this concept guide on WebSockets.
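As a rough sketch of what an event-driven API consumer can look like with WebSockets, the snippet below uses the third-party Python `websockets` package to open a long-lived connection and print whatever the server pushes; the endpoint URL is a placeholder, not a real service.

```python
# Minimal sketch of a WebSocket client receiving server-pushed events.
# Uses the third-party "websockets" package; the URL is a placeholder.
import asyncio
import websockets

async def listen_for_events():
    # One long-lived, bi-directional connection; no polling required.
    async with websockets.connect("wss://example.com/notifications") as ws:
        async for message in ws:   # delivered whenever the server pushes an event
            print("event received:", message)

if __name__ == "__main__":
    asyncio.run(listen_for_events())
```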

WebSub – WebSub is another protocol that aims to provide a real-time communication backend for web applications. It is based on a distributed publish-subscribe pattern, which means that multiple WebSub-enabled peers can collaborate via an intermediary known as a hub. For more in-depth coverage, refer to the concept guide on WebSub.
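At the wire level, a WebSub subscriber registers its interest by sending a form-encoded request to the hub. The sketch below shows such a subscription request using the Python `requests` library; the hub, topic, and callback URLs are placeholder assumptions.

```python
# Minimal sketch of a WebSub subscription request (all URLs are placeholders).
import requests

resp = requests.post(
    "https://hub.example.com/",  # the WebSub hub
    data={
        "hub.mode": "subscribe",
        "hub.topic": "https://publisher.example.com/feed",          # content to follow
        "hub.callback": "https://myapp.example.com/webhooks/feed",  # where updates get pushed
    },
)
# A hub typically replies 202 Accepted and then verifies the callback
# before pushing content notifications to it.
print(resp.status_code)
```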

A Look at the Future of APIs

We have covered the general progression and developments around the web, APIs, and the underlying protocols that power the API economy.

What does this mean for the advancement of the APIs? Are we going to see new trends, or will newer use cases and business models emerge around the API economy?  

Here are some predictions from our side on the state of APIs going into the new decade.

1. Better API Experience on Mobile Devices: With HTTP/3, the underlying transport layer of the TCP/IP protocol stack is undergoing a complete revamp to support better transmission of packets under poor and fluctuating network coverage. This is good news for app developers, who form a large chunk of the API consumption market. They can now incorporate more APIs into mobile apps to improve their time to market. Thanks to the standardization efforts around HTTP/3 and QUIC, API access is going to be faster, and mobile apps are going to perform significantly better even when subjected to patchy wireless coverage.

2. New Business Models Around Event-Driven APIs: With event-driven APIs, API providers have an active role to play in serving API consumers. It also means a shift of business logic execution and application state management to the API provider's end instead of the API consumer's. As an example, when you schedule a trip on Uber, the app automatically assigns a driver at the scheduled time and notifies you. Behind the scenes, this is an event-driven API making decisions on your behalf to serve you. Although this is a trivial example, with the use of AI and analytics, more sophisticated event-driven API-based services can be built to address different applications.

3. Better Standardization for APIs: While the protocols for implementing APIs are standardized, there is a need for end-to-end standardization of how APIs are defined and described in an interoperable way. This is required to enable a consistent interpretation of an API specification, both by humans and by machines. The OpenAPI Initiative (OAI) is a consortium of companies and experts working towards defining an API specification that is programming-language agnostic. This will foster the development of tools and documentation in a standardized format that can be converted into configurable steps for launching APIs, or even executable code to implement the API backend logic.
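To give a flavor of what such a language-agnostic description looks like, here is a tiny OpenAPI 3.0 document for the hypothetical todo-list API sketched earlier, held as a Python dictionary that could be serialized to JSON or YAML for tooling to consume; the title and operations are illustrative.

```python
# A tiny, illustrative OpenAPI 3.0 description of the hypothetical todo-list API.
import json

openapi_doc = {
    "openapi": "3.0.3",
    "info": {"title": "Todo API", "version": "1.0.0"},
    "paths": {
        "/todos": {
            "get": {
                "summary": "List all todo items",
                "responses": {"200": {"description": "A JSON array of todo items"}},
            },
            "post": {
                "summary": "Create a todo item",
                "responses": {"201": {"description": "The created todo item"}},
            },
        }
    },
}

print(json.dumps(openapi_doc, indent=2))
```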

4. Rise of Microservices for Delivering APIs: Microservices have changed the landscape of how software is deployed. The architecture allows developers to build independent, loosely coupled pieces of code that are linked to API endpoints. According to SmartBear's State of API 2019 report, microservices are going to be the biggest technological contributor to the adoption of APIs. In the years to come, we will see more and more API backend frameworks being developed around the microservices architecture.
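As a rough illustration of this pattern, the sketch below defines a small single-purpose service with FastAPI, one framework commonly used for this style; the service scope, route, and data are assumptions. In a microservices deployment, each such service would own one narrow slice of functionality behind its API endpoint, and FastAPI also generates an OpenAPI document for it automatically.

```python
# Minimal sketch of a single-purpose microservice exposing one API endpoint.
# FastAPI; the service scope, route, and returned data are illustrative.
from fastapi import FastAPI

app = FastAPI(title="Inventory Service")

@app.get("/items/{item_id}")
def read_item(item_id: int):
    # A real service would query its own datastore here.
    return {"item_id": item_id, "in_stock": True}

# Run with: uvicorn inventory_service:app
# The generated OpenAPI document is served at /openapi.json.
```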

5. API Driven Architecture for End User Apps: Steve Jobs famously said, “There's an app for that” when the Apple App Store kick-started the app revolution during the latter part of the 2000s. But given the proliferation of API-driven services in the last few years, it might also be worthwhile for some developers to build and host APIs instead of building apps. This gives rise to API-driven architecture. So the next time an app developer wants to add a new feature to their app, they have the option to write the code themselves, or someone might just tell them, “There is an API for that.”

API Driven Development

Rakuten RapidAPI has the world's largest API marketplace, with over ten thousand APIs for accomplishing simple to complex tasks. If you are a developer looking to leverage APIs in your software development lifecycle, then with a single Rakuten RapidAPI subscription key you can access multiple APIs to fast-track your development schedule. If you are a data analyst, then you have a plethora of options to get publicly available data about different industries such as financial markets, real estate, travel, and more. Take a look at some of our popular API collections around communication services, language translation, and machine vision.

6. APIs as Information Superhighways for Enterprises: Enterprises already realize the importance of breaking down data silos to share information across departments in a more seamless way. Any cross-functional or company-wide initiative requires this kind of data syndication to support big data analytics. APIs provide the pathways to access the data. However, in an enterprise setup, this is easier said than done, because API providers in this case can be internal departments as well as external APIs. As an example, if an organization has a large field force and wants to host an internal API to monitor their performance, it might have to mash up that data with an external weather API, since weather affects the performance of the field force.
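As a rough sketch of such a mashup, the snippet below joins per-agent records from a hypothetical internal field-force API with conditions from a hypothetical external weather API; every URL, parameter, and field name here is an assumption for illustration only.

```python
# Illustrative sketch: enriching internal field-force data with external weather data.
# All endpoints, parameters, and field names are hypothetical.
import requests

FIELD_FORCE_API = "https://internal.example.com/api/field-force/performance"
WEATHER_API = "https://weather.example.com/api/current"

def performance_with_weather(region: str, api_key: str) -> list:
    # Internal API: per-agent performance figures for a region.
    agents = requests.get(FIELD_FORCE_API, params={"region": region}).json()
    # External API: current weather for the same region.
    weather = requests.get(WEATHER_API, params={"q": region, "key": api_key}).json()
    # Mash up the two sources so analysts can correlate performance with conditions.
    return [{**agent, "weather": weather.get("condition")} for agent in agents]
```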

More and more enterprises are likely to put processes and systems in place to address the issues surrounding the governance and usage of APIs. This means that there has to be a system in place to orchestrate the use of APIs by the different departments of an organization.

APIs for Enterprise

In the coming years, organizations are likely to deploy such solutions to build their own micro API economy. This will enable them to seamlessly harvest data from across the organization and convert that into meaningful business insights and monetization opportunities.

The Rakuten RapidAPI Enterprise Hub is the one-stop solution for enterprises to manage APIs. The Enterprise Hub enables organizations to define a centralized policy and governance framework for accessing both internally hosted APIs and external, third-party APIs. It comes with comprehensive analytics and monitoring capabilities to keep track of API consumption, along with a rich set of features for API governance at scale. For more details, send an inquiry.

About the author 

Shyam Purkayastha

Shyam is the Creator-in-Chief at RadioStudio. He is a technology buff and is passionate about bringing forth emerging technologies to showcase their true potential to the world. Shyam guides the team at RadioStudio, a bunch of technoholics, to imagine, conceptualize, and build ideas around emerging trends in information and communication technologies.

{"email":"Email address invalid","url":"Website address invalid","required":"Required field missing"}
TechForCXO Weekly Newsletter
TechForCXO Weekly Newsletter

TechForCXO - Our Newsletter Delivering Technology Use Case Insights Every Two Weeks

>