Leveraging Application Load Balancer (ALB) to Mock APIs for Test Automation with Zero Application Code Changes

“Mock Test APIs In a Distributed Application Environment with AWS Serverless Stack and Zero Application Code Changes”    

 

In the enterprise world, your application might depend on many other subsystems, and most of the time you have zero ownership of them. The real issue surfaces when your application depends on data from those subsystems: the data keeps changing, so you cannot rely on a fixed data set for your test automation. This becomes a constant bottleneck for your QA team as well. How can you overcome this situation?

As developers, we have many options. One quick solution that comes to mind is a set of pre-defined mock APIs: if the client requests a matching URL pattern, the backend intercepts the request and returns a mock data set. The problem with this approach is that it requires code changes in the backend system, and you need a mechanism to keep such interceptors or conditions from being pushed to production. That is a constant overhead.

We set out to implement a solution with zero changes to our application code and virtually no operation and maintenance cost. Since our application is deployed on AWS, we decided to use a few serverless services to address this requirement. Let's walk through it step by step.

 

Using AWS Application Load Balancer (ALB)

The usual deployment pattern for an enterprise web application is to have a load balancer in front of the application servers. In AWS, the Application Load Balancer (ALB) lets you act on client requests before they reach the application servers. We had been using the AWS Classic Load Balancer, but the richer feature set of the ALB convinced us to migrate, and we used some of those features in this implementation.

 

Mock Small Size Response

 

An ALB has listeners, and for each listener you can configure rules. A listener rule can return a fixed response.

Example: you need to mock the order history data for a certain account, so whenever a user submits the request URL below, you want to return the same fixed data set every time. To configure this, use the fixed-response listener rule action.

IF the URL path is /PMT/api/fetchOrderGuide/067-123456 (you can add other rule conditions to filter the request further),

THEN return a fixed response (add your fixed response body).

It’s as simple as that. However, there is a limit on the response body length: 1,024 characters. If your response body is larger than that, move on to the next option.
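Before moving on, if you prefer to script this rule rather than configure it in the console, a rough sketch with the AWS SDK for JavaScript could look like the following. The listener ARN, priority, and response body below are placeholders, not values from our setup.

// Sketch: create an ALB listener rule that returns a fixed JSON response.
// The ARN, priority, and payload are placeholders.
const AWS = require('aws-sdk');
const elbv2 = new AWS.ELBv2({ region: 'us-east-1' });

elbv2.createRule({
  ListenerArn: 'arn:aws:elasticloadbalancing:us-east-1:123456789012:listener/app/my-alb/xxxx/yyyy',
  Priority: 10,
  Conditions: [
    { Field: 'path-pattern', Values: ['/PMT/api/fetchOrderGuide/067-123456'] }
  ],
  Actions: [
    {
      Type: 'fixed-response',
      FixedResponseConfig: {
        StatusCode: '200',
        ContentType: 'application/json',
        // Remember: the fixed response body is limited to 1,024 characters
        MessageBody: JSON.stringify({ accountId: '067-123456', orders: [] })
      }
    }
  ]
}).promise()
  .then(() => console.log('Fixed-response rule created'))
  .catch(console.error);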

 

Medium Size Response

 

If you have a medium-sized response (less than roughly 10,000 characters), combining ALB listener rules with API Gateway will do the job. Here is how you achieve it.

There is a rule action called Redirect that redirects requests from one URL to another. Once you have that redirect capability, you need a component that can produce the response with minimal effort. Since API Gateway is fully managed by AWS and has built-in support for mock integrations, it is a natural choice as the response producer.

In API Gateway (APIG), create a mock API. For testing purposes you can use an existing API and follow the steps below, or follow this link to start from scratch.

  1. Create a resource in the APIG (ex: /fetchOrderguide)
  2. Attach an HTTP GET method to the resource, selecting Mock as the integration type
  3. In the GET method execution pane, click the Integration Response box
  4. Expand the HTTP 200 row and click Mapping Templates
  5. Click application/json in the Content-Type section
  6. Add the mock response body in the text area and save (a scripted equivalent is sketched after this list)
  7. Deploy the API to the stage you wish and note down the endpoint URL of the newly created GET method
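For reference, here is a rough sketch of the same mock setup scripted with the AWS SDK for JavaScript. It assumes the resource and GET method already exist, and the REST API id, resource id, stage name, and payload are placeholders rather than values from our environment.

// Sketch: configure a mock integration on an existing GET method.
// restApiId, resourceId, and the payload are placeholders.
const AWS = require('aws-sdk');
const apigateway = new AWS.APIGateway({ region: 'us-east-1' });

const params = { restApiId: 'abc123def4', resourceId: 'res456', httpMethod: 'GET' };

async function createMockIntegration() {
  // Mock integration: API Gateway answers without calling any backend.
  await apigateway.putIntegration({
    ...params,
    type: 'MOCK',
    requestTemplates: { 'application/json': '{"statusCode": 200}' }
  }).promise();

  await apigateway.putMethodResponse({ ...params, statusCode: '200' }).promise();

  // The integration response mapping template carries the mock payload itself.
  await apigateway.putIntegrationResponse({
    ...params,
    statusCode: '200',
    responseTemplates: {
      'application/json': JSON.stringify({
        accountId: '067-123456',
        orders: [{ orderId: 'ORD-001', status: 'DELIVERED' }]
      })
    }
  }).promise();

  // Deploy to a stage so the endpoint URL becomes available.
  await apigateway.createDeployment({ restApiId: params.restApiId, stageName: 'test' }).promise();
}

createMockIntegration().catch(console.error);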

The mock API exposed via API Gateway is now ready to use. You can test it through the API Gateway console if needed. By default, API Gateway and its APIs are public; if your organization's security compliance requires it, you may need to make them private. You can follow this AWS resource to make your API Gateway private.

The final step for medium-sized responses is to link the mock API to the ALB. As shown in the diagram below, add the filtering rules and then use the Redirect action. In that section, provide the API URL details and keep the other options as per the diagram.
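If you want to script that rule as well, a minimal sketch with the AWS SDK for JavaScript could look like this; the listener ARN, priority, path pattern, and API Gateway host are placeholders.

// Sketch: redirect matching requests from the ALB to the mock API in API Gateway.
// The ARN, priority, host name, and paths are placeholders.
const AWS = require('aws-sdk');
const elbv2 = new AWS.ELBv2({ region: 'us-east-1' });

elbv2.createRule({
  ListenerArn: 'arn:aws:elasticloadbalancing:us-east-1:123456789012:listener/app/my-alb/xxxx/yyyy',
  Priority: 20,
  Conditions: [
    { Field: 'path-pattern', Values: ['/PMT/api/fetchOrderGuide/*'] }
  ],
  Actions: [
    {
      Type: 'redirect',
      RedirectConfig: {
        Protocol: 'HTTPS',
        Host: 'abc123def4.execute-api.us-east-1.amazonaws.com',
        Port: '443',
        Path: '/test/fetchOrderguide',   // stage + resource of the mock API
        Query: '',
        StatusCode: 'HTTP_302'           // temporary redirect
      }
    }
  ]
}).promise()
  .then(() => console.log('Redirect rule created'))
  .catch(console.error);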

Now you are all set to test your API via Postman. If a request matches the listener rule conditions, the ALB redirects it to API Gateway, which uses the mock integration to generate the response and return it to the client.

Note: if you need to test this from your frontend (browser), you will need to start the browser with CORS checks disabled. Follow this link to start Chrome without CORS.

 

Large Size Response

 

API Gateway mock integrations have a response size limit as well. If you try to add a very large response, you will get this warning:

“The resource being saved is too large. Consider reducing the number of modeled parameters, the number of response mappings, or perhaps the size of your VTL templates if used.”

Therefore, you need an alternative. The quickest solution is to use a Lambda function to generate the large response. Lambda functions are serverless and cost very little at this scale.

  1. Create the Lambda function using the AWS console
  2. Give it a function name
  3. Choose Author from scratch and select Node.js 12.x as the runtime (feel free to use any runtime)
  4. Choose an execution role
  5. In the index.js file, add the following handler:

exports.handler = function(event, context) {
  // Return the large mock payload; replace <Add the large response> with your response object
  context.done(null, <Add the large response>);
};

  6. Save and test the function to confirm that you receive the expected response

Now your large response is ready. Next you need to integrate it with API Gateway.

  1. Create a resource in the APIG (ex: /fetchOrderguideGroupView)
  2. Attach an HTTP GET method to the resource, selecting Lambda Function as the integration type
  3. Provide the ARN of the newly created Lambda function in the Lambda Function text box
  4. Click Save
  5. Test the API to verify that you get the response from the Lambda function (a scripted sketch of this wiring follows below)
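If you want to script this wiring instead of using the console, a rough sketch with the AWS SDK for JavaScript might look like the following. It assumes the resource and GET method already exist, and every id and ARN shown is a placeholder.

// Sketch: point an existing GET method at the Lambda function and allow
// API Gateway to invoke it. All ids and ARNs are placeholders.
const AWS = require('aws-sdk');
const apigateway = new AWS.APIGateway({ region: 'us-east-1' });
const lambda = new AWS.Lambda({ region: 'us-east-1' });

const params = { restApiId: 'abc123def4', resourceId: 'res789', httpMethod: 'GET' };
const functionArn = 'arn:aws:lambda:us-east-1:123456789012:function:mockOrderGuideGroupView';

async function wireLambdaIntegration() {
  // Non-proxy Lambda integration; the handler's return value becomes the response body.
  await apigateway.putIntegration({
    ...params,
    type: 'AWS',
    integrationHttpMethod: 'POST', // Lambda functions are always invoked with POST
    uri: `arn:aws:apigateway:us-east-1:lambda:path/2015-03-31/functions/${functionArn}/invocations`
  }).promise();

  await apigateway.putMethodResponse({ ...params, statusCode: '200' }).promise();
  await apigateway.putIntegrationResponse({ ...params, statusCode: '200' }).promise();

  // Grant API Gateway permission to invoke the function.
  await lambda.addPermission({
    FunctionName: functionArn,
    StatementId: 'allow-apigateway-invoke',
    Action: 'lambda:InvokeFunction',
    Principal: 'apigateway.amazonaws.com'
  }).promise();
}

wireLambdaIntegration().catch(console.error);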

 

All good now. Repeat the final step from the medium-sized response section to link this API to the ALB using the Redirect rule action. Test with Postman to verify that the integration works as expected.

There may be other alternatives, but this approach met our requirement of mocking APIs for test automation quickly, serverlessly, and with zero changes to the application code.

May we all be well, happy and peaceful, May no harm come to you!


Making Your Enterprise Application 100% Serverless with AWS

There was an era in which we all fussed about cloud computing; however, right now the hype is mainly about serverless computing. In this article, I will brief you about serverless computing and share my experience in working with some serverless technologies that my team and I used to develop enterprise solutions.

My list of topics is as follows; each has a quick introduction to the technology used, along with some web links we looked at when integrating these into our final solution.

  • Serverless Computing
  • Requirement
  • Architecture & AWS Services
  • Lambda Functions for Microservices and BFF
  • API Gateway
  • Cognito for User Federation
  • ECS Fargate for Long Running Tasks
  • AWS Code Pipeline & Code Build for CI/CD
  • Other Services

[Please visit this link to get details on each of the above topics]


What makes an Enterprise SOA?

In this post we will identify the key tools or components that work together in an Enterprise Service Oriented Architecture solution.

In the enterprise SOA world, several key components work together to ensure the SOA solution works as expected. Let's identify them and build a basic understanding of each; in upcoming posts we will dive deeper into these components.

 

As we can see in the diagram above, the key components are as follows:

Application Server

(Apache Tomcat, Oracle WebLogic, GlassFish, JBoss, IBM WebSphere, Jetty)

In the developer's world, it is the server to which we deploy our enterprise applications or services. Looking a little deeper, it is a server program that consists of three tiers:

  • Client tier: one or more applications, APIs, or browsers
  • Middle tier: consists of a web server and an EJB server
  • EIS (Enterprise Information System) tier: deployed applications, files, and databases

As mentioned above, in the SOA world we deploy our web services and web applications on these servers. Some services may be exposed to the public through the API manager, while others are used to build the composite services that make up a proper SOA solution for the enterprise.

 

Message Broker or Messaging solutions

(RabbitMQ, ActiveMQ, SonicMQ, Kafka)

Message brokers come into play when we require reliable messaging between applications. The key concepts behind them are queues and topics.

Queue: point-to-point model, so each message is delivered to only one consumer

Topic: with the publisher-subscriber model, each message published to the topic is delivered to all subscribers registered to that topic
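As a quick illustration of the difference, here is a minimal sketch in Node.js using the amqplib client against a RabbitMQ broker (one of the brokers listed above); the connection URL, queue name, and exchange name are placeholders.

// Sketch: point-to-point (queue) vs publish-subscribe (topic/fanout) with RabbitMQ.
// Connection URL, queue name, and exchange name are placeholders.
const amqp = require('amqplib');

async function demo() {
  const conn = await amqp.connect('amqp://localhost');
  const ch = await conn.createChannel();

  // Queue: point-to-point. Even with several consumers, each message goes to only one of them.
  await ch.assertQueue('orders');
  ch.sendToQueue('orders', Buffer.from(JSON.stringify({ orderId: 'ORD-001' })));

  // Topic-style pub-sub: every queue bound to the exchange receives a copy of each message.
  await ch.assertExchange('order-events', 'fanout', { durable: false });
  const { queue: subscriberQueue } = await ch.assertQueue('', { exclusive: true });
  await ch.bindQueue(subscriberQueue, 'order-events', '');
  ch.publish('order-events', '', Buffer.from('order ORD-001 shipped'));

  ch.consume(subscriberQueue, (msg) => console.log('received:', msg.content.toString()), { noAck: true });
}

demo().catch(console.error);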

In addition, some key features of a message broker are:

  • Ability to retain (store) messages when the receiver is unavailable or slow to consume them
  • Ability to route messages to one or many destinations
  • Ability to decouple the message publisher and the receiver

 

ESB – Enterprise Service Bus

(Oracle Service Bus, WSO2 ESB, BizTalk, Mule ESB, IBM WebSphere ESB)

The ESB is one of the main components in enterprise SOA solutions. If someone asked you for a single term to describe the role of an ESB in an SOA context, it would be "mediator". Let's find out why we call the ESB a mediator.

The core concept of an ESB is to integrate different applications (services) by placing the ESB in the middle as the mediator through which those applications communicate with each other. This supports one of the core concepts of SOA: loose coupling of services that can still interact with each other, which increases organizational agility by reducing time to market for new initiatives.

Key features of an ESB are as follows:

  • Message routing between services by reading the content of the messages
  • Message transformation (ex: SOAP to REST and vice versa)
  • Transport protocol negotiation between multiple formats (such as HTTP, JMS, JDBC)
  • Ability to configure security and monitoring policies for services

 

Registry

(WSO2 Governance Registry, Oracle Registry)

The registry is an information catalog that is constantly updated with information about the different services in a service-oriented architecture project. It helps govern the SOA solution by storing, cataloging, and indexing metadata related to services, so that they can be easily managed and governed through this component.

The SOA registry supports the UDDI specification and is a main component in SOA governance. However, with the emergence of the API manager, the registry component is slowly being phased out of the SOA component set.

 

API Manager

(WSO2 API Manager, Apigee, Azure API Management)

In the current SOA component stack, the API management component plays a major role. As we discovered in earlier posts, most applications now expose services/APIs. The API manager is also capable of covering some registry features.

To define API management: it is the process of publishing, documenting, and overseeing application programming interfaces (APIs) in a secure, scalable environment. The goal of API management is to allow an organization that publishes an API to monitor the interface's lifecycle and make sure the needs of developers and applications using the API are being met.

Key Features:

  • Automate and control connections between an API and the applications that use it
  • Monitor traffic of each exposed API
  • Add security policies for APIs
  • A single repository to discover APIs within the enterprise SOA boundary

 

BPEL Process Manager

(Oracle BPEL Process Manager, Apache ODE, ExpressBPEL, jBPM)

A key concept of SOA is constructing services from other reusable services. To achieve that goal, we use a BPEL process manager component in SOA solutions. It provides a layer to orchestrate and combine multiple services into the task services that cover the business processes of the SOA domain. In the BPEL manager, we use BPEL (Business Process Execution Language) to define those complex orchestration processes. We would say the BPEL manager is a core component of an enterprise SOA solution.

Key Features:

  • Initially based on XML Schema, SOAP, and WSDL; it now also supports REST and JSON
  • Send and receive messages asynchronously from remote services
  • Manipulate XML and JSON data using XSLT
  • Manage events and exceptions in the process flow
  • Ability to design parallel flows
  • Compensation: undo portions of a process when an exception occurs during the flow
  • Version control

 

BAM – Business Activity Monitoring

(Oracle BAM, WSO2 BAM)

BAM describes the processes and technologies that enhance situation awareness and enable analysis of critical business performance indicators based on real-time data. BAM is used to improve the speed and effectiveness of business operations by keeping track of what is happening and making issues visible quickly.

With all of these components working together, we can run a complete SOA solution in our enterprise.
