The AWS re:Invent 2019 conference, like previous editions, was full of interesting talks, including breakout sessions aimed at familiarising participants with a particular technical problem concerning the Amazon Web Services cloud. One of these presentations inspired me to write a few words about the security of applications created in the serverless model. Business confidence in this architecture is rising steadily, not only among start-ups and small firms but also in large organisations. There are many indications that this trend will continue in the coming years and that the number of serverless production deployments will grow. This is why it is a good idea to take a closer look at the tools that AWS makes available to help programmers, administrators and architects minimise the risk of a data leak or of losing control over their software.

When deciding to implement an application in the serverless model, you share responsibility for its security with the cloud service provider. This is known as the Shared Responsibility Model [1]. Amazon Web Services must take care of securing the entire computing and network infrastructure, the correct configuration of operating systems and part of the software (hypervisor, firewall, etc.), the installation of updates and security patches, and the encryption of data and network traffic. The client, on the other hand, as the direct user of AWS services, is responsible for implementing their program in such a way as to prevent potential attackers from accessing confidential resources or performing undesirable actions. In other words, the client is responsible for errors in the code and business logic of the application, as well as for proper monitoring of important parameters and event logging.

User authentication

Currently, almost every business application has an access control mechanism. You want to make sure that the user really is who they claim to be and that they only have access to the resources they should have. The most common authentication method is a static password, usually stored in a database. Of course, storing passwords as plaintext is unacceptable.

A similarly bad solution is hashing them with weak cryptographic hash functions that have long since ceased to be considered secure, such as MD5 or SHA-1. If you take security seriously, you should turn to one of the modern password hashing algorithms, e.g. bcrypt or PBKDF2. They have built-in key stretching and salting mechanisms, which makes them very resistant to brute-force, dictionary and rainbow-table attacks.
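As a minimal sketch of the idea (using Python's standard library rather than a dedicated bcrypt package), a salted PBKDF2 hash can be derived and verified like this; the iteration count shown is an illustrative choice, not a recommendation:

```python
import hashlib
import hmac
import os

def hash_password(password: str, salt: bytes = None, iterations: int = 600_000):
    """Derive a PBKDF2-HMAC-SHA256 hash; a fresh random salt defeats rainbow tables."""
    if salt is None:
        salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password: str, salt: bytes, expected: bytes, iterations: int = 600_000) -> bool:
    """Recompute the hash and compare in constant time to avoid timing side channels."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, expected)
```

Only the salt and digest are stored; the key stretching (the iteration count) is what makes brute-force attacks expensive.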

It is also possible to authenticate users without storing passwords in any database, or sending them between the client and the server, thanks to the SRP (Secure Remote Password) protocol. Put simply, instead of a password it requires storing a verifier, a value calculated with the formula v = g^x mod N, where g is the so-called group generator, N is a sufficiently large prime number, and x is a value derived from the user's name, password and salt using a one-way hash function.

A potential attacker who wants to break the protocol has to calculate x, in other words, to find a discrete logarithm. This problem is extremely hard to compute even when the N and g values are known (assuming that N meets certain criteria). The SRP protocol can therefore be an interesting alternative to the common practice of storing password hashes in a database, though it is also demanding to implement.
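To make the formula concrete, here is a toy sketch of computing an SRP verifier in Python. The parameters below are deliberately tiny and insecure, and the exact construction of x differs between SRP versions; a real implementation would use a large safe prime such as one of the groups defined in RFC 5054:

```python
import hashlib

# Toy parameters for illustration ONLY -- real SRP uses a large safe prime
# (e.g. a 2048-bit group from RFC 5054) and a matching generator.
N = 2305843009213693951  # 2^61 - 1, a Mersenne prime; far too small for real use
g = 2

def compute_verifier(username: str, password: str, salt: bytes) -> int:
    """v = g^x mod N, where x is hashed from the username, password and salt."""
    inner = hashlib.sha256(f"{username}:{password}".encode()).digest()
    x = int.from_bytes(hashlib.sha256(salt + inner).digest(), "big")
    return pow(g, x, N)  # modular exponentiation; x itself is never stored
```

The server stores only (salt, v); recovering the password from v requires solving the discrete logarithm problem described above.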

In this case, do you need to hire a programmer with knowledge of modular arithmetic to enjoy a secure authentication process in your application? In the serverless world, there is no need for that. Amazon has already hired such programmers to do all the dirty work for you. This is why you can (and should) use the Amazon Cognito service, which frees you from the responsibility of storing user access data.

Thanks to libraries for many popular programming languages, it also facilitates the implementation of authentication logic with or without the SRP protocol. Cognito allows you to create your own user pool and to integrate with external identity providers such as Google, Facebook, Amazon or any other provider that supports the OpenID Connect or SAML protocol. This supports one of the good security practices, centralised identity management, which can reduce the number of possible attack vectors.

From the application's point of view, there is then only a single identity store, even if in reality user data are kept in many places. Using Amazon Cognito in your serverless system is a step in the right direction if you want to achieve a high level of security.

Access control to REST API

Your serverless application probably uses (or will use) REST API to exchange data between different components.

In the AWS cloud there is a dedicated service for creating and managing API endpoints, called Amazon API Gateway. Although here, too, you share responsibility for security with the provider, remember that you are the one responsible for correct configuration. One of the key elements is the so-called authorizer, a mechanism built into API Gateway that controls access to your API. Let us consider a piece of simple application architecture presented in the diagram below.

Cloud architecture

The client application sends requests to one or many REST API endpoints exposed via the Amazon API Gateway service, which then calls the appropriate Lambda functions. If the application has an authentication mechanism, you will probably want only logged-in users to be able to send requests to the API and thus access restricted resources. The case is simple when you use the above-mentioned Amazon Cognito service.

However, to better understand how restricting access to the API through the Cognito integration works, let us start with a quick explanation of what a JWT (JSON Web Token) [2] is. It is a standard that describes a secure method of exchanging information in the form of a JSON object, which is encoded and cryptographically signed, making you (almost) certain that the data comes from the expected source.

A typical JWT token consists of three parts – the header, the payload and the signature – divided by a dot character. After the user authenticates, the Amazon Cognito service returns three JWT tokens to your application:

  • ID token
  • Access token
  • Refresh token

The first two contain claims about the user's identity and are valid for an hour after generation. The refresh token allows the ID and access tokens to be generated again after they expire, without requiring additional user authentication. Cognito generates two pairs of RSA keys for each user pool; one of the private keys is used to create the token's digital signature.

Knowing the public key, both your client application and API Gateway can easily verify whether the user presenting a given JWT comes from the right pool and whether they are trying to impersonate another user. Any attempt to tamper with the string of characters that makes up the token invalidates the signature.
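To illustrate the token's structure (not its verification, which in this setup API Gateway performs against the pool's public keys), a JWT can be split into its three base64url-encoded parts like this:

```python
import base64
import json

def decode_jwt_unverified(token: str):
    """Split a JWT into its decoded header and payload.
    The signature is NOT checked here; in production that check is essential."""
    header_b64, payload_b64, _signature_b64 = token.split(".")

    def b64url_decode(part: str) -> bytes:
        # JWTs strip base64 padding, so restore it before decoding
        return base64.urlsafe_b64decode(part + "=" * (-len(part) % 4))

    return json.loads(b64url_decode(header_b64)), json.loads(b64url_decode(payload_b64))
```

Because the header and payload are merely encoded, not encrypted, a JWT should never carry secrets; its security rests entirely on the signature in the third part.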

With this knowledge, we can now explain how the authorizer integrated with a Cognito pool works. Let us look at a modified version of the architecture diagram presented earlier.

Expanded architecture

This time the request sent by the client application to the REST API contains an additional HTTP header named Authorization. Its value is the ID or access token returned to the application earlier in response to a successful user authentication. Before passing the request on to the Lambda function, the API Gateway service makes sure that the token was signed with the key tied to the correct Cognito pool and that its validity period has not expired. If the verification fails, the request is rejected.

Therefore, even if a potential attacker guesses the endpoint's URL, they will not be able to successfully send any request to the API without being a registered user of your application. The solution is simple, elegant and secure, and it is yet another argument in favour of using Amazon Cognito.

Another available type of authorizer is the Lambda authorizer, helpful whenever you cannot use Cognito. As the name suggests, this solution uses a Lambda function that is executed when a request arrives at an endpoint. The function contains your own request verification logic, based on the provided token or on the headers and parameters sent with the request. The Lambda must return an object containing an IAM policy that tells API Gateway whether to accept or reject the request.
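A minimal sketch of such a TOKEN-type authorizer handler might look as follows; the hard-coded token comparison is a placeholder for your own verification logic (e.g. validating a JWT or looking the token up in a store):

```python
def lambda_authorizer(event, context=None):
    """Sketch of a TOKEN-type Lambda authorizer.
    API Gateway passes the Authorization header value in 'authorizationToken'
    and the invoked method's ARN in 'methodArn'."""
    token = event.get("authorizationToken", "")

    # Placeholder check -- substitute real token validation here
    effect = "Allow" if token == "expected-secret-token" else "Deny"

    return {
        "principalId": "user",
        "policyDocument": {
            "Version": "2012-10-17",
            "Statement": [{
                "Action": "execute-api:Invoke",
                "Effect": effect,
                "Resource": event.get("methodArn", "*"),
            }],
        },
    }
```

API Gateway caches the returned policy for a configurable period, so the function is not necessarily invoked on every request.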

Secure Lambda functions

When writing about the security of serverless applications, one cannot omit Lambda functions. As already mentioned, the cloud service provider is responsible for the proper configuration of the execution environment and of the infrastructure it runs on. The programmer, on the other hand, must write code free from vulnerabilities and gaps that can be used as an attack vector. One such vulnerability, listed first in the OWASP Serverless Top 10 [3], is so-called injection, which many will associate with the popular and well-described attack known as SQL Injection.

As a reminder: it consists of the unintended execution of an SQL query (or a fragment of one) placed by an attacker in input data that your application processes without prior filtering. In the serverless model, such data does not have to come directly from the user interface.

Lambda functions are often triggered by a specific event coming from another AWS service, e.g. the creation of a new file in an S3 bucket, the modification of a record in a DynamoDB table, or the appearance of a message in an SNS topic. The event object, which contains a set of information about such an event and is available in the Lambda handler, can in some cases also contain code "injected" by the attacker.

Therefore, every byte of input data that enters your function from any source must be appropriately filtered before being used in an SQL query or a shell command. All popular programming languages have functions or libraries that allow shell commands to be executed from code. From a technical point of view, nothing stops you from doing the same in Lambda functions, but…
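For instance, if a shell command genuinely has to be built from user input in Python, quoting the untrusted value neutralises injected metacharacters (and, better still, pass an argument list to `subprocess.run` so no shell is involved at all). A sketch:

```python
import shlex

def build_shell_command(untrusted: str) -> str:
    """Quote untrusted input before it can reach a shell.
    shlex.quote wraps the value so that ';', '|', '$' etc. lose their meaning."""
    return "ls -l -- " + shlex.quote(untrusted)
```

The preferred alternative is `subprocess.run(["ls", "-l", "--", untrusted])`, which passes the value as a single argv entry and never interprets it as shell syntax.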

To better understand the possible attack vector, let us briefly recall the basics of the Lambda service.

Serverless

Source: Security Overview of AWS Lambda [1]

To execute your functions, AWS uses special virtual machines, so-called microVMs [4]. Each microVM instance can be reused for different functions within one account. Each instance can also contain many execution environments (a kind of container) in which the runtime chosen by the user works, e.g. Node.js, JVM or Python. Execution environments are not shared between different functions but, importantly, they can be reused to start another instance of the same Lambda.

Therefore, if you run a dynamically constructed shell command containing unfiltered input data, an attacker can use it to take control of the execution environment and, through it, of other invocations of the function. This will often allow them to gain access to confidential information stored, for example, in the database or in files under the /tmp folder, or to destroy some of the application's resources. This class of vulnerability is known as RCE (Remote Code Execution) [5].

Besides filtering and validating input data, one of the most important security practices for the AWS Lambda service is applying the principle of least privilege. To implement it correctly, you must make sure that two conditions are met:

  • Each Lambda in your application should have a separate IAM role assigned. Creating one common role for all functions is unacceptable.
  • The role of each Lambda should allow only those operations that are actually being performed. The wildcard (*) symbol should be avoided in policies.

A simple example of this rule in practice: if a particular function's task is to read records from a DynamoDB database, its IAM role must allow only read operations, and only on the specific table needed in this case. In some situations, you can go one step further and restrict access to specific records in the table and to specific attributes of those records [6]. Even if an attacker managed to outsmart you and gain access to the Lambda execution environment, the principle of least privilege would let you greatly limit the damage.
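A policy matching this example might look like the fragment below; the region, account ID and table name are placeholders:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["dynamodb:GetItem", "dynamodb:Query"],
      "Resource": "arn:aws:dynamodb:eu-west-1:123456789012:table/Orders"
    }
  ]
}
```

Note the absence of wildcards: the role can read from exactly one table and cannot write, delete or touch any other resource.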

Storing access data

As you may already know, when using Amazon DynamoDB you do not have to worry about connecting and authenticating with a username and password, as is usually the case with database servers. Communication takes place over the HTTP(S) protocol and each request carries a cryptographic signature. The programmer usually does not need to know the low-level details of this interface, because they can use a convenient, high-level API via the AWS CLI or the AWS SDK packages.

However, in some cases it is necessary to use a different database, which usually means that access credentials need to be stored somewhere within your application. Writing them directly into the Lambda function code is not a good idea, because they will then probably end up in the Git repository. A much better solution is to use Lambda's encrypted environment variables, and better yet, the AWS Secrets Manager service. It allows for the safe storage of passwords, API access keys and other confidential information. A programmer can easily retrieve such data in the Lambda function code by calling a suitable method from the AWS SDK package.
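In boto3, the retrieval is a single `get_secret_value` call; the sketch below uses a hypothetical secret name and keeps the response parsing in a separate helper so it can be shown independently of any AWS connection:

```python
import json

def parse_secret(response: dict) -> dict:
    """Decode the JSON stored in the SecretString field of a
    GetSecretValue response. The call itself would look like:

        import boto3
        client = boto3.client("secretsmanager")
        response = client.get_secret_value(SecretId="prod/orders-db")  # name is hypothetical
    """
    return json.loads(response["SecretString"])
```

Fetching the secret at runtime (rather than baking it into the deployment package) also means it can be rotated without redeploying the function.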

If your application uses a database server or cluster created via the Amazon RDS service, you can use IAM authentication [7]. After configuring the database and the IAM role attached to the Lambda function, a simple AWS SDK method call gives you a temporary access token, valid for 15 minutes, that is used instead of a password in the standard procedure for establishing a connection to the database. The fact that the connection must be encrypted with SSL further increases the security of this solution. However, you should keep certain limitations in mind, e.g. with the MySQL engine, only 200 new connections per second can be established this way.

Summary

It is worth remembering that choosing the serverless model does not completely relieve you of security concerns; however, the tools and services available in the AWS cloud can make the task much easier. A detailed description of all the possible threats in this architecture and of how to prevent them is a topic for at least a sizeable book. This article only touches upon selected subjects and is a good starting point for further broadening your knowledge. I would encourage you to read the supplementary materials and to watch the lecture entitled "Securing enterprise-grade serverless apps", on which this article is based.

  1. Amazon Web Services, Inc., Security Overview of AWS Lambda. An In-Depth Look at Lambda Security.
  2. Auth0, Inc., JSON Web Token Introduction
  3. The OWASP Foundation, OWASP Serverless Top 10
  4. Amazon Web Services, Inc., Firecracker
  5. Yuval Avrahami, Gaining Persistency on Vulnerable Lambdas
  6. Amazon Web Services, Inc., Using IAM Policy Conditions for Fine-Grained Access Control
  7. Amazon Web Services, Inc., IAM Database Authentication for MySQL and PostgreSQL
