Serverless applications can be extremely cost effective because the customer is charged only for the memory and CPU cycles the function actually consumes.

Service Summary

To help your organisation unlock the potential of serverless architectures, Mobilise will analyse your current application landscape and review which components can be containerised or split into microservices. Mobilise will select the serverless and microservices components that best fit your current landscape to ensure your application delivery is performant, highly scalable, cost-efficient and resilient. 


Benefits of Serverless Architectures

By opting for a serverless approach, customers can realise the following benefits:

  • Rapid development and iteration – serverless computing is ideal for companies that need to quickly develop, prototype, and iterate. Development is quicker since there are no dependencies on IT operations, and functions are single threaded, which makes them simpler to debug and deploy. The build process is broken down into smaller, more manageable chunks, increasing the number of changes that can be pushed through the Continuous Delivery pipeline. The result is rapid deployment, faster iteration, more customer feedback and better product-market fit.
  • Enhanced scalability – traditional architectures force developers to make a choice: provision servers just in case demand reaches certain levels, or under-provision and take a risk. Serverless scales automatically to meet demand exactly, giving applications increased availability and stability.
  • Operational cost savings – with a serverless architecture, you pay only for the runtime of your functions; there is no concept of idle resources. Compared with a traditional infrastructure model, this can bring significant cost savings because you are charged only when your application runs.
  • Developer productivity – with a serverless architecture, developers can focus on writing code rather than managing the operational tasks of the application, leaving them free to build innovative features and concentrate on the core business logic that matters most to the business.

Our Partner Serverless Architecture Solutions

Amazon Web Services

Lambda
AWS Lambda is a compute service that lets you run code without provisioning or managing servers. AWS Lambda executes your code only when needed and scales automatically, from a few requests per day to thousands per second. You pay only for the compute time you consume - there is no charge when your code is not running. With AWS Lambda, you can run code for virtually any type of application or backend service - all with zero administration.
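
For illustration, a minimal Node.js Lambda handler might look like the sketch below. The event fields and return shape are assumptions made for the example, not a prescribed interface:

  // Minimal sketch of a Node.js/TypeScript Lambda handler (illustrative only).
  // Lambda invokes the exported handler on demand and bills only for its run time.
  export const handler = async (event: { orderId?: string }) => {
    console.log("Received event:", JSON.stringify(event));

    // Illustrative business logic: acknowledge the order contained in the event.
    return {
      status: "processed",
      orderId: event.orderId ?? "unknown",
    };
  };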

API Gateway
Amazon API Gateway is a fully managed service that makes it easy for developers to create, publish, maintain, monitor, and secure APIs at any scale. With a few clicks in the AWS Management Console, you can create an API that acts as a “front door” for applications to access data, business logic, or functionality from your back-end services, such as workloads running on Amazon Elastic Compute Cloud (Amazon EC2), code running on AWS Lambda, or any Web application.
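
When API Gateway fronts a Lambda function through a proxy integration, the function receives the HTTP request as an event and returns an HTTP-shaped response. As an illustrative sketch (the route and message are placeholders):

  // Sketch of a Lambda handler behind an API Gateway proxy integration.
  // API Gateway supplies the request details in the event and expects a
  // statusCode/headers/body object back.
  export const handler = async (event: { httpMethod: string; path: string }) => {
    return {
      statusCode: 200,
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({ message: `Handled ${event.httpMethod} ${event.path}` }),
    };
  };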

Microsoft Azure

Functions
Azure Functions is an event-driven, compute-on-demand experience that extends the existing Azure application platform with capabilities to implement code triggered by events occurring in virtually any Azure or third-party service, as well as on-premises systems. Azure Functions allows developers to take action by connecting to data sources or messaging solutions, making it easy to process and react to events. Azure Functions scales based on demand and you pay only for the resources you consume.
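
As an illustration, an HTTP-triggered function using the Node.js programming model might look like the sketch below (the greeting logic and parameter names are placeholders):

  import { AzureFunction, Context, HttpRequest } from "@azure/functions";

  // Sketch of an HTTP-triggered Azure Function (illustrative only).
  // The HTTP trigger binding itself is declared in an accompanying function.json file.
  const httpTrigger: AzureFunction = async function (context: Context, req: HttpRequest): Promise<void> {
    const name = req.query.name || "World";
    context.res = {
      status: 200,
      body: `Hello, ${name}!`,
    };
  };

  export default httpTrigger;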

Queue Storage
Azure Queue storage provides cloud messaging between application components. In designing applications for scale, application components are often decoupled so that they can scale independently. Queue storage delivers asynchronous messaging for communication between application components, whether they are running in the cloud, on the desktop, on an on-premises server, or on a mobile device. Queue storage also supports managing asynchronous tasks and building process workflows.
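
For illustration, one component might enqueue work items for another to process asynchronously, along the lines of the sketch below (the connection string, queue name and message contents are placeholders):

  import { QueueServiceClient } from "@azure/storage-queue";

  // Sketch: enqueue a work item so a decoupled component can process it later.
  async function enqueueWorkItem(connectionString: string): Promise<void> {
    const serviceClient = QueueServiceClient.fromConnectionString(connectionString);
    const queueClient = serviceClient.getQueueClient("orders");

    await queueClient.createIfNotExists();  // create the queue on first use
    await queueClient.sendMessage(JSON.stringify({ orderId: "12345", action: "process" }));
  }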

API Management
Microsoft Azure API Management is a turnkey solution for publishing APIs to external and internal consumers. Quickly create consistent and modern API gateways for existing back-end services hosted anywhere, secure and protect them from abuse and overuse, and gain insights into usage and health.
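
From a consumer's point of view, a published API is typically called through the API Management gateway with a subscription key, for example (the URL and key below are placeholders):

  // Sketch: calling an API published through Azure API Management (illustrative).
  // API Management identifies the subscriber via the Ocp-Apim-Subscription-Key header.
  async function getOrders(subscriptionKey: string): Promise<unknown> {
    const response = await fetch("https://example.azure-api.net/orders", {
      headers: { "Ocp-Apim-Subscription-Key": subscriptionKey },
    });
    return response.json();
  }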

Google Cloud Platform

Cloud Functions
Google Cloud Functions is a serverless execution environment for building and connecting cloud services. With Cloud Functions you write simple, single-purpose functions that are attached to events emitted from your cloud infrastructure and services. Your Cloud Function is triggered when an event being watched is fired. Your code executes in a fully managed environment. There is no need to provision any infrastructure or worry about managing any servers.

Cloud Functions are written in JavaScript and execute in a Node.js v6.9.1 environment on Google Cloud Platform. You can take your Cloud Function and run it in any standard Node.js runtime, which makes both portability and local testing a breeze.
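
As a simple illustration, an HTTP-triggered Cloud Function receives Express-style request and response objects (the function name and greeting below are placeholders):

  import type { Request, Response } from "express";

  // Sketch of a single-purpose HTTP Cloud Function (illustrative only).
  // Cloud Functions calls this handler for each incoming HTTP request.
  export function helloHttp(req: Request, res: Response): void {
    const name = (req.query.name as string) || "World";
    res.status(200).send(`Hello, ${name}!`);
  }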

Spring and Spring Boot

The Spring framework provides a comprehensive programming and configuration model for modern Java-based enterprise applications, on any kind of deployment platform. A key element of Spring is infrastructural support at the application level. Spring focusses on the deployment of enterprise applications so that developers can focus on application-level business logic, without unnecessary ties to a specific deployment environment.

Fat Jars
Java does not provide any standard way to load nested Jar files (i.e. Jar files that are themselves contained within a Jar), which can be problematic if the aim is to deploy a self-contained distributed application. To further increase the simplicity of deploying applications using the Spring framework, Spring Boot gives the option of deploying code into Fat Jars – a self-contained executable Java package which includes all project dependencies. This executable can be easily deployed and can be run on any server with a JVM.

Docker

Docker is a tool designed to make it easier to create, deploy, and run applications by using containers. Containers allow a developer to package up an application with all of the parts it needs, such as libraries and other dependencies, and ship it all out as one package. Because of the container, the application will run on any machine regardless of any customised settings that machine might have that could differ from the machine used for writing and testing the code. The container approach also allows developers to run applications packaged as Docker containers on serverless platforms such as AWS Lambda.

Docker Containers
A container image is a lightweight, stand-alone, executable package of a piece of software that includes everything needed to run it: code, runtime, system tools, system libraries, settings. Available for both Linux and Windows based apps, containerised software will always run the same, regardless of the environment. Containers isolate software from its surroundings, for example differences between development and staging environments and help reduce conflicts between teams running different software on the same infrastructure.

Kubernetes

We love Kubernetes' flexible approach to containerisation, as it provides the ability to deploy services in highly available clusters and is truly cloud agnostic.

Multiple clouds? No problem. Deploy Docker images to Kubernetes clusters wherever they live.

Kubernetes is making a huge impact in the container world. We are excited about the flexibility it offers our customers: the ability to deploy onto multiple clouds while using a standardised CI/CD toolchain.

Kubernetes is an open-source system for automating deployment, scaling, and management of containerised applications.

It groups containers that make up an application into logical units for easy management and discovery. Kubernetes builds upon 15 years of experience of running production workloads at Google, combined with best-of-breed ideas and practices from the community.

Designed on the same principles that allow Google to run billions of containers a week, Kubernetes can scale without increasing your ops team. Whether testing locally or running a global enterprise, Kubernetes' flexibility grows with you to deliver your applications consistently and easily, no matter how complex your needs are. Kubernetes is open source, giving you the freedom to take advantage of on-premises, hybrid, or public cloud infrastructure, letting you effortlessly move workloads to where it matters to you.

“…it was great to have Mobilise Cloud Services managing our web infrastructure during the European Championships – we had a massively successful campaign on the pitch which generated millions of hits at peak moments without any service slowdown and with 100% availability”

Rob Dowling, New Media Manager
Football Association of Wales