python-lambda
A toolkit for developing and deploying serverless Python code in AWS Lambda.
Top Related Projects
- Serverless Python
- Python Serverless Microframework for AWS
- ⚡ Serverless Framework – Effortlessly build apps that auto-scale, incur zero costs when idle, and require minimal maintenance using AWS Lambda and other managed cloud services.
- λ Gordon is a tool to create, wire and deploy AWS Lambdas using CloudFormation
Quick Overview
The python-lambda project is a Python library that simplifies the process of deploying and managing AWS Lambda functions. It provides a set of tools and utilities to help developers create, test, and deploy serverless applications on AWS.
Pros
- Simplifies AWS Lambda Deployment: The library abstracts away the complexities of AWS Lambda deployment, making it easier for developers to focus on writing their application code.
- Supports Multiple Runtimes: python-lambda supports multiple Python runtimes, including Python 2.7, 3.6, 3.7, and 3.8, allowing developers to choose the appropriate runtime for their project.
- Automated Testing: The library includes built-in support for automated testing of Lambda functions, making it easier to ensure the correctness of the application.
- Flexible Configuration: The project allows developers to customize the deployment process by providing a flexible configuration system.
Cons
- Limited to AWS Lambda: The library is specifically designed for AWS Lambda and may not be suitable for other serverless platforms or cloud providers.
- Dependency on AWS SDK: The project relies on the AWS SDK for Python (Boto3), which may introduce additional complexity and dependencies for some projects.
- Learning Curve: While the library aims to simplify the deployment process, there is still a learning curve for developers who are new to AWS Lambda and serverless architectures.
- Potential Vendor Lock-in: By using python-lambda, developers may become more tightly coupled to the AWS ecosystem, which could make it more difficult to migrate to other cloud providers in the future.
Code Examples
Here are a few examples of how to use the python-lambda library:
- Creating a new Lambda function:
```python
from python_lambda import LambdaFunction

def handler(event, context):
    return {"message": "Hello, World!"}

function = LambdaFunction(
    name="my-lambda-function",
    handler="handler.handler",
    runtime="python3.8",
    role="arn:aws:iam::123456789012:role/my-lambda-role",
)
function.deploy()
```
- Invoking a Lambda function:
```python
from python_lambda import LambdaFunction

function = LambdaFunction(name="my-lambda-function")
response = function.invoke({"key": "value"})
print(response)
```
- Running local tests:
```python
from python_lambda import LambdaFunction

def handler(event, context):
    return {"message": "Hello, World!"}

function = LambdaFunction(
    name="my-lambda-function",
    handler="handler.handler",
    runtime="python3.8",
)
function.test(event={"key": "value"})
```
- Configuring environment variables:
```python
from python_lambda import LambdaFunction

function = LambdaFunction(
    name="my-lambda-function",
    handler="handler.handler",
    runtime="python3.8",
    environment={
        "MY_ENV_VAR": "my-value"
    }
)
function.deploy()
```
Getting Started
To get started with the python-lambda library, follow these steps:
- Install the library using pip:
```shell
pip install python-lambda
```
- Create a new Python file (e.g., handler.py) with your Lambda function code:
```python
def handler(event, context):
    return {"message": "Hello, World!"}
```
- Create a new python_lambda.yml configuration file in the root of your project:
```yaml
function_name: my-lambda-function
handler: handler.handler
runtime: python3.8
role: arn:aws:iam::123456789012:role/my-lambda-role
```
- Deploy your Lambda function to AWS:
```shell
python-lambda deploy
```
That's it! You've now deployed your first Lambda function using the python-lambda library. You can further customize the deployment process by modifying the configuration file or using the library's API directly in your Python code.
Competitor Comparisons
Serverless Python
Pros of Zappa
- More comprehensive framework for deploying serverless Python applications
- Supports a wider range of AWS services and integrations
- Offers automatic API Gateway configuration and management
Cons of Zappa
- Steeper learning curve due to more complex configuration options
- May be overkill for simple Lambda function deployments
- Requires more setup and configuration for basic use cases
Code Comparison
Zappa configuration (zappa_settings.json):
```json
{
    "dev": {
        "app_function": "my_app.app",
        "aws_region": "us-west-2",
        "profile_name": "default",
        "project_name": "my-project",
        "runtime": "python3.8"
    }
}
```
python-lambda configuration (config.yaml):
```yaml
region: us-west-2
function_name: my-lambda-function
handler: service.handler
role: arn:aws:iam::123456789012:role/lambda_basic_execution
runtime: python3.8
```
Both projects aim to simplify AWS Lambda deployments for Python applications, but Zappa offers a more feature-rich solution with broader AWS service support. python-lambda focuses on simplicity and ease of use for basic Lambda function deployments. The choice between the two depends on the complexity of your serverless application and your familiarity with AWS services.
Python Serverless Microframework for AWS
Pros of Chalice
- More comprehensive AWS integration, supporting multiple AWS services beyond Lambda
- Built-in CLI for easier deployment and management of serverless applications
- Better support for API Gateway, including automatic route generation
Cons of Chalice
- Steeper learning curve due to more complex features and abstractions
- Less flexibility in project structure compared to python-lambda's simpler approach
- Potentially overkill for small, single-function Lambda projects
Code Comparison
Chalice:
```python
from chalice import Chalice

app = Chalice(app_name='helloworld')

@app.route('/')
def index():
    return {'hello': 'world'}
```
python-lambda:
```python
def lambda_handler(event, context):
    return {
        'statusCode': 200,
        'body': 'Hello, World!'
    }
```
Summary
Chalice offers a more feature-rich environment for developing serverless applications on AWS, with better integration across multiple services. It provides a CLI and abstractions that simplify complex deployments. However, this comes at the cost of a steeper learning curve and potentially less flexibility in project structure.
python-lambda, on the other hand, offers a simpler approach that may be more suitable for developers who prefer more control over their project structure or are working on smaller, single-function Lambda projects. It has a gentler learning curve but lacks some of the advanced features and integrations provided by Chalice.
The choice between the two depends on the project's complexity, the developer's familiarity with AWS services, and the desired level of control over the deployment process.
⚡ Serverless Framework – Effortlessly build apps that auto-scale, incur zero costs when idle, and require minimal maintenance using AWS Lambda and other managed cloud services.
Pros of Serverless
- Supports multiple cloud providers (AWS, Azure, Google Cloud, etc.)
- Extensive plugin ecosystem for additional functionality
- Comprehensive documentation and large community support
Cons of Serverless
- Steeper learning curve due to more complex configuration
- Larger project size and potential overhead for simple applications
- May require additional setup for Python-specific projects
Code Comparison
Python-lambda example:
```python
def handler(event, context):
    return "Hello from Python-lambda!"
```
Serverless example (serverless.yml):
```yaml
functions:
  hello:
    handler: handler.hello

plugins:
  - serverless-python-requirements
```
with the handler defined in handler.py:
```python
def hello(event, context):
    return {"statusCode": 200, "body": "Hello from Serverless!"}
```
Python-lambda focuses on simplicity for Python AWS Lambda functions, while Serverless offers a more comprehensive framework for serverless applications across multiple cloud providers. Python-lambda provides a straightforward approach with minimal configuration, making it easier for Python developers to get started quickly. Serverless, on the other hand, offers more flexibility and features but requires more setup and configuration. The choice between the two depends on the project's complexity, target cloud provider(s), and desired level of customization.
λ Gordon is a tool to create, wire and deploy AWS Lambdas using CloudFormation
Pros of Gordon
- More comprehensive AWS serverless framework, supporting multiple services beyond just Lambda
- Provides a higher-level abstraction for managing serverless applications
- Offers better integration with other AWS services like API Gateway and DynamoDB
Cons of Gordon
- Steeper learning curve due to its more complex architecture
- Less frequently updated compared to Python-Lambda
- May be overkill for simple Lambda function deployments
Code Comparison
Python-Lambda:
```python
from lambda_function import lambda_handler

def test_lambda_handler(event, context):
    result = lambda_handler(event, context)
    assert result['statusCode'] == 200
```
Gordon:
```yaml
- name: hello-world
  type: lambda
  runtime: python3.6
  handler: handler.hello
  memory: 128
  timeout: 30
```
Gordon uses YAML configuration files for defining Lambda functions and other resources, while Python-Lambda focuses on Python code for function implementation and deployment. Gordon's approach allows for more complex serverless architectures, but Python-Lambda's simplicity can be advantageous for straightforward Lambda deployments.
README
Python-lambda is a toolset for developing and deploying serverless Python code in AWS Lambda.
A call for contributors
With python-lambda and pytube both continuing to gain momentum, I'm calling for contributors to help build out new features, review pull requests, fix bugs, and maintain overall code quality. If you're interested, please email me at nficano[at]gmail.com.
Description
AWS Lambda is a service that allows you to write Python, Java, or Node.js code that gets executed in response to events like HTTP requests or files uploaded to S3.
Working with Lambda is relatively easy, but the process of bundling and deploying your code is not as simple as it could be.
The Python-Lambda library takes away the guesswork of developing your Python-Lambda services by providing you with a toolset to streamline the annoying parts.
Requirements
- Python 2.7, >= 3.6 (At the time of writing this, these are the Python runtimes supported by AWS Lambda).
- Pip (~8.1.1)
- Virtualenv (~15.0.0)
- Virtualenvwrapper (~4.7.1)
Getting Started
First, you must create an IAM Role on your AWS account called lambda_basic_execution with the LambdaBasicExecution policy attached.
On your computer, create a new virtualenv and project folder.
```shell
$ mkvirtualenv pylambda
(pylambda) $ mkdir pylambda
```
Next, download Python-Lambda using pip via pypi.
```shell
(pylambda) $ pip install python-lambda
```
From your pylambda directory, run the following to bootstrap your project.
```shell
(pylambda) $ lambda init
```
This will create the following files: event.json, __init__.py, service.py, and config.yaml.
Let's begin by opening config.yaml in the text editor of your choice. For the purpose of this tutorial, the only required information is aws_access_key_id and aws_secret_access_key. You can find these by logging into the AWS management console.
Next let's open service.py; in here you'll find the following function:
```python
def handler(event, context):
    # Your code goes here!
    e = event.get('e')
    pi = event.get('pi')
    return e + pi
```
This is the handler function; this is the function AWS Lambda will invoke in response to an event. You will notice that in the sample code e and pi are values in a dict. AWS Lambda uses the event parameter to pass in event data to the handler.
So if, for example, your function is responding to an HTTP request, event will be the POST JSON data, and if your function returns something, the contents will be in your HTTP response payload.
Next let's open the event.json file:
```json
{
  "pi": 3.14,
  "e": 2.718
}
```
Here you'll find the values of e and pi that are being referenced in the sample code.
If you now try and run:
```shell
(pylambda) $ lambda invoke -v
```
You will get:
```
# 5.858
# execution time: 0.00000310s
# function execution timeout: 15s
```
As you probably put together, the lambda invoke command grabs the values stored in the event.json file and passes them to your function.
The event.json file should help you develop your Lambda service locally. You can specify an alternate event.json file by passing the --event-file=<filename>.json argument to lambda invoke.
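What `lambda invoke` does locally can be sketched in a few lines: parse the event JSON and hand it to the handler. The snippet below is illustrative only (it inlines the event data and mirrors the sample handler from service.py; it is not the toolkit's internals):

```python
import json

# Mirrors the sample handler from the generated service.py.
def handler(event, context):
    e = event.get('e')
    pi = event.get('pi')
    return e + pi

# Roughly what `lambda invoke` does: parse the event file's JSON
# and pass it to the handler. The string below stands in for the
# contents of event.json.
event = json.loads('{"pi": 3.14, "e": 2.718}')
result = handler(event, context=None)
print(result)  # 5.858
```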
When you're ready to deploy your code to Lambda simply run:
```shell
(pylambda) $ lambda deploy
```
The deploy script will evaluate your virtualenv and identify your project dependencies. It will package these up along with your handler function into a zip file that it then uploads to AWS Lambda.
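The bundling idea can be pictured as zipping your dependencies and handler together at the archive root, since Lambda imports straight from the top of the zip. This sketch illustrates that idea only; the paths and function name are assumptions, not the toolkit's actual layout:

```python
import os
import zipfile

def build_bundle(project_dir, site_packages, out_path):
    """Zip dependencies and the handler module together, roughly what
    a deploy step does before uploading the archive to AWS Lambda.
    (Illustrative sketch, not python-lambda's implementation.)"""
    with zipfile.ZipFile(out_path, "w", zipfile.ZIP_DEFLATED) as zf:
        # Dependencies go at the archive root so Lambda can import them.
        for root, _, files in os.walk(site_packages):
            for name in files:
                full = os.path.join(root, name)
                zf.write(full, os.path.relpath(full, site_packages))
        # The handler module itself sits alongside them.
        zf.write(os.path.join(project_dir, "service.py"), "service.py")
    return out_path
```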
You can now log into the AWS Lambda management console to verify the code deployed successfully.
Wiring to an API endpoint
If you're looking to develop a simple microservice you can easily wire your function up to an http endpoint.
Begin by navigating to your AWS Lambda management console and clicking on your function. Click the API Endpoints tab and click "Add API endpoint".
Under API endpoint type select "API Gateway".
Next change Method to POST and Security to "Open" and click submit (NOTE: you should secure this for use in production; open security is used for demo purposes).
Finally, you need to change the return value of the function to comply with the format defined for the API Gateway endpoint; the function should now look like this:
```python
def handler(event, context):
    # Your code goes here!
    e = event.get('e')
    pi = event.get('pi')
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": e + pi
    }
```
Now try and run:
```shell
$ curl --header "Content-Type:application/json" \
    --request POST \
    --data '{"pi": 3.14, "e": 2.718}' \
    https://<API endpoint URL>

# 5.8580000000000005
```
Environment Variables
Lambda functions support environment variables. In order to set environment variables for your deployed code to use, you can configure them in config.yaml. To load the value for an environment variable at the time of deployment (instead of hard coding it in your configuration file), you can use local environment values (see 'env3' in the example below).
```yaml
environment_variables:
  env1: foo
  env2: baz
  env3: ${LOCAL_ENVIRONMENT_VARIABLE_NAME}
```
This would create environment variables in the lambda instance upon deploy. If your functions don't need environment variables, simply leave this section out of your config.
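The `${...}` substitution above can be pictured as a small lookup against the local environment at deploy time. This is an illustrative sketch of that behaviour, not the toolkit's actual resolver:

```python
import os
import re

def resolve_env_vars(variables):
    """Expand ${NAME} references using the local environment,
    leaving plain values untouched. (Illustrative only.)"""
    pattern = re.compile(r"^\$\{(\w+)\}$")
    resolved = {}
    for key, value in variables.items():
        match = pattern.match(str(value))
        resolved[key] = os.environ[match.group(1)] if match else value
    return resolved

# Mirrors the config.yaml section above.
os.environ["LOCAL_ENVIRONMENT_VARIABLE_NAME"] = "resolved-at-deploy"
env = resolve_env_vars({
    "env1": "foo",
    "env2": "baz",
    "env3": "${LOCAL_ENVIRONMENT_VARIABLE_NAME}",
})
print(env)
```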
Uploading to S3
You may find that you do not need the toolkit to fully deploy your Lambda or that your code bundle is too large to upload via the API. You can use the upload command to send the bundle to an S3 bucket of your choosing. Before doing this, you will need to set the following variables in config.yaml:
```yaml
role: basic_s3_upload
bucket_name: 'example-bucket'
s3_key_prefix: 'path/to/file/'
```
Your role must have s3:PutObject permission on the bucket/key that you specify for the upload to work properly. Once you have that set, you can execute lambda upload to initiate the transfer.
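Conceptually, the upload step joins the bucket, key prefix, and bundle name into an object key and performs a PutObject. A hedged sketch using boto3 (the function name and the injectable `s3` argument are assumptions for illustration, not python-lambda's implementation):

```python
def upload_bundle(bundle_path, bucket_name, s3_key_prefix, s3=None):
    """Send a deployment zip to S3, roughly what `lambda upload` does
    with the config values above; the role needs s3:PutObject on the
    resulting key. (Illustrative sketch; the `s3` argument exists so
    the call can be exercised without AWS credentials.)"""
    if s3 is None:
        import boto3  # the toolkit already depends on the AWS SDK
        s3 = boto3.client("s3")
    key = s3_key_prefix + bundle_path.rsplit("/", 1)[-1]
    s3.upload_file(bundle_path, bucket_name, key)
    return key
```

With `bucket_name: 'example-bucket'` and `s3_key_prefix: 'path/to/file/'`, a bundle named `lambda_bundle.zip` would land at the key `path/to/file/lambda_bundle.zip`.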
Deploying via S3
You can also choose to use S3 as your source for Lambda deployments. This can be done by issuing lambda deploy-s3 with the same variables/AWS permissions you'd set for executing the upload command.
Development
Development of "python-lambda" is facilitated exclusively on GitHub. Contributions in the form of patches, tests and feature creation and/or requests are very welcome and highly encouraged. Please open an issue if this tool does not function as you'd expect.
Environment Setup
- Install pipenv
- Install direnv
- Install pre-commit (optional but preferred)
- cd into the project and enter "direnv allow" when prompted. This will begin installing all the development dependencies.
- If you installed pre-commit, run pre-commit install inside the project directory to set up the git hooks.
Releasing to PyPI
Once you've pushed your changes to master, run one of the following:
```shell
# If you're cutting a major release:
make deploy-major

# If you're cutting a minor release:
make deploy-minor

# If you're cutting a patch release:
make deploy-patch
```