
Add Python examples and playbooks #426

Draft
welteki wants to merge 2 commits into openfaas:master from welteki:add-python-examples

Conversation


@welteki welteki commented Apr 3, 2026

Description

Add detailed, step-by-step examples to the Python language docs. Examples are added as individual commits.

Examples added so far:

  • boto3 S3 example — accessing AWS S3 from a Python function with secret-based credential management and client reuse (tested e2e)
  • Kafka producer example — publishing messages to a Kafka topic using confluent-kafka with SASL/SSL authentication, including a note on common SASL mechanisms (tested e2e)

More examples will be added in follow-up commits.

Motivation and Context

Make it easier for users to get started with common patterns by providing ready-to-use examples in the official documentation.

  • I have raised an issue to propose this change (required)

How Has This Been Tested?

Both examples have been tested end-to-end: functions were built, deployed to a live OpenFaaS cluster, and invoked to verify correct behaviour.

Types of changes

  • Bug fix (non-breaking change which fixes an issue)
  • New feature (non-breaking change which adds functionality)
  • Breaking change (fix or feature that would cause existing functionality to change)

Checklist:

  • My code follows the code style of this project.
  • My change requires a change to the documentation.
  • I have updated the documentation accordingly.
  • I've read the CONTRIBUTION guide
  • I have signed-off my commits with git commit -s

Detailed step-by-step example showing how to access AWS S3 from a Python
function using boto3. Covers template selection, dependency setup, secret
management for AWS credentials, stack configuration, handler implementation
with S3 client reuse, and deployment with faas-cli.

Signed-off-by: Han Verstraete (OpenFaaS Ltd) <han@openfaas.com>
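The handler pattern this commit describes can be sketched roughly as below. This is an illustrative sketch, not the code from the docs: the secret names, bucket name, and helper structure are assumptions, and boto3 is imported lazily inside the accessor so the module loads even where the library is not installed.

```python
import os

SECRET_DIR = "/var/openfaas/secrets"  # where OpenFaaS mounts function secrets

def read_secret(name, default=""):
    """Read an OpenFaaS secret file, falling back to an env var for local runs."""
    try:
        with open(os.path.join(SECRET_DIR, name)) as f:
            return f.read().strip()
    except FileNotFoundError:
        return os.environ.get(name.replace("-", "_").upper(), default)

_s3 = None

def get_s3_client():
    """Create the S3 client once and cache it so warm invocations reuse it."""
    global _s3
    if _s3 is None:
        import boto3  # imported on first use
        _s3 = boto3.client(
            "s3",
            aws_access_key_id=read_secret("s3-access-key-id"),
            aws_secret_access_key=read_secret("s3-secret-access-key"),
        )
    return _s3

def handle(event, context):
    # List the objects in a bucket; the bucket name is a placeholder
    resp = get_s3_client().list_objects_v2(Bucket="example-bucket")
    keys = [obj["Key"] for obj in resp.get("Contents", [])]
    return {"statusCode": 200, "body": {"keys": keys}}
```

Creating the client at first use rather than per request avoids repeating credential lookup and TLS setup on every invocation.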

Add a step-by-step example showing how to produce messages to a Kafka
topic from a Python function using the confluent-kafka package with
SASL/SSL authentication. Covers template selection, dependency setup,
secrets configuration, handler implementation with producer reuse,
and deployment.

Signed-off-by: Han Verstraete (OpenFaaS Ltd) <han@openfaas.com>
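The producer-reuse pattern this commit describes can be sketched roughly as below. Again an illustrative sketch under assumptions: the secret names, topic, and broker address are placeholders, and confluent-kafka is imported lazily so the module loads without the package installed.

```python
import os

SECRET_DIR = "/var/openfaas/secrets"  # where OpenFaaS mounts function secrets

def read_secret(name, default=""):
    """Read an OpenFaaS secret file, falling back to an env var for local runs."""
    try:
        with open(os.path.join(SECRET_DIR, name)) as f:
            return f.read().strip()
    except FileNotFoundError:
        return os.environ.get(name.replace("-", "_").upper(), default)

_producer = None

def get_producer():
    """Create the Kafka producer once and reuse it across invocations."""
    global _producer
    if _producer is None:
        from confluent_kafka import Producer  # imported on first use
        _producer = Producer({
            "bootstrap.servers": read_secret("kafka-brokers", "localhost:9092"),
            "security.protocol": "SASL_SSL",
            "sasl.mechanism": "PLAIN",  # SCRAM-SHA-256/512 are also common
            "sasl.username": read_secret("kafka-username"),
            "sasl.password": read_secret("kafka-password"),
        })
    return _producer

def handle(event, context):
    producer = get_producer()
    producer.produce("example-topic", value=event.body)
    producer.flush()  # wait for delivery before responding
    return {"statusCode": 202, "body": "message queued"}
```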

reviewfn bot commented Apr 3, 2026

AI Pull Request Overview

Summary

  • Adds detailed examples for Python functions in OpenFaaS documentation
  • Includes AWS S3 integration example using boto3 with secret-based credentials
  • Provides Kafka producer example with SASL/SSL authentication
  • Examples demonstrate best practices for credential management and client reuse
  • Step-by-step guides for setup, configuration, and deployment
  • Uses Debian-based template for native dependencies
  • Includes build and invocation commands

Approval rating (1-10)

8/10 - Well-structured examples with good practices, but lacks error handling in code samples.

Summary per file

  • docs/languages/python.md: Added two comprehensive examples: AWS S3 access with boto3 and Kafka message publishing with confluent-kafka.

Overall Assessment

The PR successfully adds valuable, practical examples to the Python documentation that will help users implement common integrations. The examples follow security best practices by using secrets for credentials and demonstrate efficient patterns like client reuse. However, the code samples could benefit from basic error handling to make them more robust for production use, though this is acceptable for documentation examples.

Detailed Review


docs/languages/python.md

AWS S3 Example:

  • Credential management via secrets is properly implemented and secure.
  • Client initialization in global scope for reuse across invocations is efficient for serverless functions.
  • Missing error handling for AWS API calls (e.g., list_objects_v2 and put_object) could lead to unhandled exceptions in production. Consider adding try/except blocks or checking the response status.
  • The POST handler assumes the 'key' query parameter exists or defaults to 'upload.txt', which is reasonable but could be more explicit.
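The error-handling gap the review points out could be closed with a small wrapper. This is a hedged sketch, not code from the PR: `safe_s3_call` is a made-up helper name, and in practice the caught exception would be botocore's ClientError rather than the broad Exception used here to keep the sketch self-contained.

```python
def safe_s3_call(fn, *args, **kwargs):
    """Run an S3 client call and return (result, error) instead of raising.
    With boto3 the relevant exception is botocore.exceptions.ClientError."""
    try:
        return fn(*args, **kwargs), None
    except Exception as exc:
        return None, str(exc)

# Hypothetical usage inside the handler (s3 and bucket assumed to exist):
#   resp, err = safe_s3_call(s3.list_objects_v2, Bucket=bucket)
#   if err:
#       return {"statusCode": 502, "body": f"S3 error: {err}"}
```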

Kafka Producer Example:

  • SASL_SSL configuration with PLAIN mechanism is correctly set up for common Kafka brokers.
  • Producer reuse pattern matches the S3 example, maintaining consistency.
  • producer.flush() is called after produce(), ensuring message delivery, but this is synchronous and could block the function response. For better performance, consider asynchronous handling in production code.
  • No error handling for producer initialization or message publishing; as with the S3 example, adding basic error handling would improve robustness.
  • The read_secret helper function is well-implemented and reusable.

General Observations:

  • Both examples use the python3-http-debian template appropriately for native dependencies.
  • Build and deployment instructions are clear and include best practices like using --tag digest.
  • Code formatting and structure are consistent with existing documentation.
  • Examples are self-contained with all necessary configuration steps.

AI agent details.

Agent processing time: 26.805s
Environment preparation time: 4.288s
Total time from webhook: 34.817s
