merge files from the blockchain infra repo (#59)

autistic-symposium-helper authored 2024-11-17 17:03:20 -08:00, committed via GitHub
parent 23f56ef195
commit 2a6449bb85
346 changed files with 29097 additions and 132 deletions

@@ -0,0 +1,104 @@
# Byte-compiled / optimized / DLL files
__pycache__/
*.py[cod]
*$py.class
# C extensions
*.so
# Distribution / packaging
.Python
build/
develop-eggs/
dist/
downloads/
eggs/
.eggs/
lib/
lib64/
parts/
sdist/
var/
wheels/
*.egg-info/
.installed.cfg
*.egg
MANIFEST
# PyInstaller
# Usually these files are written by a python script from a template
# before PyInstaller builds the exe, so as to inject date/other infos into it.
*.manifest
*.spec
# Installer logs
pip-log.txt
pip-delete-this-directory.txt
# Unit test / coverage reports
htmlcov/
.tox/
.coverage
.coverage.*
.cache
nosetests.xml
coverage.xml
*.cover
.hypothesis/
.pytest_cache/
# Translations
*.mo
*.pot
# Django stuff:
*.log
local_settings.py
db.sqlite3
# Flask stuff:
instance/
.webassets-cache
# Scrapy stuff:
.scrapy
# Sphinx documentation
docs/_build/
# PyBuilder
target/
# Jupyter Notebook
.ipynb_checkpoints
# pyenv
.python-version
# celery beat schedule file
celerybeat-schedule
# SageMath parsed files
*.sage.py
# Environments
.env
.venv
env/
venv/
ENV/
env.bak/
venv.bak/
# Spyder project settings
.spyderproject
.spyproject
# Rope project settings
.ropeproject
# mkdocs documentation
/site
# mypy
.mypy_cache/

@@ -0,0 +1,40 @@
install:
	@python setup.py install && pip install -r requirements.txt

build:
	@/bin/bash ./scripts/build_package.sh

clean:
	@rm -rf /tmp/*.mp4 .coverage .tox build dist lib/*.pyc *.egg-info *pyc __pycache__/ ffmpeg* .pytest_cache /tmp/*mp4 /tmp/*jpg

doctoc:
	@doctoc README.md

event:
	@PYTHONPATH=$(shell pwd) ./scripts/create_test_event.py

invoke:
	@PYTHONPATH=$(shell pwd) lambda invoke -v

lint:
	@pep8 --exclude=build,venv,dist . && echo pep8: no linting errors

fixlint:
	@autopep8 --in-place *py lib/*py lib/handlers/*py lib/routes/*py tests/*py scripts/*py

test:
	@PYTHONPATH=$(shell pwd) py.test -v --color=yes --ignore=venv/

deploy:
	@/bin/bash scripts/deploy_lambda.sh sandbox

sbox:
	@/bin/cp .env.sample_sandbox .env

stag:
	@/bin/cp .env.sample_staging .env

prod:
	@/bin/cp .env.sample_prod .env

.PHONY: install clean doctoc lint invoke test build deploy event fixlint prod stag sbox

@@ -0,0 +1,289 @@
# AWS Lambda Function to Trim Videos with FFMPEG
An AWS Lambda function that trims videos served from an API endpoint, between two given NTP UTC timestamps.
The stack also uses SQS, SNS, and S3 resources.
----
# Table of Contents
<!-- START doctoc generated TOC please keep comment here to allow auto update -->
<!-- DON'T EDIT THIS SECTION, INSTEAD RE-RUN doctoc TO UPDATE -->
- [Introduction](#introduction)
- [Running Locally](#running-locally)
  - [Create a virtual environment](#create-a-virtual-environment)
  - [Configure the environment](#configure-the-environment)
    - [Changes when moving to another environment](#changes-when-moving-to-another-environment)
  - [Install the dependencies](#install-the-dependencies)
  - [Create Sample SQS events](#create-sample-sqs-events)
  - [Running the App locally](#running-the-app-locally)
- [AWS Deployment](#aws-deployment)
  - [Running the App as a Lambda Function](#running-the-app-as-a-lambda-function)
  - [Testing the flow in AWS](#testing-the-flow-in-aws)
  - [Debugging Errors](#debugging-errors)
- [Contributing](#contributing)
  - [Committing new code](#committing-new-code)
<!-- END doctoc generated TOC please keep comment here to allow auto update -->
----
# Introduction
This application performs the following steps:
1. Receive an SQS event requesting a clip for a given time interval. The `body` field is a Python-literal string rather than strict JSON (see the parsing sketch after this list). An example SQS event:
```json
{
  "Records": [
    {
      "body": "{'clipId': '1111111111111', 'retryTimestamps': [], 'cameraId': '1111111111111', 'startTimestampInMs': 1537119363000, 'endTimestampInMs': 1537119423000}",
      "receiptHandle": "MessageReceiptHandle",
      "md5OfBody": "7b270e59b47ff90a553787216d55d91d",
      "eventSourceARN": "arn:aws:sqs:us-west-1:123456789012:MyQueue",
      "eventSource": "aws:sqs",
      "awsRegion": "us-west-1",
      "messageId": "19dd0b57-b21e-4ac1-bd88-01bbb068cb78",
      "attributes": {
        "ApproximateFirstReceiveTimestamp": "1523232000001",
        "SenderId": "123456789012",
        "ApproximateReceiveCount": "1",
        "SentTimestamp": "1523232000000"
      },
      "messageAttributes": {
        "SentTimestamp": "1523232000000"
      }
    }
  ]
}
```
2. Call the camera API endpoint `/cameras/cameraID` to retrieve the camera alias for the given camera id.
3. Call the camera API endpoint `/cameras/recording/` to retrieve the list of cam rewind source files within the given time range, which returns a response like:
```json
[{
  "startDate": "2018-09-16T16:00:17.000Z",
  "endDate": "2018-09-16T16:10:17.000Z",
  "thumbLargeUrl": URL,
  "recordingUrl": URL,
  "thumbSmallUrl": URL,
  "alias": "test"
}]
```
4. Retrieve the cam rewind source files from the origin S3 bucket (downloading them to disk).
5. Use ffmpeg to trim and merge the source files into a single clip and to create several thumbnails (see the ffmpeg sketch after this list).
6. If the clips are available, store them in the destination S3 bucket.
7. If the clips are not available, send an SQS message back to the queue, similar to the initial one, with a visibility timeout.
8. Call the camera API endpoint `/cameras/clips` to update the information about the new clip, and send an SNS message with the resulting metadata. An example SNS message:
```json
{
  "clipId": "1111111111111",
  "cameraId": "1111111111111",
  "startTimestampInMs": 1534305591000,
  "endTimestampInMs": 1534305611000,
  "status": "CLIP_AVAILABLE",
  "bucket": "s3-test",
  "clip": {
    "url": URL,
    "key": "/test.mp4"
  },
  "thumbnail": {
    "url": "https://url_{size}.png",
    "key": "/1111111111111/1111111111111{size}.png",
    "sizes": [300, 640, 1500, 3000]
  }
}
```
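The `body` field in the SQS event of step 1 is a single-quoted Python literal, so `json.loads` would reject it. A minimal sketch of one way to parse it (the helper name is an assumption, not code from this repo):
```python
import ast

def parse_sqs_body(record):
    """Parse the Python-literal 'body' string of one SQS record into a dict."""
    # ast.literal_eval accepts the single-quoted dict string that json.loads
    # rejects, and it evaluates only literals (no arbitrary code).
    return ast.literal_eval(record['body'])

# e.g. parse_sqs_body(event['Records'][0])['clipId'] -> '1111111111111'
```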
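For step 5, the trim/merge/thumbnail work is done by `lib/ffmpeg_wrapper.py` (see its test near the end of this diff). A minimal sketch of the underlying ffmpeg invocations, with all file names, offsets, and helper names as illustrative assumptions:
```python
import subprocess

def trim(src, start, end, dst):
    """Cut the [start, end] window ('MM:SS.mmm' strings) out of one source file."""
    subprocess.check_output(['ffmpeg', '-y', '-i', src, '-ss', start,
                             '-to', end, '-c', 'copy', dst])

def merge(parts, dst, list_file='/tmp/parts.txt'):
    """Concatenate already-trimmed parts without re-encoding."""
    with open(list_file, 'w') as f:
        f.writelines("file '{}'\n".format(p) for p in parts)
    subprocess.check_output(['ffmpeg', '-y', '-f', 'concat', '-safe', '0',
                             '-i', list_file, '-c', 'copy', dst])

def thumbnail(src, width, dst):
    """Grab one frame, scaled to one of the THUMBNAIL_SIZES widths."""
    subprocess.check_output(['ffmpeg', '-y', '-i', src, '-vframes', '1',
                             '-vf', 'scale={}:-1'.format(width), dst])
```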
# Running Locally
To add new features to this application, follow these steps:
### Create a virtual environment
```bash
virtualenv venv
source venv/bin/activate
```
### Configure the environment
```bash
cp .env.sample_{env} .env
vim .env
```
These are the global variables defined in this file:

| Constant              | Definition                                                                           |
| :-------------------- | :----------------------------------------------------------------------------------- |
| CLIP_DOWNLOAD_DEST    | Where the clips are downloaded on disk                                               |
| TIMESTAMP_FORMAT      | The timestamp format parsed from the clip name strings                               |
| OLD_FILE_FORMAT       | False if the clips to be downloaded have seconds encoded in their names (new format) |
| SQS_RETRY_LIMIT       | The retry limit, in seconds, for CLIP_PENDING (default: 15 minutes)                  |
| OUT_OF_RANGE_LIMIT    | How far back in the past, in seconds, clips can be retrieved (default: 3 days)       |
| CAM_SERVICES_URL      | The URL where the camera service is available                                        |
| CLIP_URL              | The URL the clips are posted to, according to the environment                        |
| RECORDINGS_URL        | The URL the source recordings are retrieved from                                     |
| THUMBNAIL_SIZES       | List of sizes for which clip thumbnails are created                                  |
| VIDEO_MAX_LEN         | Maximum length allowed for a clip                                                    |
| S3_BUCKET_ORIGIN      | AWS S3 bucket where the rewinds are available                                        |
| S3_BUCKET_ORIGIN_DIR  | AWS S3 'folder' where the rewinds are available                                      |
| S3_BUCKET_DESTINATION | AWS S3 bucket the clips are uploaded to                                              |
| AWS_SNS_TOPIC         | AWS SNS topic ARN                                                                    |
| AWS_SQS_QUEUE         | AWS SQS queue ARN                                                                    |
| AWS_SQS_QUEUE_URL     | AWS SQS queue URL                                                                    |
| SQS_TIMEOUT           | AWS SQS visibility timeout, in seconds                                               |
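`service.py` reads its constants from `lib.config`, which is not shown in this diff; a minimal sketch of how such a module could load `.env`, assuming python-dotenv (pinned in requirements.txt) — the names and defaults below are illustrative assumptions, not the repo's actual code:
```python
import os

from dotenv import load_dotenv

# Read key=value pairs from .env into the process environment.
load_dotenv()

# Illustrative defaults; the real module may differ.
CLIP_DOWNLOAD_DEST = os.environ.get('CLIP_DOWNLOAD_DEST', '/tmp')
THUMBNAIL_SIZES = [int(s) for s in
                   os.environ.get('THUMBNAIL_SIZES', '300,640,1500,3000').split(',')]
SQS_RETRY_LIMIT = int(os.environ.get('SQS_RETRY_LIMIT', 15 * 60))  # seconds
LOG_LEVEL = os.environ.get('LOG_LEVEL', 'INFO')
```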
#### Changes when moving to another environment
Whenever you move between environments (prod, sandbox, or staging), change the following variables:
| Constant | Possible value |
| :---------------------- |:------------------------------------------------- |
| CLIP_URL | https://camclips.{ENV}.test.com |
| S3_BUCKET_DESTINATION | cameras-service-clips-cdn-{ENV} |
| AWS_SNS_TOPIC | arn:aws:sns:test_{ENV} |
| AWS_SQS_QUEUE | arn:aws:sqs:test-sqs-{ENV} |
| AWS_SQS_QUEUE_URL | https://sqs.test-sqs-{ENV} |
### Install the dependencies
```bash
make install
```
### Create Sample SQS events
To create an `event.json` file to be tested in this application, run:
```bash
make event
```
Note that this command runs `./scripts/create_test_event.py` assuming that the camera `test` is up. If it is down, add a valid camera to the global variables section in that script.
You can also create `event.json` files that exercise alternate flows:
* **Clip pending** (i.e., the requested clip is within 15 minutes of the SQS message timestamp but has not been created yet):
```bash
python scripts/create_test_event.py -p
```
* **Clip not available** (i.e., the requested clip is more than 15 minutes but less than 3 days before the SQS message timestamp):
```bash
python scripts/create_test_event.py -n
```
* **Clip out of range** (i.e., the requested clip is more than 3 days before the SQS message timestamp):
```bash
python scripts/create_test_event.py -o
```
### Running the App locally
```bash
make invoke
```
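`lambda invoke` comes from the python-lambda package pinned in requirements.txt; it reads `event.json` and calls the handler declared in `config.yaml` (`service.handler`). A rough local equivalent, assuming your `.env` is configured:
```python
import json

from service import handler  # re-exported from lib.routes.root

with open('event.json') as f:
    event = json.load(f)

# Lambda normally passes a context object; None is assumed to be enough
# for a local smoke test if the handler does not inspect it.
print(handler(event, None))
```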
-----
# AWS Deployment
### Running the App as a Lambda Function
This creates a `.zip` package and deploys it to the lambda function:
```bash
make deploy
```
Check whether the package has the expected content:
```bash
unzip -l dist/cameras-service-generate-clip.zip
```
Note that the build script adds the FFMPEG binaries manually, while the Python dependencies are built inside a Docker container (see `scripts/build_package.sh` and `Dockerfile.build` later in this diff).
### Testing the flow in AWS
You can test this application's flow in the sandbox and/or staging environments by following these steps:
1. In the [SQS dashboard](https://console.aws.amazon.com/sqs/home?region=us-west-1), select the SQS queue and click `Queue Actions -> Send a Message`.
2. Type the value for `body`, similar to a message created in `event.json`. For instance:
```
{'clipId': '111111111111','retryTimestamps': [],'cameraId': '111111111111','startTimestampInMs': 1538412898000,'endTimestampInMs': 1538413498000}
```
3. This should trigger the Lambda function, and you should see the clips and thumbnails in the environment's S3 bucket within about 20-40 seconds.
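You can also send the message programmatically instead of using the console; a minimal sketch with boto3 (the queue URL below is a placeholder — substitute your environment's `AWS_SQS_QUEUE_URL`):
```python
import boto3

sqs = boto3.client('sqs', region_name='us-west-1')

# Placeholder queue URL -- use the AWS_SQS_QUEUE_URL from your .env.
sqs.send_message(
    QueueUrl='https://sqs.us-west-1.amazonaws.com/123456789012/my-queue',
    MessageBody=("{'clipId': '111111111111', 'retryTimestamps': [], "
                 "'cameraId': '111111111111', "
                 "'startTimestampInMs': 1538412898000, "
                 "'endTimestampInMs': 1538413498000}"),
)
```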
### Debugging Errors
Errors will be logged in [CloudWatch](https://us-west-1.console.aws.amazon.com/cloudwatch/home?region=us-west-1#logs:). To make sense of logs in the CLI, you should install [saw](https://github.com/TylerBrock/saw).
For instance, to check error logs for staging in the last hour:
```bash
saw get /aws/lambda/clip-function --start -1h --filter error
```
----
# Contributing
### Committing new code
Run unit tests with:
```bash
make test
```
The deploy scripts (and the PR checks reported back to GitHub) ensure that the code follows style guidelines with:
```bash
make lint
```
To fix lint errors, use:
```bash
make fixlint
```
Update the documentation (README.md) with:
```bash
make doctoc
```

@@ -0,0 +1,4 @@
region: us-west-1
function_name: ffmpeg-trimmer
handler: service.handler
description: Lambda function for creating camera clips by two NTP UTC timestamps.

@@ -0,0 +1,66 @@
#!/usr/bin/env python2
#
# Create a clipId to be used in event.json
import requests
import subprocess
import json
import time


def put_request(url, data):
    """
    Send the request to create the id, returning
    the clipId string.
    """
    # Despite the helper's name, the clip id is created with a POST.
    r = requests.post(url, json=data)
    print('--------------------------------------------------------')
    print('Request to {}'.format(url))
    print('Data sent: {}'.format(data))
    print('Status code: {}'.format(r.status_code))
    if r.status_code == 200:
        print(r.json())
        return r.json()['clipId']
    else:
        return False


def create_timestamps():
    """
    Create a timestamp to send in the request.
    """
    now = int(time.time() * 1000)
    sent_ts = str(now)
    begin_ts = str(now - 600000)
    end_ts = str(now - 600000 + 180000)
    return sent_ts, begin_ts, end_ts


def create_data(cam_id, url, begin_ts, end_ts):
    """
    Create the data that need to be sent in the
    request.
    """
    data = {
        "cameraId": cam_id,
        "startTimestampInMs": begin_ts,
        "endTimestampInMs": end_ts
    }
    return data


def main(url, cam_id):
    sent_ts, begin_ts, end_ts = create_timestamps()
    data = create_data(cam_id, url, begin_ts, end_ts)
    clip_id = put_request(url, data)
    print('clipId to be added to event.json: {}'.format(clip_id))
    print('send ts, start, end: {0} {1} {2}'.format(
        sent_ts, begin_ts, end_ts))

@@ -0,0 +1 @@
saw get /aws/lambda/ffmpeg-clip --start -24h --filter error

@@ -0,0 +1,20 @@
{
  "Records": [
    {
      "attributes": {
        "ApproximateFirstReceiveTimestamp": "XXXXXXXXXXXXXXXXXXX",
        "ApproximateReceiveCount": "1",
        "SenderId": "XXXXXXXXXXXXXXXXXXX",
        "SentTimestamp": "1543318636000"
      },
      "awsRegion": "us-west-1",
      "body": "{'clipId': '5bc67ace8e9c352780437d2c','retryTimestamps': [],'cameraId': '582356e81ee905c72145623e','startTimestampInMs': '1543318156000','endTimestampInMs': '1543318636000'}",
      "eventSource": "aws:sqs",
      "eventSourceARN": "XXXXXXXXXXXXXXXXXXX",
      "md5OfBody": "XXXXXXXXXXXXXXXXXXX",
      "messageAttributes": {},
      "messageId": "XXXXXXXXXXXXXXXXXXX",
      "receiptHandle": "XXXXXXXXXXXXXXXXXXX"
    }
  ]
}

@@ -0,0 +1,31 @@
boto3==1.4.4
botocore==1.5.62
certifi==2023.7.22
chardet==3.0.4
click==6.6
docutils==0.12
futures==3.2.0
idna==2.7
jmespath==0.9.0
pyaml==15.8.2
python-dateutil==2.5.3
python-dotenv==0.9.1
python-lambda==3.2.2
PyYAML==5.4
requests==2.31.0
s3transfer==0.1.13
six==1.10.0
urllib3==1.26.5
autopep8==1.4
appdirs==1.4.3
packaging==16.8
pep8==1.7.0
py==1.11.0
pyparsing==2.2.0
pytest==3.0.7
virtualenv==15.0.3
mock==2.0.0
requests-mock==1.5.2
coverage==4.5.1

@@ -0,0 +1,4 @@
packages
lib
app
Dockerfile.build

@@ -0,0 +1,9 @@
FROM amazonlinux:1

WORKDIR /opt/app

ADD requirements.txt .

RUN \
    yum install -y python27-pip && \
    pip install --target=/opt/app -r requirements.txt

@@ -0,0 +1,46 @@
#!/usr/bin/env bash
# This script adds additional dependencies that are needed for the lambda function package.
set -x

PACKAGE_NAME=cameras-clip.zip

# If S3_BUCKET env var isn't set, default it
if [ -z "${S3_BUCKET}" ]; then
    S3_BUCKET=s3-test
fi

# Set dist env and create initial zip file
ORIGIN=$(pwd)
rm -rf dist && mkdir dist
lambda build --local-package . && mv dist/*.zip dist/$PACKAGE_NAME
cd dist/

## Fetch & add binary for FFMPEG
aws s3 cp "s3://${S3_BUCKET}/ffmpeg/ffmpeg-release-64bit-static.tar.xz" . && tar xf ffmpeg-release-64bit-static.tar.xz
zip -j -r9 $PACKAGE_NAME ffmpeg-*-64bit-static/ffmpeg
zip -j -r9 $PACKAGE_NAME ffmpeg-*-64bit-static/ffprobe

# Add this App's source code
cp -r ../lib .
zip -r9 $PACKAGE_NAME lib

# Add dependencies from pip
mkdir packages
cp ../scripts/Dockerfile.build Dockerfile
cp ../scripts/.dockerignore .dockerignore
cp ../requirements.txt .
docker build --tag pillow-build .
CTNHASH="$(docker create pillow-build)"
docker cp "${CTNHASH}":/opt/app/ .
cp -rf app/* packages/

# Package everything
cd packages
zip -ur9 ../$PACKAGE_NAME *
cd ..

# Clean up
#rm -rf ffmpeg-release-64bit-static.tar.xz ffmpeg-*-64bit-static/ packages/ lib/
docker rm ${CTNHASH}
cd $ORIGIN

@@ -0,0 +1,177 @@
#!/usr/bin/env python2
#
# For integration tests, different SQS events are needed.
# This script generates events for alternate flows.
# Global variables are defined in main().
import time
import json
import argparse
import calendar
import datetime


def time_to_epoch(timestamp, timestamp_format):
    """
    Given a timestamp string in seconds, return
    the epoch timestamp string, in milliseconds.
    """
    # NOTE: calendar.timegm treats the parsed time as UTC, so the
    # timestamps fed in here are assumed to already be UTC.
    date = time.strptime(str(timestamp), timestamp_format)
    return str(calendar.timegm(date)) + '000'


def generate_delta_time(delta, timestamp_format, now, days):
    """
    Given a clip duration delta, and how many days back
    from today, return a begin and end timestamp for the event.
    """
    end = now - datetime.timedelta(days=days, minutes=0)
    begin = now - datetime.timedelta(days=days, minutes=delta)
    return begin.strftime(timestamp_format), end.strftime(timestamp_format)


def get_current_local_time(timestamp):
    """
    Return the current time in a datetime object, a
    human-readable string, and an epoch time integer.
    """
    now = datetime.datetime.now()
    human_now = now.strftime(timestamp)
    epoch_now = time_to_epoch(human_now, timestamp)
    return now, human_now, epoch_now


def create_event(begin, end, event_file, cam_id, epoch_now):
    """
    Create an event.json SQS message file for
    tests with the new timestamps and save it to the
    destination in event_file.
    """
    data = {'Records': [
        {
            "md5OfBody": "XXXXXXXXXXXXXXXXXXX",
            "receiptHandle": "XXXXXXXXXXXXXXXXXXX",
            "body": ("{'clipId': '1111111111111111',"
                     "'retryTimestamps': [],"
                     "'cameraId': '" + str(cam_id) + "',"
                     "'startTimestampInMs': '" + str(begin) + "',"
                     "'endTimestampInMs': '" + str(end) + "'}"),
            "eventSourceARN": "XXXXXXXXXXXXXXXXXXX",
            "eventSource": "aws:sqs",
            "awsRegion": "us-west-1",
            "messageId": "XXXXXXXXXXXXXXXXXXX",
            "attributes": {
                "ApproximateFirstReceiveTimestamp": "XXXXXXXXXXXXXXXXXXX",
                "SenderId": "XXXXXXXXXXXXXXXXXXX",
                "ApproximateReceiveCount": "1",
                "SentTimestamp": epoch_now
            },
            "messageAttributes": {}
        }
    ]}
    with open(event_file, 'w') as f:
        json.dump(data, f, separators=(',', ': '), sort_keys=True, indent=2)
    return data['Records'][0]['body']


def main():
    # Global variables.
    EVENT_FILE = 'event.json'
    TIMESTAMP_FORMAT = '%d-%m-%Y %H:%M:%S'
    DAYS_BEFORE_PENDING = 0
    DAYS_BEFORE_AVAILABLE = 0
    DAYS_BEFORE_NOT_AVAILABLE = 2
    DAYS_BEFORE_OUT_OF_RANGE = 8
    # Camera IDs used for tests; check whether they
    # are currently down or not. For instance:
    CAM_DOWN = '1111111111111111'
    CAM_UP = '1111111111111111'
    # This should not be more than 5 minutes (or the rewind clip generator
    # app won't accept the event).
    SESSION_DURATION_OK = 3
    SESSION_DURATION_CLIP_TOO_LONG = 8
    # Get the type of event to be generated.
    parser = argparse.ArgumentParser(
        description='Clip duration you are looking for (in mins):')
    parser.add_argument('-a', '--clip_available',
                        action='store_true', help='Event for <15 min')
    parser.add_argument('-p', '--clip_pending',
                        action='store_true', help='Event cam down <15 min')
    parser.add_argument('-o', '--clip_out_of_range',
                        action='store_true', help='Event for >3 days')
    parser.add_argument('-n', '--clip_not_available',
                        action='store_true', help='Event cam down >3 days')
    parser.add_argument('-t', '--clip_too_long',
                        action='store_true', help='Clips > 5 min')
    args = parser.parse_args()
    # Define what type of event we want.
    if args.clip_pending:
        days_before = DAYS_BEFORE_PENDING
        cam_id = CAM_DOWN
        session_duration = SESSION_DURATION_OK
    elif args.clip_out_of_range:
        days_before = DAYS_BEFORE_OUT_OF_RANGE
        cam_id = CAM_UP
        session_duration = SESSION_DURATION_OK
    elif args.clip_not_available:
        days_before = DAYS_BEFORE_NOT_AVAILABLE
        cam_id = CAM_DOWN
        session_duration = SESSION_DURATION_OK
    elif args.clip_too_long:
        days_before = DAYS_BEFORE_AVAILABLE
        cam_id = CAM_UP
        session_duration = SESSION_DURATION_CLIP_TOO_LONG
    else:
        # Defaults to CLIP_AVAILABLE event.
        days_before = DAYS_BEFORE_AVAILABLE
        cam_id = CAM_UP
        session_duration = SESSION_DURATION_OK
    # Get current time in human string and epoch int.
    now, human_now, epoch_now = get_current_local_time(TIMESTAMP_FORMAT)
    # Generate a begin and end time within the last days.
    begin, end = generate_delta_time(
        session_duration, TIMESTAMP_FORMAT, now, days_before)
    # Convert these times to epoch timestamp and human time.
    end_epoch = time_to_epoch(end, TIMESTAMP_FORMAT)
    begin_epoch = time_to_epoch(begin, TIMESTAMP_FORMAT)
    if begin_epoch and end_epoch:
        # Create the JSON file for the event.
        body = create_event(begin_epoch, end_epoch,
                            EVENT_FILE, cam_id, epoch_now)
        print('-----------------------------------------------------')
        print('Event test saved at {}'.format(EVENT_FILE))
        print('Camera id is {}'.format(cam_id))
        print('Timestamp for {0} days ago, delta time is {1} mins'.format(
            days_before, session_duration))
        print('Begin: {0} -> End: {1}'.format(begin_epoch, end_epoch))
        print('Begin: {0} -> End: {1}'.format(begin, end))
        print('Time: {}'.format(human_now))
        print('Body: ')
        print(body)
        print('-----------------------------------------------------')
    else:
        print('Could not create timestamps for {}'.format(session_duration))


if __name__ == '__main__':
    main()

@@ -0,0 +1,58 @@
#!/bin/bash -ex
# Script that deploys this app to the AWS lambda function, similarly to Jenkins.

USAGE=$(cat <<-END
Usage:
deploy_lambda.sh <environment>

Examples:
deploy_lambda.sh staging
END
)

if [[ "$1" = "-h" ]]; then
    echo "${USAGE}"
    exit
fi

if [[ -n "$1" ]]; then
    SERVER_GROUP=$1
else
    echo '[ERROR] You must specify the env: production, sandbox, staging'
    echo
    echo "${USAGE}"
    exit 1
fi

BUILD_ENVIRONMENT=$1
APP_NAME=cameras-service-generate-clip

export AWS_DEFAULT_REGION="us-west-1"
export AWS_REGION="us-west-1"

if [[ "${BUILD_ENVIRONMENT}" == "sandbox" ]]; then
    S3_BUCKET=sl-artifacts-dev
else
    S3_BUCKET="sl-artifacts-${BUILD_ENVIRONMENT}"
fi

S3_PREFIX="lambda-functions/${APP_NAME}"
S3_BUNDLE_KEY="sl-${APP_NAME}.zip"
# BUILD_TAG is expected in the environment (set by Jenkins).
S3_TAGGED_BUNDLE_KEY="sl-${APP_NAME}-${BUILD_TAG}.zip"

make clean
make install
make lint
make build

aws \
    s3 cp "dist/${S3_BUNDLE_KEY}" "s3://${S3_BUCKET}/${S3_PREFIX}/${S3_BUNDLE_KEY}"
aws \
    s3 cp "s3://${S3_BUCKET}/${S3_PREFIX}/${S3_BUNDLE_KEY}" "s3://${S3_BUCKET}/${S3_PREFIX}/${S3_TAGGED_BUNDLE_KEY}"
aws \
    lambda update-function-code \
    --function-name "sl-${APP_NAME}-${BUILD_ENVIRONMENT}" \
    --s3-bucket "${S3_BUCKET}" \
    --s3-key "${S3_PREFIX}/${S3_TAGGED_BUNDLE_KEY}"

echo "build description:${APP_NAME}|${BUILD_ENVIRONMENT}|${BUILD_TAG}|"

@@ -0,0 +1,3 @@
#!/usr/bin/env bash
# Quote the URL so the shell does not treat '&' as a background operator
# (date -v is BSD/macOS syntax for "one hour ago"); URL is a placeholder.
curl -i "URL?startDate=$(date -v '-1H' +%s)000&endDate=$(date +%s)000"

@@ -0,0 +1,17 @@
# -*- coding: utf-8 -*-
"""
Service handler module for AWS Lambda function. 'HANDLERS' constant dict is
used to map route requests to correct handler.
"""
import logging

from lib.config import LOG_LEVEL
from lib.routes import root

if LOG_LEVEL in ('CRITICAL', 'ERROR', 'WARNING', 'INFO', 'DEBUG', 'NOTSET'):
    level = logging.getLevelName(LOG_LEVEL)
else:
    level = logging.INFO

logging.basicConfig(level=level)

handler = root.handler

@@ -0,0 +1,7 @@
from distutils.core import setup

setup(
    name='rewind_clip_generator',
    version='1.0',
    packages=['lib', 'lib.routes', 'lib.handlers'],
)

@@ -0,0 +1 @@
# -*- coding: utf-8 -*-

@@ -0,0 +1,19 @@
{
  "clipId": "11111111111",
  "cameraId": "11111111111",
  "startTimestampInMs": 1534305591000,
  "endTimestampInMs": 1534305611000,
  "status": "CLIP_AVAILABLE",
  "bucket": "sl-cam-clip-archive-prod",
  "clip": {
    "url": "https://test.mp4",
    "key": "/583499c4e411dc743a5d5296/11111111111.mp4"
  },
  "thumbnail": {
    "url": "https://test_{size}.png",
    "key": "/11111111111/1111111111_{size}.png",
    "sizes": [300, 640, 1500, 3000]
  }
}

@@ -0,0 +1,24 @@
{
  "Records": [
    {
      "body": "{'clipId': '507f191e810c19729de860ea', 'retryTimestamps': [], 'cameraId': '583499c4e411dc743a5d5296', 'startTimestampInMs': 1537119363000, 'endTimestampInMs': 1537119423000}",
      "receiptHandle": "MessageReceiptHandle",
      "md5OfBody": "7b270e59b47ff90a553787216d55d91d",
      "eventSourceARN": "arn:aws:sqs:us-west-1:123456789012:MyQueue",
      "eventSource": "aws:sqs",
      "awsRegion": "us-west-1",
      "messageId": "19dd0b57-b21e-4ac1-bd88-01bbb068cb78",
      "attributes": {
        "ApproximateFirstReceiveTimestamp": "1523232000001",
        "SenderId": "123456789012",
        "ApproximateReceiveCount": "1",
        "SentTimestamp": "1523232000000"
      },
      "messageAttributes": {
        "SentTimestamp": "1523232000000"
      }
    }
  ]
}

@@ -0,0 +1,10 @@
[
  {
    "startDate": "2018-08-25T19:20:16.000Z",
    "endDate": "2018-08-25T19:30:16.000Z",
    "thumbLargeUrl": "https://test_full.jpg",
    "recordingUrl": "https://test.mp4",
    "thumbSmallUrl": "https://test_small.jpg",
    "alias": "test"
  }
]

@@ -0,0 +1,32 @@
# -*- coding: utf-8 -*-
""" Test Root service handler module for AWS Lambda function. """
import os
import json

import pytest

from lib.routes import root

fixtures_path = os.path.join(os.path.dirname(__file__), '..', 'fixtures')


@pytest.fixture
def sns_event_record():
    sns_event_record_path = os.path.join(fixtures_path, 'SNS_contract.json')
    with open(sns_event_record_path, 'r') as sns_event_record_json:
        return json.load(sns_event_record_json)


@pytest.fixture
def context():
    return {}


class TestHandler():

    def test_type_error_for_bad_params(self, context):
        try:
            root.handler('', context)
        except TypeError:
            pass
        else:
            # Plain (non-unittest) class, so use pytest.fail rather than
            # the original self.fail, which does not exist here.
            pytest.fail('Expected TypeError was not raised')

@@ -0,0 +1,32 @@
# -*- coding: utf-8 -*-
""" AWS Wrapper Test Module """
import unittest

import mock

import lib.aws_wrapper


class TestAwsWrapper(unittest.TestCase):

    def setUp(self):
        self.filename = 'filename_test'
        self.destination = 'destination_test'
        self.clip_metadata = {'test': 'test'}
        self.aw = lib.aws_wrapper.AwsWrapper()

    @mock.patch('lib.aws_wrapper.boto3')
    def test_download_clip_boto(self, boto3):
        self.aw.download_video(self.filename, self.destination)
        boto3.resource.assert_called_with('s3')

    @mock.patch('lib.aws_wrapper.boto3')
    def test_upload_clip_boto(self, boto3):
        self.aw.upload_asset(self.filename, self.destination)
        boto3.client.assert_called_with('s3')

    @mock.patch('lib.aws_wrapper.boto3')
    def test_send_sns_msg_boto(self, boto3):
        aw = lib.aws_wrapper.AwsWrapper()
        aw.send_sns_msg(self.clip_metadata)
        boto3.client.assert_called_with('sns')

@@ -0,0 +1,52 @@
# -*- coding: utf-8 -*-
""" Cam Wrapper Test Module """
import unittest

import mock
import pytest

import lib.cam_wrapper
import lib.utils


class TestCamWrapper(unittest.TestCase):

    def setUp(self):
        self.session_start_ms = '1535223360000'
        self.session_end_ms = '1535224400000'
        self.cameraId = '1111111111111111'
        self.clipId = '1111111111111111'
        self.metadata_test_clip_key = '/{0}/{1}.mp4'.format(
            self.cameraId, self.clipId)
        self.metadata_test_tb_key = '/{0}/{1}'.format(
            self.cameraId, self.clipId) + '_{size}.jpg'
        self.cw = lib.cam_wrapper.CamWrapper(
            self.session_start_ms, self.session_end_ms,
            self.cameraId, self.clipId)

    @mock.patch('lib.utils.get_request')
    def test_get_alias(self, mocked_method):
        self.cw.get_alias()
        self.assertTrue(mocked_method.called)

    def test_metadata(self):
        self.assertEqual(
            self.cw.metadata['clip']['key'], self.metadata_test_clip_key)
        self.assertEqual(
            self.cw.metadata['thumbnail']['key'], self.metadata_test_tb_key)

    @mock.patch('lib.utils.get_request')
    def test_get_clip_names(self, mocked_method):
        self.cw.get_clip_names()
        self.assertTrue(mocked_method.called)

    @mock.patch('lib.utils.put_request')
    def test_put_clip_metadata(self, mocked_method):
        self.cw.put_clip_metadata()
        self.assertTrue(mocked_method.called)

    def test_update_clip_status(self):
        test_status = 'test'
        self.cw.update_clip_status(test_status)
        self.assertEqual(self.cw.metadata['status'], test_status)

@@ -0,0 +1,30 @@
# -*- coding: utf-8 -*-
""" Ffmpeg Wrapper Test Module """
import unittest

import lib.ffmpeg_wrapper


class TestFfmpegWrapper(unittest.TestCase):

    def setUp(self):
        self.epoch_video = 1.535884819e+12
        self.crop_start = '03:39.000'
        self.crop_end = '13:01.000'
        self.session_start_ms = '1535884600000'
        self.session_end_ms = '1535885600000'
        self.alias = 'test'
        self.clipId = '1111111111111111'
        self.clips = []
        self.fw = lib.ffmpeg_wrapper.FfmpegWrapper(
            self.alias, self.clips,
            self.session_start_ms,
            self.session_end_ms,
            self.clipId)

    def test_calculate_crop_time(self):
        crop_start, crop_end = self.fw.calculate_trim_time(self.epoch_video)
        print(crop_start, crop_end, self.crop_end, self.crop_start)
        self.assertEqual(crop_end, self.crop_end)
        self.assertEqual(crop_start, self.crop_start)

@@ -0,0 +1,80 @@
# -*- coding: utf-8 -*-
""" Utils Test Module """
import os
import json
import unittest

import mock
import pytest
import requests
import requests_mock

import lib.utils

fixtures_path = os.path.join(os.path.dirname(__file__), 'fixtures')


@pytest.fixture
def get_fixture(fixture_json):
    get_sqs_event = os.path.join(fixtures_path, fixture_json)
    with open(get_sqs_event, 'r') as f:
        return json.load(f)


class TestClipGeneratorTrigger(unittest.TestCase):

    def setUp(self):
        self.domain = 'http://test.com'
        self.endpoint = 'filetest.mp4'
        self.file_url = 'http://test.com/filetest.mp4'
        self.clipname = 'camtest.20180815T140019.mp4'
        self.epoch_in_ms = 1535224400000
        self.timestamp = '20180825T191320'
        self.timestamp_format = '%Y%m%dT%H%M%S'
        self.msecs = 1807
        self.resp = {'test1': 'test2'}

    def test_url_join(self):
        self.assertEqual('http://test.com/filetest.mp4',
                         lib.utils.url_join(self.domain,
                                            self.endpoint), msg=None)

    def test_get_request(self):
        with requests_mock.Mocker() as m:
            m.get(self.file_url, json=self.resp)
            self.assertTrue(lib.utils.get_request(self.domain, self.endpoint))

    def test_get_basename_str(self):
        self.assertEqual('filetest.mp4', lib.utils.get_basename_str(
            self.file_url), msg=None)

    def test_get_timestamp_str(self):
        self.assertEqual('20180815T140019000',
                         lib.utils.get_timestamp_str(self.clipname), msg=None)

    def test_get_location_str(self):
        # Expect the camera-location prefix of the anonymized clip name.
        self.assertEqual('camtest', lib.utils.get_location_str(
            self.clipname), msg=None)

    def test_timestamp_to_epoch(self):
        self.assertEqual(self.epoch_in_ms, lib.utils.timestamp_to_epoch(
            self.timestamp, self.timestamp_format), msg=None)

    def test_epoch_to_timestamp(self):
        self.assertEqual(self.timestamp, lib.utils.epoch_to_timestamp(
            self.epoch_in_ms, self.timestamp_format), msg=None)

    def test_humanize_delta_time(self):
        self.assertEqual(
            '00:01.807', lib.utils.humanize_delta_time(self.msecs), msg=None)

    @mock.patch('lib.utils.os.remove')
    def test_remove_file(self, mocked_remove):
        lib.utils.remove_file(self.clipname)
        self.assertTrue(mocked_remove.called)

    @mock.patch('lib.utils.subprocess.check_output')
    def test_run_subprocess(self, mocked_subprocess):
        lib.utils.run_subprocess(['ls'], 'ok', 'err')
        self.assertTrue(mocked_subprocess.called)