pub async fn upload_user_data(
    tracking_label: &str,
    config: &CoreConfig,
    db_pool: &Pool<PostgresConnectionManager<MakeTlsConnector>>,
    kafka_pool: &KafkaPublisher,
    headers: &HeaderMap<HeaderValue>,
    body: Body
) -> Result<Response<Body>, Infallible>

upload_user_data

Handles uploading a POST-ed file to s3 and creating a users_data record to track s3 utilization.

Usage

Environment variables

Change the s3 bucket for file uploads

export S3_DATA_BUCKET=BUCKET_NAME

Change the s3 bucket prefix path for file uploads

export S3_DATA_PREFIX="user/data/file"
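
A minimal sketch of resolving these variables before starting the server; the fallback strings below are placeholders, not this crate's actual defaults.

```rust
use std::env;

fn main() {
    // Resolve the upload bucket and key prefix from the environment.
    // The fallback strings are placeholders, not this crate's real defaults.
    let bucket = env::var("S3_DATA_BUCKET").unwrap_or_else(|_| "BUCKET_NAME".to_string());
    let prefix = env::var("S3_DATA_PREFIX").unwrap_or_else(|_| "user/data/file".to_string());
    println!("uploads will be stored under s3://{}/{}", bucket, prefix);
}
```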

The file contents must be passed in the data field of the ApiReqUserUploadData type, which is serialized within a POST-ed hyper Request’s Body.
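
A hedged caller-side sketch of building that POST, assuming ApiReqUserUploadData serializes to JSON; only the data field is documented above, so the stand-in struct, the URI, and the route path are illustrative assumptions.

```rust
use hyper::{Body, Client, Method, Request};
use serde::Serialize;

// Illustrative stand-in for ApiReqUserUploadData; only the `data` field is
// documented above, so any additional fields are omitted here.
#[derive(Serialize)]
struct UploadReq {
    data: String,
}

#[tokio::main]
async fn main() -> Result<(), Box<dyn std::error::Error>> {
    let payload = UploadReq {
        // the file contents go in the `data` field
        data: std::fs::read_to_string("report.csv")?,
    };
    // the URI and route path are placeholders for whatever the server exposes
    let req = Request::builder()
        .method(Method::POST)
        .uri("http://127.0.0.1:3000/user/data")
        .header("content-type", "application/json")
        .body(Body::from(serde_json::to_vec(&payload)?))?;
    let resp = Client::new().request(req).await?;
    println!("upload response status: {}", resp.status());
    Ok(())
}
```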

Overview Notes

This function creates only one users_data record at a time.

It also uploads the data (file contents) with a user-and-date pathing convention.
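
The exact key layout is not spelled out here, so the sketch below is only a guess at one user-and-date convention built from S3_DATA_PREFIX; the user id and filename segments are assumptions.

```rust
use chrono::Utc;

fn main() {
    // Hypothetical user-and-date key layout; the real convention may differ.
    let prefix = std::env::var("S3_DATA_PREFIX").unwrap_or_else(|_| "user/data/file".to_string());
    let user_id = 123;
    let key = format!("{}/{}/{}/report.csv", prefix, user_id, Utc::now().format("%Y-%m-%d"));
    // e.g. user/data/file/123/2024-05-01/report.csv
    println!("s3 key: {}", key);
}
```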

Arguments

  • tracking_label - &str - caller logging label
  • config - CoreConfig
  • db_pool - Pool - postgres client db connection pool with required tls encryption
  • kafka_pool - KafkaPublisher for asynchronously publishing messages to the connected kafka cluster
  • headers - HeaderMap - hashmap containing the request headers in key-value pairs
  • body - hyper::Body - the hyper Request’s Body containing the file’s contents to store on s3. The contents must be in the POST-ed Body’s data key (see the routing sketch after this list).
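
A rough sketch of how a hyper request router might forward these arguments, assuming the config and pools were built at startup; the crate-level use paths, the tracking label, and the /user/data route are assumptions.

```rust
// Sketch only: `CoreConfig`, `Pool`, `PostgresConnectionManager`,
// `MakeTlsConnector`, and `KafkaPublisher` come from this crate and its
// dependencies; their `use` paths are omitted here.
async fn route(
    req: hyper::Request<hyper::Body>,
    config: &CoreConfig,
    db_pool: &Pool<PostgresConnectionManager<MakeTlsConnector>>,
    kafka_pool: &KafkaPublisher,
) -> Result<hyper::Response<hyper::Body>, std::convert::Infallible> {
    let (parts, body) = req.into_parts();
    match (&parts.method, parts.uri.path()) {
        // assumed route; dispatch the POST-ed body to upload_user_data
        (&hyper::Method::POST, "/user/data") => {
            upload_user_data("upload-user-data", config, db_pool, kafka_pool, &parts.headers, body).await
        }
        _ => Ok(hyper::Response::builder()
            .status(hyper::StatusCode::NOT_FOUND)
            .body(hyper::Body::empty())
            .unwrap()),
    }
}
```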

Returns

upload_user_data on Success Returns

The newly-uploaded users_data record in the db (ApiResUserUploadData)

hyper Response containing a json-serialized ApiResUserUploadData dictionary within the Body and a 200 HTTP status code

Ok(Response)
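
Assuming the record comes back as plain JSON in the response Body, a caller-side sketch of reading the 200 response; since ApiResUserUploadData’s fields are not listed here, the sketch stops at an untyped serde_json::Value.

```rust
use hyper::{body, Body, Response};

// Read and deserialize the JSON Body of a successful upload response.
async fn read_upload_response(resp: Response<Body>) -> Result<serde_json::Value, Box<dyn std::error::Error>> {
    let bytes = body::to_bytes(resp.into_body()).await?;
    let record: serde_json::Value = serde_json::from_slice(&bytes)?;
    Ok(record)
}
```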

Errors

upload_user_data on Failure Returns

All errors return as a hyper Response containing a json-serialized ApiResUserUploadData dictionary in the Body and a non-200 HTTP status code

Err(Response)
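
Because failures arrive in the same shape, a caller can branch on the status code before treating the payload as a new record; a brief sketch reusing the reader from the Returns section above.

```rust
// Non-200 responses carry the same JSON shape; branch on status first.
async fn handle_upload(resp: hyper::Response<hyper::Body>) -> Result<(), Box<dyn std::error::Error>> {
    let status = resp.status();
    let payload = read_upload_response(resp).await?; // from the sketch in Returns
    if status.is_success() {
        println!("created users_data record: {}", payload);
    } else {
        eprintln!("upload failed ({}): {}", status, payload);
    }
    Ok(())
}
```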