Amazon S3

Overview

Manage AWS S3 data stores with Appian! Users can access their S3 objects directly from an Appian interface. Appian documents can be uploaded to an S3 bucket with Server-Side Encryption and configured as Public or Private.
 
The AWS S3 Connected System Plug-in uses the AWS Java SDK to connect to S3.
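
The plug-in's internal calls are not shown here, but the encrypted upload described above maps onto a standard request in the AWS SDK for Java (v1). The sketch below is illustrative only: the bucket name, object key, and file path are placeholders, and it assumes SSE-S3 (AES-256) managed keys.

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import com.amazonaws.services.s3.model.ObjectMetadata;
    import com.amazonaws.services.s3.model.PutObjectRequest;

    import java.io.File;

    public class SseUploadSketch {
        public static void main(String[] args) {
            AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

            // Ask S3 to encrypt the object at rest with SSE-S3 (AES-256) managed keys.
            ObjectMetadata metadata = new ObjectMetadata();
            metadata.setSSEAlgorithm(ObjectMetadata.AES_256_SERVER_SIDE_ENCRYPTION);

            // Placeholder bucket, key, and local file path.
            PutObjectRequest request = new PutObjectRequest(
                    "example-bucket", "documents/example.pdf", new File("/tmp/example.pdf"))
                    .withMetadata(metadata);

            s3.putObject(request);
        }
    }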

Key Features & Functionality

  • Create Bucket -- Adds a new bucket to the S3 instance. A bucket is required before objects (files) can be stored.
  • Upload File -- Uploads a file to a specified bucket on the S3 instance.
  • Upload Multiple Files -- Uploads multiple files to a specified bucket on the S3 instance.
  • List Buckets -- Returns all available buckets on the S3 instance.
  • List Objects -- Returns all available objects from a specified bucket.
  • Delete Bucket -- Permanently removes a bucket from the S3 instance.
  • Delete Object -- Permanently removes an object from a specified bucket.
  • Cognito Authentication -- Retrieves credentials from an AWS Cognito Identity Pool.
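
For orientation, these operations correspond to standard calls in the AWS SDK for Java (v1) that the plug-in builds on. The sketch below is not the plug-in's code; the bucket and object names are placeholders.

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import com.amazonaws.services.s3.model.Bucket;
    import com.amazonaws.services.s3.model.S3ObjectSummary;

    public class S3OperationsSketch {
        public static void main(String[] args) {
            AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

            // Create Bucket: a bucket must exist before objects can be stored in it.
            s3.createBucket("example-bucket");

            // List Buckets: returns every bucket visible to the credentials in use.
            for (Bucket bucket : s3.listBuckets()) {
                System.out.println(bucket.getName());
            }

            // List Objects: returns the objects within one specified bucket.
            for (S3ObjectSummary summary : s3.listObjects("example-bucket").getObjectSummaries()) {
                System.out.println(summary.getKey());
            }

            // Delete Object, then Delete Bucket (a bucket must be empty before it can be deleted).
            s3.deleteObject("example-bucket", "documents/example.pdf");
            s3.deleteBucket("example-bucket");
        }
    }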

Requirements

Access Key ID: the access key ID for connecting to S3 can be retrieved from the AWS Management Console. Navigate to the Users > Summary page and click the “Security credentials” tab.
Secret Access Key: the secret access key can only be viewed once, when the access key is created. See the AWS Access Keys documentation for more information: https://docs.aws.amazon.com/general/latest/gr/aws-sec-cred-types.html#access-keys-and-secret-access-keys
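
As a rough illustration of how these two values are used, the snippet below builds an S3 client from a static key pair with the AWS SDK for Java (v1). The key values and region are placeholders, not the plug-in's actual configuration.

    import com.amazonaws.auth.AWSStaticCredentialsProvider;
    import com.amazonaws.auth.BasicAWSCredentials;
    import com.amazonaws.regions.Regions;
    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;

    public class CredentialsSketch {
        public static void main(String[] args) {
            // Placeholder values; real keys come from the IAM user's "Security credentials" tab.
            BasicAWSCredentials credentials =
                    new BasicAWSCredentials("AKIAEXAMPLEKEYID", "example-secret-access-key");

            AmazonS3 s3 = AmazonS3ClientBuilder.standard()
                    .withCredentials(new AWSStaticCredentialsProvider(credentials))
                    .withRegion(Regions.US_EAST_1)
                    .build();

            System.out.println("Buckets visible to these credentials: " + s3.listBuckets().size());
        }
    }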

Notes

Users who want to upload objects as 'Public' must have the appropriate IAM privileges; otherwise, an "Access Denied" error will be returned.
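
A likely reason, sketched with the AWS SDK for Java (v1) and placeholder names: uploading an object as 'Public' applies a public-read canned ACL, which requires s3:PutObjectAcl in addition to s3:PutObject, and the bucket's Block Public Access settings must permit public ACLs.

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;
    import com.amazonaws.services.s3.model.CannedAccessControlList;
    import com.amazonaws.services.s3.model.PutObjectRequest;

    import java.io.File;

    public class PublicUploadSketch {
        public static void main(String[] args) {
            AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

            // A 'Public' upload sets a public-read canned ACL. Without s3:PutObjectAcl
            // (or with public ACLs blocked on the bucket), S3 rejects the request
            // with Access Denied.
            PutObjectRequest request = new PutObjectRequest(
                    "example-bucket", "public/example.pdf", new File("/tmp/example.pdf"))
                    .withCannedAcl(CannedAccessControlList.PublicRead);

            s3.putObject(request);
        }
    }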

S3 endpoints that restrict traffic by source will need to allow the IPs and VPC endpoints outlined in KB-1582 for the site's region.

Anonymous
  • I found the solution. I had provided the bucket name in the base URL while setting up the connected system, which is why the bucket list was not shown. When I removed the endpoint URL, it worked.

  • Hello, for the Upload Multiple Files operation, could you allow the File Path to be an expression? It only accepts plain text, and we have a requirement to upload multiple files in a single integration call. Kindly respond to the community.

  • Hello, for the Upload Multiple Files operation, could you allow the File Path and Bucket to be expressions? They only accept plain text, which is limiting when you need flexible File Paths and Buckets.

  • I'm unable to view the list of buckets in the bucket drop-down, even though the connection test is successful.

    The List Buckets operation returns all the buckets, but in the other operations the bucket drop-down does not list anything.

  • I am facing the same issue recently. Have you found a solution for this, or do you have a workaround?

  • What was the solution for this?

  • Did you find a solution for this?

  • ListAllBuckets related error:

    I am using the AWS Assume Role Plug-in with the Amazon S3 plug-in. After providing an Access Key ID, Region, Secret Access Key, Role ARN, and Role, I am getting an Access Denied (403) error when using the following role policy:

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Action": [
                    "s3:Put*",
                    "s3:Get*",
                    "s3:Delete*",
                    "s3:List*"
                ],
                "Effect": "Allow",
                "Resource": [
                    "arn:aws:s3:::specificBucket",
                    "arn:aws:s3:::specificBucket/*"
                ]
            }
        ]
    }


    After reading through the past comments, I've identified that this is a ListAllBuckets permissions error, as I've found that the following role policy works (successful connection):
    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Action": [
                    "s3:Put*",
                    "s3:Get*",
                    "s3:Delete*",
                    "s3:List*"
                ],
                "Effect": "Allow",
                "Resource": [
                    "arn:aws:s3:::specificBucket",
                    "arn:aws:s3:::specificBucket/*"
                ]
            },
            {
                "Action": [
                    "s3:ListAllMyBuckets"
                ],
                "Effect": "Allow",
                "Resource": [
                    "*"
                ]
            }
        ]
    }

    However, I cannot grant permissions across all S3 resources, so I need to use the first policy.
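
    For context, the bucket drop-down presumably calls the ListBuckets API, which is governed by s3:ListAllMyBuckets and can only be granted with "Resource": "*"; operations scoped to a single bucket are already covered by the first policy. A minimal sketch of the distinction, assuming the AWS SDK for Java (v1) and the bucket name from the policies above:

    import com.amazonaws.services.s3.AmazonS3;
    import com.amazonaws.services.s3.AmazonS3ClientBuilder;

    public class ListPermissionsSketch {
        public static void main(String[] args) {
            AmazonS3 s3 = AmazonS3ClientBuilder.defaultClient();

            // Needs s3:ListAllMyBuckets on "*" -- the call that fails under the
            // first, bucket-scoped policy.
            s3.listBuckets().forEach(b -> System.out.println(b.getName()));

            // Needs only s3:ListBucket on arn:aws:s3:::specificBucket, which the
            // first policy already allows.
            s3.listObjects("specificBucket").getObjectSummaries()
                    .forEach(o -> System.out.println(o.getKey()));
        }
    }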

  • Hi Team,
    I have a problem with the bucket list in the integration: no buckets are shown to select. The connection is OK, as you can see in the screenshot. Can someone help me, please?

  • v1.12.3 Release Notes
    • Upgraded the ion-java library
    • Fixed the flagged printStackTrace and System.out issues
    • Fixed the versions
    • Added new info logs that output the user ID, Cognito identity ID, and S3 path to tomcat-std.logs