Amazon S3

Overview

Manage AWS S3 data stores with Appian! Users can access their S3 objects directly from an Appian interface. Appian documents can be uploaded to an S3 bucket with Server-Side Encryption and configured as Public or Private.

The AWS S3 Connected System Plug-in uses the AWS Java SDK to connect to S3.

Key Features & Functionality

  • Create Bucket -- Adds a new bucket to the S3 instance. A bucket is needed to store objects (files).
  • Upload File -- Uploads a file to any specified bucket on the S3 instance.
  • Upload Multiple Files -- Uploads multiple files to any specified bucket on the S3 instance.
  • List Buckets -- Returns all available buckets on the S3 instance.
  • List Objects -- Returns all available objects from a specified bucket.
  • Delete Bucket -- Permanently removes a bucket from the S3 instance.
  • Delete Object -- Permanently removes an object from a specified bucket.
  • Cognito Authentication -- Pulls credentials from an AWS Cognito Identity Pool.
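
These operations are exposed as connected-system integrations in Appian rather than as code. For orientation only, the sketch below (class and method names are ours, not the plug-in's) maps each operation to the S3 REST call it ultimately corresponds to:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class S3RestMapping {
    // Illustrative mapping from plug-in operation to the underlying S3 REST call.
    static Map<String, String> restCalls(String bucket, String key) {
        Map<String, String> m = new LinkedHashMap<>();
        m.put("Create Bucket", "PUT /" + bucket);                    // PUT Bucket
        m.put("Upload File",   "PUT /" + bucket + "/" + key);        // PUT Object
        m.put("List Buckets",  "GET /");                             // GET Service
        m.put("List Objects",  "GET /" + bucket + "?list-type=2");   // ListObjectsV2
        m.put("Delete Bucket", "DELETE /" + bucket);
        m.put("Delete Object", "DELETE /" + bucket + "/" + key);
        return m;
    }

    public static void main(String[] args) {
        restCalls("reports", "q1.pdf").forEach((op, call) ->
            System.out.println(op + " -> " + call));
    }
}
```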

Requirements

Access Key Id: the access key ID for connecting to S3 can be retrieved from the AWS Management Console. Navigate to the Users > Summary page and click the “Security credentials” tab.
Secret Access Key: the secret access key can only be viewed once, when the access key is created. See the AWS access keys documentation for more information: https://docs.aws.amazon.com/general/latest/gr/aws-sec-cred-types.html#access-keys-and-secret-access-keys
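
The key pair is not sent to S3 directly: the AWS SDK uses the secret access key to derive a Signature Version 4 signing key via chained HMAC-SHA256. A minimal stdlib sketch of that derivation (class name is ours; the key, date, region, and service values in `main` are the example test vector from the AWS SigV4 documentation):

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;

public class SigV4Key {
    static byte[] hmac(byte[] key, String data) {
        try {
            Mac mac = Mac.getInstance("HmacSHA256");
            mac.init(new SecretKeySpec(key, "HmacSHA256"));
            return mac.doFinal(data.getBytes(StandardCharsets.UTF_8));
        } catch (Exception e) {
            throw new IllegalStateException(e);
        }
    }

    // SigV4: secret -> date -> region -> service -> "aws4_request".
    static byte[] signingKey(String secret, String date, String region, String service) {
        byte[] kDate    = hmac(("AWS4" + secret).getBytes(StandardCharsets.UTF_8), date);
        byte[] kRegion  = hmac(kDate, region);
        byte[] kService = hmac(kRegion, service);
        return hmac(kService, "aws4_request");
    }

    static String hex(byte[] bytes) {
        StringBuilder sb = new StringBuilder();
        for (byte b : bytes) sb.append(String.format("%02x", b));
        return sb.toString();
    }

    public static void main(String[] args) {
        // Values from the AWS SigV4 "derive a signing key" documentation example.
        System.out.println(hex(signingKey(
                "wJalrXUtnFEMI/K7MDENG+bPxRcfiCYEXAMPLEKEY",
                "20120215", "us-east-1", "iam")));
    }
}
```

This is why the secret access key is never transmitted: only signatures derived from it appear on the wire.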

Notes

Users who want to upload objects as 'Public' must have the correct IAM privileges; otherwise an "Access Denied" error is returned.
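
A minimal IAM policy sketch of what those privileges typically look like (the bucket name is a placeholder); uploading with a public ACL generally needs `s3:PutObjectAcl` in addition to `s3:PutObject`:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": ["s3:PutObject", "s3:PutObjectAcl"],
      "Resource": "arn:aws:s3:::example-bucket/*"
    }
  ]
}
```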

S3 endpoints that restrict traffic by source must allow the IP and VPC endpoints outlined in KB-1582 for the site's region.

Anonymous
  • I have the same issue with the 1.12.4 plugin version. Has anyone found a solution?

  • I have the same issue.

    With the previous update, we could have a connected system with PrivateLink and the integration object without PrivateLink, and it worked.

    Now, with the new version, you can't select it in the integration object, so if you use PrivateLink in the connected system, you must also use it in the integration.

    I'm not sure if this is a bug in the plug-in or a misconfiguration on the AWS S3 side.

    Any help please?

  • Hello,

    After the update, the connection doesn't work anymore.

    When I try to access any integration that uses the CS configured with this plug-in, it shows this message:

    Expression evaluation error [evaluation ID = 36e75:71134]: com.appiancorp.connectedsystems.templateframework.functions.pipeline.proxyDecoratorPipeline.TemplateInvocationException: Contact the developer of this template to resolve the following issue. com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: RCC2TB81YW4GGCGB; S3 Extended Request ID: IHuWAXBKyNedMu13Zw03sfVBYpcVfe9e68XJSDkxjYVZa0CaflUAZvK6nEk/Urs1LVvx/ZfC50Q=; Proxy: null)

    And when I test the connection in the Connected System, it shows the following error:

    Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: 8460E1VZ8SR6YCBJ; S3 Extended Request ID: 56TGXSkIRZhw9QLlZL5ttj0rZlWpbYJeyosJksKq+Twa6TKamZRKx5+gNZHGXB2kV8SNucO7z/k=; Proxy: null)

    Please I need help urgently.

  • Do we have limitations on file size to be transferred? What is the maximum file size that it currently supports?

  • Do we have any update on the above comment?

  • I am getting an error after upgrading to the new version:

    com.appiancorp.connectedsystems.templateframework.functions.pipeline.proxyDecoratorPipeline.TemplateInvocationException: Contact the developer of this template to resolve the following issue. java.lang.NullPointerException: null

    Any ideas what this is?

  • v1.12.4 Release Notes
    • [MIGRATED] Manage S3 buckets and objects. Create Bucket. Upload Files. List Buckets. List Objects. Delete Bucket. Delete Object.
    • AWS S3 Connected System Plug-in.
    • Fixes PrivateLink connection.
    Note: When using PrivateLink or Assumed Roles, Access Key Id & Secret Access Key are still required.

  • I found the solution: I had provided the bucket name in the base URL while setting up the connected system, so the bucket list was not shown. When I removed the endpoint URL, it worked.

  • Hello, for the Upload Multiple Files operation, could you allow the File Path to be an expression? It only accepts plain text, and we have a requirement to upload multiple files in a single integration call. Kindly respond to the community.

  • Hello, for the Upload Multiple Files operation, could you allow the File Path and Bucket to be expressions? They only accept plain text, which is limiting when you need flexible file paths and buckets.