This post covers how to upload files to AWS S3 using AWS AppSync.

Problem Description

I am following this docs/tutorial in the AWS AppSync documentation.

It states:

However, I cannot get my file to upload to my S3 bucket. I understand that the tutorial is missing a lot of details. More specifically, it does not say that NewPostMutation.js needs to be changed.

I changed it the following way:

import gql from 'graphql-tag';

export default gql`
mutation AddPostMutation($author: String!, $title: String!, $url: String!, $content: String!, $file: S3ObjectInput ) {
    addPost(
        author: $author
        title: $title
        url: $url
        content: $content
        file: $file
    ){
        __typename
        id
        author
        title
        url
        content
        version
    }
}
`
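When calling this mutation, the `$file` variable must carry the S3ObjectInput metadata. Here is a minimal sketch of building those variables from a browser `File` object - the bucket name, region, and key prefix are placeholders you would replace with your own, and `localUri` is where the AppSync JS client reads the file contents from:

```javascript
// Build the variables object for AddPostMutation.
// 'my-uploads-bucket', 'us-east-1', and the 'uploads/' prefix are placeholders.
function buildPostVariables(post, fileObj) {
    return {
        author: post.author,
        title: post.title,
        url: post.url,
        content: post.content,
        file: {
            bucket: 'my-uploads-bucket',     // placeholder bucket name
            region: 'us-east-1',             // placeholder region
            key: `uploads/${fileObj.name}`,  // S3 key to store the file under
            mimeType: fileObj.type,
            localUri: fileObj,               // the AppSync client uploads from here
        },
    };
}
```

You would then pass the result to the client, e.g. `client.mutate({ mutation: AddPostMutation, variables: buildPostVariables(post, file) })`.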

Yet, even after implementing these changes, the file did not get uploaded...

Answer

There are a few moving parts under the hood that you need to make sure are in place before this "just works" (TM). First of all, you need to make sure you have an appropriate input and type for an S3 object defined in your GraphQL schema:

enum Visibility {
    public
    private
}

input S3ObjectInput {
    bucket: String!
    region: String!
    localUri: String
    visibility: Visibility
    key: String
    mimeType: String
}

type S3Object {
    bucket: String!
    region: String!
    key: String!
}
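The model that embeds the file then references S3Object as a field. A sketch of such a model, assuming a Post type with the fields returned by the mutation above (your actual model may differ):

```graphql
type Post {
    id: ID!
    author: String!
    title: String!
    url: String!
    content: String!
    version: Int!
    file: S3Object
}
```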

The S3ObjectInput type, of course, is for use when uploading a new file - either by way of creating or updating a model within which said S3 object metadata is embedded. It can be handled in the request resolver of a mutation via the following:

{
    "version": "2017-02-28",
    "operation": "PutItem",
    "key": {
        "id": $util.dynamodb.toDynamoDBJson($ctx.args.input.id)
    },

    #set( $attribs = $util.dynamodb.toMapValues($ctx.args.input) )
    #set( $file = $ctx.args.input.file )
    #set( $attribs.file = $util.dynamodb.toS3Object($file.key, $file.bucket, $file.region, $file.version) )

    "attributeValues": $util.toJson($attribs)
}

This is making the assumption that the S3 file object is a child field of a model attached to a DynamoDB datasource. Note that the call to $util.dynamodb.toS3Object() sets up the complex S3 object file, which is a field of the model with a type of S3ObjectInput. Setting up the request resolver in this way handles the upload of a file to S3 (when all the credentials are set up correctly - we'll touch on that in a moment), but it doesn't address how to get the S3Object back. This is where a field-level resolver attached to a local datasource becomes necessary. In essence, you need to create a local datasource in AppSync and connect it to the model's file field in the schema with the following request and response resolvers:

## Request Resolver ##
{
    "version": "2017-02-28",
    "payload": {}
}

## Response Resolver ##
$util.toJson($util.dynamodb.fromS3ObjectJson($context.source.file))

This resolver simply tells AppSync that we want to take the JSON string that is stored in DynamoDB for the file field of the model and parse it into an S3Object - this way, when you do a query of the model, instead of returning the string stored in the file field, you get an object containing the bucket, region, and key properties that you can use to build a URL to access the S3 Object (either directly via S3 or using a CDN - that's really dependent on your configuration).
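For the direct-to-S3 case, building that URL from the returned object is a one-liner. A sketch using the standard virtual-hosted-style S3 URL (a CDN setup would substitute its own host):

```javascript
// Build a virtual-hosted-style S3 URL from the S3Object returned by a query.
// Slashes in the key are preserved as path separators.
function s3ObjectUrl({ bucket, region, key }) {
    const path = encodeURIComponent(key).replace(/%2F/g, '/');
    return `https://${bucket}.s3.${region}.amazonaws.com/${path}`;
}

// s3ObjectUrl({ bucket: 'my-uploads-bucket', region: 'us-east-1', key: 'uploads/photo.png' })
//   => 'https://my-uploads-bucket.s3.us-east-1.amazonaws.com/uploads/photo.png'
```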

Do make sure you have credentials set up for complex objects, however (told you I'd get back to this). I'll use a React example to illustrate this - when defining your AppSync parameters (endpoint, auth, etc.), there is an additional property called complexObjectsCredentials that needs to be defined to tell the client what AWS credentials to use to handle S3 uploads, e.g.:

import AWSAppSyncClient, { AUTH_TYPE } from 'aws-appsync';
import { Auth } from 'aws-amplify';
// AppSync here is your app's AppSync configuration object (endpoint, region, etc.)

const client = new AWSAppSyncClient({
    url: AppSync.graphqlEndpoint,
    region: AppSync.region,
    auth: {
        type: AUTH_TYPE.AWS_IAM,
        credentials: () => Auth.currentCredentials()
    },
    // credentials used by the client when uploading complex (S3) objects
    complexObjectsCredentials: () => Auth.currentCredentials(),
});

Assuming all of these things are in place, S3 uploads and downloads via AppSync should work.
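As a quick sanity check, a query against the model should now return the parsed S3 metadata rather than the raw stored string. A sketch (getPost is a hypothetical query field; substitute whatever query your schema defines):

```graphql
query GetPost {
    getPost(id: "some-post-id") {
        id
        title
        file {
            bucket
            region
            key
        }
    }
}
```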
