
PERMISSION_DENIED: Permission TABLES_UPDATE_DATA

Stack Overflow user
Asked on 2021-05-09 01:41:29
1 answer · 124 views · 0 followers · 0 votes

I am trying to insert a JSON object into Google BigQuery using a Cloud Function written in Java. However, the code fails with a strange permission error. I want to confirm that the Cloud Function has been granted all the BigQuery permissions needed to write to the table, and I will also verify that the project ID, dataset name, and table name are correct.

When running the following code in a Google Cloud Function, I get a runtime exception. Please help. Error:

2021-05-08 22:52:45.674 IST TopicReaderGCPJFunctionesaj66v5ty43 OnError called: com.google.api.gax.rpc.PermissionDeniedException: io.grpc.StatusRuntimeException: PERMISSION_DENIED: Permission 'TABLES_UPDATE_DATA' denied on resource '<removed>' (or it may not exist).

package functions;

import com.google.api.core.ApiFuture;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.Schema;
import com.google.cloud.bigquery.Table;
import com.google.cloud.bigquery.storage.v1beta2.AppendRowsResponse;
import com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient;
import com.google.cloud.bigquery.storage.v1beta2.CreateWriteStreamRequest;
import com.google.cloud.bigquery.storage.v1beta2.JsonStreamWriter;
import com.google.cloud.bigquery.storage.v1beta2.TableName;
import com.google.cloud.bigquery.storage.v1beta2.WriteStream;
import com.google.protobuf.Descriptors.DescriptorValidationException;
import java.io.IOException;
import java.util.concurrent.ExecutionException;
import org.json.JSONArray;
import org.json.JSONObject;

import java.util.*;

public class WriteCommittedStream {
  // TODO(developer): Replace these variables before running the sample.
  public static boolean writeCommittedStreamToBQ(String projectId, String datasetName, String tableName,
      HashSet<JSONArray> streamHashSet) throws DescriptorValidationException, InterruptedException, IOException {

    try (BigQueryWriteClient client = BigQueryWriteClient.create()) {

      // Debug: print the first element of each incoming JSON array.
      Iterator<JSONArray> value = streamHashSet.iterator();
      while (value.hasNext()) {
        System.out.println(value.next().get(0).toString());
      }
      System.out.println("projectId:" + projectId);
      System.out.println("datasetName:" + datasetName);
      System.out.println("tableName:" + tableName);

      BigQuery bigquery = BigQueryOptions.getDefaultInstance().getService();
      Table table = bigquery.getTable(datasetName, tableName);
      TableName parentTable = TableName.of(projectId, datasetName, tableName);
      Schema schema = table.getDefinition().getSchema();

      System.out.println("Schema:" + schema.toString());
      System.out.println("Table:" + parentTable.toString());

      try (JsonStreamWriter writer = JsonStreamWriter.newBuilder(parentTable.toString(), schema).createDefaultStream()
          .build()) {
       
        // Build a single JSON record that matches the table schema and append
        // it to the default stream.
        JSONObject record = new JSONObject();
        record.put("RPdeviceName", String.format("record %03d",0));
        record.put("RPorganisationName", String.format("record %03d",1));
        record.put("RPdate", String.format("record %03d",2));
        record.put("RPtime", String.format("record %03d",3));
        record.put("RPmacid", String.format("record %03d",4));

        record.put("status", String.format("record %03d",5));
        record.put("mac", String.format("record %03d",6));
        record.put("date", String.format("record %03d",7));
        record.put("time", String.format("record %03d",8));
        record.put("count", String.format("record %03d",9));
        record.put("peakadc", String.format("record %03d",10));
        record.put("reset", String.format("record %03d",11));
        

        JSONArray jsonArr = new JSONArray();
        jsonArr.put(record);
        System.out.println("Initially." + jsonArr.toString());

        ApiFuture<AppendRowsResponse> future = writer.append(jsonArr);
        AppendRowsResponse response = future.get();
        System.out.println("Appended records successfully." + response.toString());
      }

      return true;
    } catch (ExecutionException e) {
      // If the wrapped exception is a StatusRuntimeException, check the state of the
      // operation.
      // If the state is INTERNAL, CANCELLED, or ABORTED, you can retry. For more
      // information, see:
      // https://grpc.github.io/grpc-java/javadoc/io/grpc/StatusRuntimeException.html
      System.out.println("Failed to append records. \n" );
      e.printStackTrace();
      return false;
    }
  }
}
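
One quick way to confirm which identity the Cloud Function actually executes as (and therefore which BigQuery permissions apply) is to log the application-default credentials at startup. This is not part of the question's code; it is a minimal diagnostic sketch that only assumes the google-auth classes the BigQuery client libraries already depend on:

import com.google.auth.ServiceAccountSigner;
import com.google.auth.oauth2.GoogleCredentials;
import java.io.IOException;

public class RuntimeIdentity {
  // Prints the service account email behind the application-default credentials.
  // On Cloud Functions this is the function's runtime service account, i.e. the
  // identity that must hold the BigQuery write permissions.
  public static void log() throws IOException {
    GoogleCredentials credentials = GoogleCredentials.getApplicationDefault();
    if (credentials instanceof ServiceAccountSigner) {
      System.out.println("Running as: " + ((ServiceAccountSigner) credentials).getAccount());
    } else {
      System.out.println("Credential type: " + credentials.getClass().getName());
    }
  }
}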

1 Answer

Stack Overflow user

Answered on 2021-05-10 15:29:50

OK, this should work well for you.

First, if you need two Google Cloud APIs/services (for example, Cloud Functions and BigQuery) to interact with each other, you need to create a service account and then grant that account a predefined role that allows it to interact with the other service; in your case, the BigQuery Admin role (roles/bigquery.admin).

Solution:

  1. Create a service account for the CF.
  2. You will get a JSON file that includes the credential info and an email address.
  3. Grant this email address the BigQuery Admin role - roles/bigquery.admin.
  4. Include all the credential info used by the CF in the CF code. Check this sample code from Google Cloud (see the Java sketch just below this list).
  5. Deploy your CF.

This process works fine for me in Python.
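
For the Java code in the question, step 4 would roughly mean wiring the service-account key into the BigQuery clients explicitly instead of relying on the default credentials. A minimal sketch, assuming the key file from step 2 is packaged with the function (the key path and the helper class are placeholders, not from the original answer):

import com.google.api.gax.core.FixedCredentialsProvider;
import com.google.auth.oauth2.GoogleCredentials;
import com.google.cloud.bigquery.BigQuery;
import com.google.cloud.bigquery.BigQueryOptions;
import com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteClient;
import com.google.cloud.bigquery.storage.v1beta2.BigQueryWriteSettings;
import java.io.FileInputStream;
import java.io.IOException;

public class ExplicitCredentials {
  // Builds a write client that authenticates as the dedicated service account
  // created in step 1, using the key file from step 2.
  public static BigQueryWriteClient newWriteClient(String keyPath) throws IOException {
    GoogleCredentials credentials = GoogleCredentials.fromStream(new FileInputStream(keyPath));
    BigQueryWriteSettings settings = BigQueryWriteSettings.newBuilder()
        .setCredentialsProvider(FixedCredentialsProvider.create(credentials))
        .build();
    return BigQueryWriteClient.create(settings);
  }

  // Same idea for the metadata client used to read the table schema.
  public static BigQuery newBigQuery(String keyPath) throws IOException {
    GoogleCredentials credentials = GoogleCredentials.fromStream(new FileInputStream(keyPath));
    return BigQueryOptions.newBuilder().setCredentials(credentials).build().getService();
  }
}

Alternatively, instead of shipping a key file, the function can be deployed with its runtime service account set to the new account (for example via the --service-account flag of gcloud functions deploy), in which case the original code's default credentials already carry the right identity.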

More information:

Understanding service accounts

Votes: 0
Original content provided by Stack Overflow.
Original link: https://stackoverflow.com/questions/67450448