google-cloud-storage questions


My tech stack is AngularJS, Python, webapp2, and App Engine. I've had immense difficulty figuring out how to do this. The documentation from Google is very poor. I either want to use the Blobstore or ...

I am reading my data file using the following commands: data_dir = arguments['data_dir'] data = pd.read_csv(data_dir + "/train.csv") I am using this data to train my model on Google Cloud ML. I am ...
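
If data_dir points at a gs:// path on Cloud ML, a plain pd.read_csv may fail unless gcsfs is installed; a minimal sketch of one common workaround, using tf.io.gfile (which reads both local and GCS paths) and a hypothetical gs:// value for data_dir:

```python
# Hedged sketch: read train.csv whether data_dir is local or a gs:// path.
# The bucket path below is a placeholder, not taken from the question.
import pandas as pd
import tensorflow as tf

data_dir = "gs://my-bucket/data"  # hypothetical value of arguments['data_dir']
with tf.io.gfile.GFile(data_dir + "/train.csv", "r") as f:
    data = pd.read_csv(f)
```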

When I run gsutil rsync from the GCP Console or a .bat file, the full progress data does not display (it used to, I'm pretty sure). I'm on version 403.0.0. Here is the command: >gsutil rsync -r -n \\...

I'm trying to load the data of a CSV file that is saved in GCS into BigQuery. The CSV file is in UTF-8 format and contains 7 columns. I've specified these columns in the schema (all ...

We are considering alternatives to Pub/Sub due to high costs. For some of our low-value, high-volume data it can get quite expensive. The plan with Pub/Sub: run the service in Kubernetes ...

We have a Google Cloud LB setup with a backend storage bucket and CDN enabled. I added the allUsers member with Storage Object Viewer permissions so we can reach the public data with a normal URL. ...

I can move data into Google Storage buckets using the following: gsutil cp afile.txt gs://my-bucket How do I do the same using the Python API library: from google.cloud import storage storage_client ...
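
A minimal sketch of the gsutil cp equivalent with the Python client library, assuming a recent google-cloud-storage version and reusing the bucket and file names from the command above:

```python
# Sketch: upload a local file to gs://my-bucket/afile.txt with the Python client.
from google.cloud import storage

storage_client = storage.Client()
bucket = storage_client.bucket("my-bucket")
blob = bucket.blob("afile.txt")           # destination object name
blob.upload_from_filename("afile.txt")    # local source file
```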

I have a GCS bucket containing some files in the path gs://main-bucket/sub-directory-bucket/object1.gz I would like to programmatically check if the sub-directory bucket contains one specific file. ...
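
One common way to do this, sketched with the Python client (note that sub-directory-bucket is just an object-name prefix inside main-bucket, not a bucket of its own):

```python
# Sketch: check whether gs://main-bucket/sub-directory-bucket/object1.gz exists.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("main-bucket")
exists = bucket.blob("sub-directory-bucket/object1.gz").exists()
print(exists)  # True if the object is there
```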

For uploading files to Google Cloud buckets, I'm using the JSON API. For that I created a Bearer token using the following commands: $> gcloud auth activate-service-account myaccount@gserviceaccounts.co -...

As per Google's instructions, I create a cors-json-file.json [ { "origin": ["*"], "responseHeader": ["Content-Type"], "method": ["GET"], "maxAgeSeconds": 10 } ] Then ...
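
For reference, a hedged sketch of applying the same CORS policy through the Python client instead of gsutil; the bucket name is a placeholder:

```python
# Sketch: set the CORS policy from cors-json-file.json on a bucket.
from google.cloud import storage

client = storage.Client()
bucket = client.get_bucket("my-bucket")  # placeholder bucket name
bucket.cors = [{
    "origin": ["*"],
    "responseHeader": ["Content-Type"],
    "method": ["GET"],
    "maxAgeSeconds": 10,
}]
bucket.patch()  # persist the change
```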

To upload objects to Google Cloud Storage buckets, I need an authentication token. I have chosen the JSON API method to upload objects to buckets. <?php $ch = curl_init(); curl_setopt($ch, CURLOPT_URL,...

I have a web application written in Go and want to access a local Google Cloud Storage - GCS (not Datastore). I am not using GAE; I will be using GCE. How do I set up GCS locally and access it from Go?...

I have this piece of code: ctx:=context.Background() cliente, err := storage.NewClient(ctx) if err != nil { log.Fatal(err) } clienteCS := cliente.Bucket("prueba123456789") ...

I've been trying for the past couple of hours to set up a transfer from S3 to my Google Storage bucket. The error that I keep getting when creating the transfer is: "Invalid access key. Make sure the ...

I'm trying to use the Google Cloud Storage SDK for Java into my Spring application. Using Maven I've added it to my dependencies: <dependency> <groupId>com.google.cloud</groupId>...

I need to rotate an image according to its orientation and resize it. I get the image orientation from the exif_read_data($source) function. On the dev server, when I do $source="https://bucket.storage....

I want to persist data from a publicly-accessible API that returns a list of JSON objects, one for each of the past N events, when called. The structure of the JSON objects is simple and consistent. N ...

TL;DR. I'm lost as to how to access the data after deleting a PVC, as well as why PV wouldn't go away after deleting a PVC. Steps I'm taking: created a disk in GCE manually: gcloud compute disks ...

When loading data from GCS to BQ, is it possible to set the "max_bad_records" option with no limit, like: bq load --max_bad_records=1 \ prj:dst.tbl gs://bkt/very_large_and_very_messy.log \ ./schema/...
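
max_bad_records only accepts an integer, so there is no literal "unlimited"; a hedged sketch of the same load with a recent Python BigQuery client and a large cap, reusing the project, table, and URI from the bq command and assuming the destination table already exists (otherwise supply a schema as in the command):

```python
# Sketch: GCS-to-BQ load with a high max_bad_records cap.
from google.cloud import bigquery

client = bigquery.Client(project="prj")
job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    max_bad_records=1000000,  # large cap; an "unlimited" value is not supported
)
load_job = client.load_table_from_uri(
    "gs://bkt/very_large_and_very_messy.log",
    "prj.dst.tbl",
    job_config=job_config,
)
load_job.result()  # raises if the job itself fails
```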

I'm trying to get a resumable signed upload URL from Google Cloud Storage, but I'm getting an invalid policy document error. Below is the code I'm using; what am I doing wrong here? Here uploadUrl()...

I use PhotoSwipe in my project. PhotoSwipe requires the image dimensions to be defined. Back when I used PHP that was simple; nowadays, in serverless times, it looks like an issue. How can I get ...

I deployed an instance of Wowza Streaming Engine on Google Cloud, made a bucket in Google Cloud Storage, and mounted it all with GCSFUSE. My bucket connected successfully and I can see into it and ...

Is there any way to set a bucket's default cache control (trying to override the public, max-age=3600 at the bucket level every time a new object is created)? Something similar to defacl, but for cache control.
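
As far as I know there is no bucket-wide default Cache-Control analogous to defacl, so one workaround is to set it per object at upload time; a sketch with the Python client (bucket and object names are placeholders):

```python
# Hedged sketch: override the default Cache-Control on each object at upload.
from google.cloud import storage

client = storage.Client()
bucket = client.bucket("my-bucket")                   # placeholder bucket
blob = bucket.blob("path/to/object.txt")              # placeholder object
blob.cache_control = "private, max-age=0, no-transform"
blob.upload_from_filename("object.txt")               # metadata is set with the upload
```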

Is there a way to get the size in bytes of a folder in Google Cloud Storage through an API? I saw there is a du command for gsutil (https://cloud.google.com/storage/docs/gsutil/commands/du). ...
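
There is no single API call for a folder's size, but the du behaviour can be approximated by summing object sizes under a prefix; a sketch with the Python client (bucket and prefix are placeholders):

```python
# Sketch: total bytes of every object under a "folder" (prefix) in a bucket.
from google.cloud import storage

client = storage.Client()
total_bytes = sum(
    blob.size for blob in client.list_blobs("my-bucket", prefix="my-folder/")
)
print(total_bytes)
```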

I am new to Leaflet and Google Cloud Storage (buckets). I want to render a raster tile map in Leaflet. My tile raster map is in a Google bucket, and now I want to render it in Leaflet. I am ...

I am trying to load Google Cloud Storage files into an on-premise Hadoop cluster. I developed a workaround (a program) to download the files to a local EdgeNode and distcp them to Hadoop. But this seems two-way ...

I want to mount a Google bucket to a local server. However, when I run the line, the directory I point it to is empty. Any ideas? gcsfuse mssng_vcf_files ./mountbucket/ It reports: File system ...

I would like to use Google Storage for backing up my database. However, for security reasons, I would like to use a "service account" with a write-only role. But it seems like this role can also ...

I've exported MySQL Database following the MySQL Export Guide successfully. Now, I'm trying to import MySQL Database following the MySQL Import Guide. I've checked the permissions for the ...

I've been looking for a way to retrieve only the directories inside a bucket, but not what's in them. As per the Google Cloud Storage docs, one can filter by prefix with: const Storage = require('@google-cloud/...
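
The question uses the Node client; the same delimiter trick, sketched with the Python client for illustration, returns only the top-level "directory" prefixes (the bucket name is a placeholder):

```python
# Hedged sketch: list only the "directories" (prefixes) at the top of a bucket.
from google.cloud import storage

client = storage.Client()
iterator = client.list_blobs("my-bucket", delimiter="/")
list(iterator)            # consume the pages so the prefixes get collected
print(iterator.prefixes)  # e.g. {'images/', 'logs/'}
```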

I want to use Keras to train a model on a dataset of 40 GB of images and I'm trying to make the process of reading those images as efficient as possible. Downloading them locally is not an option. ...
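
One common approach, sketched here under the assumption that a recent TensorFlow/Keras is in use, is to stream the images straight from GCS with tf.data rather than downloading them (the bucket path and image size are placeholders):

```python
# Hedged sketch: build a tf.data pipeline that reads images directly from GCS.
import tensorflow as tf

def load_image(path):
    img = tf.io.read_file(path)                 # tf.io reads gs:// paths natively
    img = tf.io.decode_jpeg(img, channels=3)
    return tf.image.resize(img, [224, 224])

dataset = (
    tf.data.Dataset.list_files("gs://my-bucket/images/*.jpg")  # placeholder path
    .map(load_image, num_parallel_calls=tf.data.AUTOTUNE)
    .batch(32)
    .prefetch(tf.data.AUTOTUNE)
)
# model.fit(dataset, epochs=...) would then consume it.
```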

I tried to run the code below by following the Google tutorial I found here: https://cloud.google.com/docs/authentication/production def implicit(): from google.cloud import storage # If ...
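
If the implicit flow fails, the usual cause is that GOOGLE_APPLICATION_CREDENTIALS is not set in the environment; a sketch of the explicit alternative, with a hypothetical key path:

```python
# Hedged sketch: authenticate explicitly with a service-account key file
# instead of relying on GOOGLE_APPLICATION_CREDENTIALS.
from google.cloud import storage

client = storage.Client.from_service_account_json("/path/to/key.json")  # placeholder path
print(list(client.list_buckets()))
```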

I'm trying to add a backend bucket (GCS: Google Cloud Storage) to a Google Kubernetes Engine GCLB (Google Cloud Load Balancer), but a few minutes later it gets rolled back. How can I add a bucket to GKE ...

I created a signed URL for google cloud storage as described in the documentation. I did not apply any optional or as needed headers. With the signed URL, I curl using: curl -v -X PUT filename.jpg "&...
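
For comparison, a sketch of generating a PUT signed URL with a recent google-cloud-storage client (V4 signing; the bucket and object names are placeholders), which could then be used with something like curl -X PUT -T filename.jpg "<signed url>":

```python
# Hedged sketch: create a V4 signed URL that allows a PUT upload for 15 minutes.
from datetime import timedelta
from google.cloud import storage

client = storage.Client()
blob = client.bucket("my-bucket").blob("filename.jpg")  # placeholder names
url = blob.generate_signed_url(
    version="v4",
    expiration=timedelta(minutes=15),
    method="PUT",
)
print(url)
```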

I cannot set the region for a BigQuery dataset when using Direct Runner using Apache Beam. I'm trying to get data from Oracle via JdbcIO.read using Apache Beam to get data and push it to BigQuery ...

Google Cloud provides connectors for working with Hadoop (https://cloud.google.com/hadoop/google-cloud-storage-connector). Using the connector, I receive data from HDFS into Google Cloud Storage, ex)...

I currently have a script which saves a csv file locally and then uploads it to Google Cloud storage. I'm looking to migrate this application to Google App Engine, however I understand you're not ...
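
A hedged sketch of writing the CSV straight from memory to GCS, which avoids relying on a local file at all; the bucket and object names are placeholders:

```python
# Sketch: build the CSV in memory and upload it without touching local disk.
import csv
import io

from google.cloud import storage

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["col1", "col2"])   # placeholder rows
writer.writerow(["a", "b"])

client = storage.Client()
blob = client.bucket("my-bucket").blob("reports/output.csv")  # placeholders
blob.upload_from_string(buf.getvalue(), content_type="text/csv")
```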

How can we delete a Google Cloud Storage object using the JSON API and PHP cURL?
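
The question asks for PHP cURL; as a hedged sketch of the same JSON API call, here it is with Python's requests so the endpoint, URL-encoding, and authorization header are clear (token, bucket, and object name are placeholders):

```python
# Sketch: DELETE an object via the JSON API (storage/v1). A successful delete
# returns HTTP 204 with an empty body.
import urllib.parse

import requests

bucket = "my-bucket"                                              # placeholder
object_name = urllib.parse.quote("path/to/object.txt", safe="")   # object name must be URL-encoded
url = f"https://storage.googleapis.com/storage/v1/b/{bucket}/o/{object_name}"

resp = requests.delete(url, headers={"Authorization": "Bearer ACCESS_TOKEN"})
print(resp.status_code)
```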

I have created a form for uploading files to Google Cloud Storage using the POST Object XML API. This works fine except for files exceeding 50KB. I have tested on Google Chrome and Firefox. Here is ...

I am working with a type of file called a Variant Call Format (VCF) file, which contains genetic variants. I have up to 5000 of these files for individual patient samples in a Google Storage bucket, inside ...

I'm new to App Engine and I can't seem to work out static file serving. I have read through all the official related docs. I understand that tagging the directory or files as static in the app....

We are creating a data pipeline in GCP and facing some issues during testing. Our current architecture is on AWS; to test, we are pushing one copy of the data to Pub/Sub from Lambda in real time. We are facing latency ...

I have a static web client SPA serviced by a REST API. I'm trying to figure out the best way to host these apps on Google's Cloud Platform using App Engine to host the API, and Cloud Storage to host ...

I'm totally new to Google Cloud. I have a simple Java Spring Boot application that provides REST API services, hosted on Google Cloud App Engine. Our users are expected to access our APIs through our ...

I am running a Google compute instance with a coreos container (image name: coreos-stable-1688-4-0-v20180327). Copying files from Storage to the local filesystem with gsutil seems to work fine -- ...

In AWS it was possible to use CloudWatch to trigger callback Lambda functions on events. Is it possible in GCE to automatically tag servers with the user who created them, based on the activity logs? ...

I want to export table data from BigQuery to Google Cloud Storage. The problem is, I need data from date1 to date2 and not the whole table. extract_job = client.extract_table( table_ref, ...
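
extract_table can only export a whole table, so one common pattern is to first query the date range into a temporary table and extract that instead; a hedged sketch with a recent Python BigQuery client (project, dataset, table, and column names are placeholders):

```python
# Sketch: materialise a date slice, then extract it to GCS as CSV.
from google.cloud import bigquery

client = bigquery.Client()

# 1) Query the date range into a (placeholder) destination table.
job_config = bigquery.QueryJobConfig(
    destination="my-project.my_dataset.tmp_date_slice"
)
client.query(
    "SELECT * FROM `my-project.my_dataset.events` "
    "WHERE event_date BETWEEN '2020-01-01' AND '2020-01-31'",
    job_config=job_config,
).result()

# 2) Extract the temporary table to Cloud Storage.
client.extract_table(
    "my-project.my_dataset.tmp_date_slice",
    "gs://my-bucket/export/events-*.csv",
).result()
```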

I am trying to upload a video to a Google Cloud Storage bucket using resumable upload, but I always get the same error: (u'Response headers must contain header', u'location') Here is my code: client =...
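
If building the resumable session by hand is not a hard requirement, a hedged sketch of letting the Python client manage it (setting a chunk_size makes the upload resumable; names are placeholders):

```python
# Sketch: resumable upload handled by the client library via chunked transfer.
from google.cloud import storage

client = storage.Client()
blob = client.bucket("my-bucket").blob("videos/my_video.mp4")  # placeholders
blob.chunk_size = 5 * 1024 * 1024      # multiple of 256 KB => resumable upload
blob.upload_from_filename("my_video.mp4", content_type="video/mp4")
```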

I can store my video in my Google bucket but I can't read it back. Here is my code: url_template = ( u'https://www.googleapis.com/upload/storage/v1/b/{bucket}/o?' u'uploadType=resumable' ) ...

<?php include'vendor\autoload.php'; define("PROJECT_ID", 'projectname'); define("BUCKET_NAME", 'bucketname'); $content=file_get_contents('C:\Users\Useraccount\PhpstormProjects\Projectname\filename....
