
AccessDeniedException: 403 #5

Open
several27 opened this issue Feb 25, 2018 · 85 comments

Comments

@several27

several27 commented Feb 25, 2018

Hi, I tried downloading the dataset but no matter what I do, I keep getting:

AccessDeniedException: 403 **********@gmail.com does not have storage.objects.list access to open-images-dataset.

when running

gsutil -m rsync -r gs://open-images-dataset/validation .

I have tried signing up multiple times here: http://www.cvdfoundation.org/datasets/open-images-dataset/signup.html and I keep getting the success message, but I still can't seem to get access to download the data.

You have successfully signed up for downloading Open Images Dataset

Am I doing anything wrong, or is the website that's supposed to grant me access not working? Also, is the access specific to a Google Cloud project?

@fruel

fruel commented Feb 25, 2018

Same here. Just wanted to create an issue as well. I have been trying to get access since Friday. I tried signing up 3 times already.

@mkocabas

Yeah, I have the same issue. It seems there is a problem with CVDF authentication. @tylin and @sergebelongie, could you please sort it out?

@xaionaro

The same problem.

AccessDeniedException: 403 ****@**** does not have storage.objects.list access to open-images-dataset.

@tylin
Contributor

tylin commented Feb 26, 2018

Hello everyone, I am looking into the problem. I will post an update here once the issue is resolved.

@virgilpetcu

Hi, any updates on this issue @tylin ? Thanks !

@tylin
Contributor

tylin commented Mar 8, 2018

Sorry for taking so long to figure this out. You should now be able to use the form to obtain access to the bucket.
Please try:
(1) fill in the request form and submit it
(2) run the gsutil command and verify that you have access

It seems the automatic confirmation email is broken; I will keep tracking that problem.
In the meantime, please let me know if you still have trouble accessing the bucket.

@several27
Author

Working for me! Huge thanks @tylin 🤗

(Still haven't got any email though)

@jjdengjj

jjdengjj commented Apr 2, 2018

@tylin The problem still persists.
AccessDeniedException: 403 [email protected] does not have storage.objects.list access to open-images-dataset.

Problem solved! It seems the request sign-up page cannot handle email addresses with special characters in them. Here is my case, which may help identify the problem. @tylin
xxxx.xxxx@gmail => Failed
[email protected] => Passed

@TurnaevEvgeny

Hi,
Same issue here. Tried different accounts.
The submission form always says "You have successfully signed up for downloading Open Images Dataset."

gsutil -m rsync -r gs://open-images-dataset/validation .
AccessDeniedException: 403 [email protected] does not have storage.objects.list access to open-images-dataset.
No special chars or dots in email.

@tylin Any clues? Thanks!

@tvicente

tvicente commented May 2, 2018

I have this problem for Open Images V4. Thanks in advance!

@tylin
Contributor

tylin commented May 2, 2018

I just followed the instructions and got access after submitting the form. If your problem persists, please send me an email from your associated Gmail account.

@tvicente

tvicente commented May 2, 2018 via email

@TurnaevEvgeny

Hi @tylin, Started to work today. I didn't do anything. Thanks!

@tylin
Contributor

tylin commented May 2, 2018

@tvicente Could you try again now?

@Spark001

Spark001 commented May 3, 2018

Hi @tylin, same issue for Open Images V4 after I followed the instructions.
My email is [email protected].
And how can I check with gsutil what access permissions I have?

@Halo9Pan

Halo9Pan commented May 3, 2018

Hi @tylin, I have the same AccessDeniedException problem. I have already sent the request form.
My account is [email protected].
Thanks.

@SophieZhou

I have run into the same problem, even though I have already sent the request form. My email is [email protected]
I tried yesterday and today, and it failed both times.

@tvicente

tvicente commented May 3, 2018

@tylin thanks a lot it works now!

@Cento2

Cento2 commented May 3, 2018

I have the same error, my account is [email protected]

@Frank2015a

I have the same problem, my account is [email protected]

@wicken

wicken commented May 3, 2018

I also have the problem, [email protected]

@tylin
Contributor

tylin commented May 3, 2018

@wicken @Frank2015a @Cento2 you should have access now.

@Halo9Pan

Halo9Pan commented May 4, 2018

Hi, @tylin ,please add my account [email protected] into the access list.
Thanks.

@hongsikkim6

Please add me into the list. [email protected].
Thank you!

@Trinkle23897

Please add me into the list. [email protected]
Thank you!

@Trinkle23897

but my computer gives different output...

n+e:/media/trinkle/TOSHIBA EXT gcloud auth login

......

WARNING: gcloud auth login no longer writes application default credentials.
If you need to use ADC, see:
gcloud auth application-default --help

You are now logged in as [[email protected]].
Your current project is [None]. You can change this setting by running:
$ gcloud config set project PROJECT_ID

n+e:/media/trinkle/TOSHIBA EXT gsutil -m rsync -r gs://open-images-dataset/ .
Building synchronization state...
Caught non-retryable exception while listing gs://open-images-dataset/: ServiceException: 401 Anonymous caller does not have storage.objects.list access to open-images-dataset.
CommandException: Caught non-retryable exception - aborting rsync

n+e:/media/trinkle/TOSHIBA EXT gsutil -m rsync -r gs://open-images-dataset/train .
ServiceException: 401 Anonymous caller does not have storage.objects.list access to open-images-dataset.

I have tried filling in the form at http://www.cvdfoundation.org/datasets/open-images-dataset/signup.html many times, and gsutil keeps reporting these errors...

@tylin
Contributor

tylin commented May 4, 2018

@hongsikkim6 @Trinkle23897 You already had access to the bucket. @Halo9Pan You should have access to the bucket now.

@EliasVansteenkiste

Hi @tylin could you add me to the access list?
[email protected]
Thanks in advance

@Noplz

Noplz commented May 7, 2018

Hi @tylin same issue here. AccessDeniedException: 403 [email protected] does not have storage.objects.list access to open-images-dataset. Could you please add me to the access list? Thanks !

@shesung

shesung commented May 7, 2018

Same problem. Please add me into the list. [email protected]
Thanks!

@kimhyongook

@tylin Please add me into the list. [email protected]
Thank you!

@guang-chen

@tylin same issue. Please add me [email protected] thanks!

@batracos

@tylin, not sure whether this is the best place, but it seems to have become the go-to place for this authentication issue.
Can you please add [ [email protected] ]?
Thanks a lot

@DamonLiuTHU

plz add [[email protected]] to the list!
Thanks a lot @tylin

@honey-rjj

@tylin same issue. Please add me [email protected] thanks!

@MikuZZZ

MikuZZZ commented Jul 11, 2018

Please add me [email protected] thanks a lot! @tylin

@dhimanpd

@tylin Can you add [email protected]

@tylin
Contributor

tylin commented Jul 11, 2018

Sorry to everyone who has had the download problem. We copied the data from Google Cloud to Amazon Web Services; it now has a copy in the s3://open-images-dataset bucket. Note that you do not need to register to access this bucket. Please follow the instructions in the updated README to download images.

@cxz

cxz commented Jul 11, 2018

Still not working for me:

$ aws s3 sync s3://open-images-dataset/train train
fatal error: An error occurred (AccessDenied) when calling the ListObjects operation: Access Denied

@tylin
Contributor

tylin commented Jul 11, 2018

@cxz I fixed the ListObjects policy. Could you try again? Thanks!

@cxz

cxz commented Jul 11, 2018

Same error.

@tylin
Contributor

tylin commented Jul 11, 2018

@cxz Did you try aws s3 --no-sign-request sync s3://open-images-dataset/train train?

@cxz

cxz commented Jul 11, 2018

Now it works, thanks!

@guang-chen

The AWS S3 works for me. Thanks.

@cxz

cxz commented Jul 12, 2018

Running this command is going to create 1.7M files in the same directory. I suspect my filesystem does not support that.
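Before kicking off a sync of that size, a cheap sanity check (assuming a GNU/Linux target) is to confirm the destination filesystem has enough free inodes, since every downloaded file consumes one:

```shell
# ~1.7M files need ~1.7M free inodes on the destination filesystem.
# ext4 defaults are usually generous, but small or specially formatted
# volumes can run out of inodes before they run out of disk space.
df -i .
```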

@honey-rjj

@tylin Can you add one for me? I want to download a single category, not everything. [[email protected]]

@tylin
Contributor

tylin commented Jul 12, 2018

@honey-rjj You can download individual images from the S3 bucket; you don't necessarily need to download from the GCS bucket.

@jponttuset

jponttuset commented Jul 12, 2018

@cxz Modern filesystems (e.g. EXT4 or NTFS) won't have any problem with this number of files; the limit is 4,294,967,295. See https://stackoverflow.com/questions/466521/how-many-files-can-i-put-in-a-directory/466596#466596

@cxz

cxz commented Jul 12, 2018

@jponttuset Thanks for the info. Maybe it's not a filesystem limit, but what I observe is that as the number of files increases, writing to (or reading from) the directory becomes increasingly slow. My workaround was to download with a custom script instead of aws s3 sync or gsutil rsync.
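For anyone writing a similar custom script, one common pattern (a sketch with a hypothetical image ID, not cxz's actual script) is to shard downloads into nested subdirectories keyed by the filename prefix, so no single directory ever holds millions of entries:

```shell
# Derive a two-level shard directory from the first four characters of the
# image filename, e.g. train/00/00/000002b66c9c498e.jpg.
f="000002b66c9c498e.jpg"          # hypothetical Open Images filename
dir="train/${f:0:2}/${f:2:2}"
mkdir -p "$dir"
# A real script would download straight into the shard, e.g.:
#   aws s3 --no-sign-request cp "s3://open-images-dataset/train/$f" "$dir/"
echo "$dir/$f"
```

Because the image IDs are hex strings, the prefix spreads files roughly evenly across up to 256x256 directories.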

@jponttuset

@cxz I have not experienced that myself; I downloaded at a consistent speed from beginning to end, and I don't see any slowness reading the files.
I use EXT4, by the way; maybe some other filesystems are not as good.

@cxz

cxz commented Jul 12, 2018

I'm using ext4 as well. Maybe it's related to the OS.

@DamonLiuTHU

Why not create, say, ten zip files on AWS so that we can sync them one by one?
Downloading so many files into a single directory is just too error-prone.
I have run into the filesystem limit problem myself (the admin of our Linux server set a file-count limit per directory).

@kekedan

kekedan commented Sep 12, 2018

@tylin Can you add [email protected] thanks!

@jponttuset

@kekedan,
Current instructions point you to download the images from AWS, where you do not need permissions. The old Google Cloud bucket is no longer in use for new users.

@DamonLiuTHU
CVDF made ZIPs available :)

@ahmed123abc

Hi, I'm having the same issue:
"2019/03/11 13:06:26 Failed to copyto: googleapi: Error 403: [email protected] does not have storage.objects.get access to rclone/placehere., forbidden"
The email I'm using is keysightsd2019; can you help fix the issue by adding me?

@jponttuset

@ahmed123abc
Current instructions point you to download the images from AWS or Figure Eight, where you do not need permissions. The old Google Cloud bucket is no longer in use for new users.

It also looks like you're trying to copy to the bucket?

@ahmed123abc

ahmed123abc commented Mar 11, 2019 via email

@huyhieu2135

Hello @tylin pls add me [email protected]. Thanks!

@MinKangChew

Hi @tylin, I have submitted the request form. Can you please add me in? [email protected]
Thanks in advance!

@jponttuset

Current instructions point you to download the images from AWS or Figure Eight, where you do not need permissions. The old Google Cloud bucket is no longer in use for new users.
