
Accept urls with lengths larger than 2040 characters #279

Open
eoss-tw opened this issue Feb 12, 2020 · 8 comments
Labels
enhancement New feature or request

Comments

eoss-tw commented Feb 12, 2020

Hi, I know the limitation makes sense for compatibility reasons, but we want to use Kutt for service URLs that encode their configuration in the URL (e.g. Kibana). Those URLs can grow quite large, and modern browsers handle that fine. I made a quick change for v1 and v2 of your API, but I get an error when resolving the URLs: I can insert the long URLs into the DB, but the new link returns a 500. I am using nginx as a proxy, so maybe that is the bottleneck. Is it possible to increase logging to dig any further?

Thanks
Thilo
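For anyone hitting the same wall: the nginx request-header buffers are the usual culprit when long request lines get rejected at the proxy. A minimal sketch using standard nginx directives (the sizes here are illustrative assumptions, not values from this thread):

```nginx
# Allow longer request lines/headers through the proxy.
# client_header_buffer_size covers the common case; requests that
# don't fit spill over into large_client_header_buffers.
client_header_buffer_size 16k;
large_client_header_buffers 4 32k;
```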

eoss-tw (Author) commented Feb 12, 2020

OK, nginx was indeed the bottleneck. Maybe you could move the limit into the config, so it drives both the table generation and the validator.
Thilo

poeti8 (Contributor) commented Feb 16, 2020

The problem with making it configurable is that the table needs to be updated every time. What's the ideal max limit for you?
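For context, making the limit configurable would mean running a schema migration whenever the value changes. A hypothetical knex-style sketch of what that migration could look like (the table/column names `links`/`target` and both lengths are assumptions for illustration, not taken from this thread):

```javascript
// Hypothetical knex migration: widen the column that stores target URLs.
async function up(knex) {
  await knex.schema.alterTable("links", (table) => {
    // Widen from the old limit to the new configured maximum.
    table.string("target", 8000).alter();
  });
}

async function down(knex) {
  await knex.schema.alterTable("links", (table) => {
    // Revert to the previous limit.
    table.string("target", 2040).alter();
  });
}
```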

eoss-tw (Author) commented Feb 17, 2020 via email

poeti8 added the enhancement label Mar 9, 2020
trgwii (Member) commented Sep 24, 2020

This idea sounds terrible; I don't think I would want it to support more than 2 000 characters in URLs, and certainly not more than 8 000.

Even the Node.js core HTTP implementation supports at most 50 000 characters in total for the entire request-header payload, so if you keep raising the limit you'll eventually run into platform problems anyway (meaning you would need to find a service NOT written in Node).

eoss-tw (Author) commented Sep 24, 2020 via email

trgwii (Member) commented Sep 25, 2020

> On the backend side it should not matter/restrict anything.

I disagree. Maximum data lengths matter for relational databases: they affect the overall table size and the size per row, as well as query speed.

eoss-tw (Author) commented Sep 25, 2020 via email

chq-matteo commented

I was wondering if we could revisit this.

> The problem with making it configurable is that the table needs to be updated every time. What's the ideal max limit for you?

I see; that's probably not great, and we probably don't want to update it often.

However, based on RFC 9110, maybe it's worth bumping the limit to 8000?

> It is RECOMMENDED that all senders and recipients support, at a minimum, URIs with lengths of 8000 octets in protocol elements. Note that this implies some structures and on-wire representations (for example, the request line in HTTP/1.1) will necessarily be larger in some cases.
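If the limit were bumped, the validator check itself is straightforward; a sketch against the 8000-octet recommendation (the constant comes from RFC 9110's wording, the function name is made up, and note the RFC talks about octets, i.e. bytes, not JS string characters):

```javascript
// RFC 9110 recommends supporting URIs of at least 8000 octets.
const MAX_URL_OCTETS = 8000;

// Hypothetical validator helper: measure in UTF-8 bytes, not characters.
function isAcceptableUrlLength(url) {
  return Buffer.byteLength(url, "utf8") <= MAX_URL_OCTETS;
}

console.log(isAcceptableUrlLength("https://example.com/?q=1")); // true
console.log(isAcceptableUrlLength("https://example.com/?" + "a".repeat(8000))); // false
```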



4 participants