Add API Token Cache #16547

Merged
merged 2 commits into go-gitea:main from the cache-successful-tokens branch on Aug 17, 2021

Conversation

@zeripath (Contributor) commented Jul 25, 2021

One of the issues holding back the performance of the API is the cost of hashing. Whilst banning Basic authentication with passwords will help, the API token scheme still requires a PBKDF2 hash on every request - which means that heavy API use (using tokens) can still cause enormous numbers of hash computations.

A partial solution, whilst we consider moving to JWT-based tokens and/or a session-oriented solution, is simply to cache tokens that have already been verified successfully. This has some security implications, but these should be weighed against the security implications of the load caused by hashing.

Related #14668

Signed-off-by: Andrew Thornton <[email protected]>
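
The idea in miniature (a rough sketch, not the exact code in this PR - `verifyToken`, the cache size, and the PBKDF2 parameters are illustrative): keep an in-process LRU of tokens that have already passed verification, and only fall back to the expensive PBKDF2 computation on a cache miss.

```go
package auth

import (
	"crypto/sha256"
	"crypto/subtle"
	"encoding/hex"

	lru "github.com/hashicorp/golang-lru"
	"golang.org/x/crypto/pbkdf2"
)

// successfulTokenCache remembers tokens that have already passed the expensive
// PBKDF2 check, so repeated API requests with the same token skip re-hashing.
// The size (1000) is an arbitrary illustrative choice.
var successfulTokenCache, _ = lru.New(1000)

// hashToken stands in for hashing the raw token with the per-token salt stored
// in the database (parameters here are illustrative, not necessarily Gitea's exact ones).
func hashToken(token, salt string) string {
	return hex.EncodeToString(pbkdf2.Key([]byte(token), []byte(salt), 10000, 50, sha256.New))
}

// verifyToken checks a presented token against the stored hash, consulting the
// in-memory cache before doing any PBKDF2 work.
func verifyToken(tokenID int64, token, storedHash, salt string) bool {
	if cached, ok := successfulTokenCache.Get(tokenID); ok &&
		subtle.ConstantTimeCompare([]byte(cached.(string)), []byte(token)) == 1 {
		return true // verified recently; no hashing needed
	}
	if subtle.ConstantTimeCompare([]byte(hashToken(token, salt)), []byte(storedHash)) != 1 {
		return false
	}
	successfulTokenCache.Add(tokenID, token)
	return true
}
```

Whether such a cache should live in process memory or in an external store is exactly what the review comments below turn on.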

@codecov-commenter commented Jul 25, 2021

Codecov Report

Merging #16547 (0615a36) into main (6a33b29) will increase coverage by 0.00%.
The diff coverage is 68.96%.

@@           Coverage Diff           @@
##             main   #16547   +/-   ##
=======================================
  Coverage   45.43%   45.43%           
=======================================
  Files         749      749           
  Lines       84441    84469   +28     
=======================================
+ Hits        38368    38381   +13     
- Misses      39900    39915   +15     
  Partials     6173     6173           

| Impacted Files | Coverage Δ |
|---|---|
| models/models.go | 55.34% <33.33%> (-0.64%) ⬇️ |
| models/token.go | 73.41% <77.27%> (+1.00%) ⬆️ |
| modules/setting/setting.go | 49.90% <100.00%> (+0.09%) ⬆️ |
| services/mailer/mail_comment.go | 77.77% <0.00%> (-7.41%) ⬇️ |
| modules/queue/queue_bytefifo.go | 73.65% <0.00%> (-3.60%) ⬇️ |
| modules/queue/manager.go | 65.34% <0.00%> (-2.85%) ⬇️ |
| modules/notification/mail/mail.go | 36.27% <0.00%> (-1.97%) ⬇️ |
| routers/api/v1/repo/pull.go | 29.35% <0.00%> (-0.52%) ⬇️ |
| services/pull/pull.go | 41.81% <0.00%> (-0.44%) ⬇️ |
| models/repo_list.go | 77.82% <0.00%> (+0.77%) ⬆️ |

... and 3 more

Legend: Δ = absolute <relative> (impact), ø = not affected, ? = missing data
Powered by Codecov. Last update 6a33b29...0615a36.

@lunny (Member) commented Aug 17, 2021

I think we should use a uniform cache infrastructure to do the caching. Previously we had gitea.com/go-chi/cache; now this adds github.com/hashicorp/golang-lru. And this cache is an in-memory one, which does not scale.

@techknowlogick (Member) commented Aug 17, 2021

> I think we should use a uniform cache infrastructure to do the caching. Previously we had gitea.com/go-chi/cache; now this adds github.com/hashicorp/golang-lru. And this cache is an in-memory one, which does not scale.

This specific cache should only be in memory due to the sensitive nature of its content (API keys).

@lunny (Member) commented Aug 17, 2021

> > I think we should use a uniform cache infrastructure to do the caching. Previously we had gitea.com/go-chi/cache; now this adds github.com/hashicorp/golang-lru. And this cache is an in-memory one, which does not scale.
>
> This specific cache should only be in memory due to the sensitive nature of its content (API keys).

Yeah, Redis or Memcached is also memory. And in fact, we already store session IDs in Redis, which are also sensitive.

@lafriks (Member) commented Aug 17, 2021

> > > I think we should use a uniform cache infrastructure to do the caching. Previously we had gitea.com/go-chi/cache; now this adds github.com/hashicorp/golang-lru. And this cache is an in-memory one, which does not scale.
> >
> > This specific cache should only be in memory due to the sensitive nature of its content (API keys).
>
> Yeah, Redis or Memcached is also memory. And in fact, we already store session IDs in Redis, which are also sensitive.

But this does not need to be shared between instances if we support that in the future, so this can be left as is.

@zeripath (Contributor, Author) commented Aug 17, 2021

This is also speed dependent.

If we are hitting an external cache, be it in-memory or otherwise, the performance increase will be lost, and it would likely be quicker to just perform the hash yourself.
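
To put rough numbers behind that (order-of-magnitude assumptions, not measurements from this PR): a PBKDF2 verification with around 10,000 iterations costs on the order of milliseconds of CPU, an in-process LRU hit costs tens to hundreds of nanoseconds, and a round trip to an external cache such as Redis or Memcached sits somewhere in between - and also moves the sensitive value over the wire. A small benchmark sketch, reusing the illustrative parameters from the sketch above:

```go
package auth

import (
	"crypto/sha256"
	"testing"

	lru "github.com/hashicorp/golang-lru"
	"golang.org/x/crypto/pbkdf2"
)

// BenchmarkPBKDF2Verify measures one uncached token verification.
// Parameters are illustrative, not necessarily Gitea's exact ones.
func BenchmarkPBKDF2Verify(b *testing.B) {
	for i := 0; i < b.N; i++ {
		pbkdf2.Key([]byte("some-api-token"), []byte("some-salt"), 10000, 50, sha256.New)
	}
}

// BenchmarkLRUHit measures what a previously verified token costs on
// subsequent requests when served from an in-process golang-lru cache.
func BenchmarkLRUHit(b *testing.B) {
	cache, _ := lru.New(1000)
	cache.Add(int64(1), "some-api-token")
	b.ResetTimer()
	for i := 0; i < b.N; i++ {
		cache.Get(int64(1))
	}
}
```

Run with `go test -bench=.`; the exact figures vary by hardware - the point is only the relative orders of magnitude behind the comment above.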

@lunny approved these changes Aug 17, 2021
@techknowlogick merged commit e0853d4 into go-gitea:main on Aug 17, 2021
1 check passed
@zeripath deleted the cache-successful-tokens branch on Aug 17, 2021
@lafriks (Member) commented Aug 17, 2021

Btw, do tokens have an expiry time?

@KyKoPho commented Sep 24, 2021

@zeripath Apologies for the imposition, but would it be possible to get this into the next 1.15 patch?

@go-gitea locked and limited conversation to collaborators Oct 19, 2021
Linked issues: None yet

7 participants