
Instead of blocking scraping, can't we just put a limit on it? #19

Open
fiatjaf opened this issue Mar 12, 2024 · 2 comments

Comments


fiatjaf commented Mar 12, 2024

The `allow_scrape_if_limited_to` setting requires a limit to be specified, but we'll never get clients to adhere to such requirements.

Instead, why not have an option to impose a hard limit on scraping queries (or even on all queries)? Then if someone requests all our events we can just return the last 50 or something.
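
(A minimal sketch of what that clamping could look like; `scrape_hard_limit` and `effective_limit` are purely illustrative names, not actual chorus config or code:)

```rust
/// Hypothetical config field, not an actual chorus setting.
struct Config {
    /// Maximum number of events returned for a scrape-like query,
    /// applied even when the client supplies no limit at all.
    scrape_hard_limit: usize,
}

/// Clamp the effective limit of an incoming filter instead of rejecting it.
fn effective_limit(filter_limit: Option<usize>, config: &Config) -> usize {
    match filter_limit {
        Some(l) => l.min(config.scrape_hard_limit),
        None => config.scrape_hard_limit,
    }
}

fn main() {
    let config = Config { scrape_hard_limit: 50 };
    // A client asking for "everything" (no limit) still only gets 50 events.
    assert_eq!(effective_limit(None, &config), 50);
    // A polite client asking for 20 events still gets 20.
    assert_eq!(effective_limit(Some(20), &config), 20);
}
```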

@mikedilger (Owner) commented

Interesting. I'm torn though.

I don't want to return results that I know are incorrect (incomplete); I'd rather return an error saying we can't fulfill the request.

@mikedilger (Owner) commented

Maybe we could have a config option that specifies a max age, and then allow scrapes (backwards in time) up to that max age.
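
(One way to read this, as a rough sketch; `scrape_max_age_seconds` and `clamp_since` are illustrative names, not actual chorus config or code. The scrape's lower time bound is clamped to now minus the max age, so the query is still answered, and answered completely within the advertised window:)

```rust
use std::time::{SystemTime, UNIX_EPOCH};

/// Hypothetical setting, not an actual chorus config name.
struct Config {
    /// How far back in time (in seconds) scrapes are allowed to reach.
    scrape_max_age_seconds: u64,
}

/// Clamp a scrape's lower time bound so it never reaches further back
/// than `now - scrape_max_age_seconds`. Events inside that window are
/// still returned completely, so the relay isn't silently truncating.
fn clamp_since(filter_since: Option<u64>, config: &Config) -> u64 {
    let now = SystemTime::now()
        .duration_since(UNIX_EPOCH)
        .expect("system clock before UNIX epoch")
        .as_secs();
    let cutoff = now.saturating_sub(config.scrape_max_age_seconds);
    match filter_since {
        Some(since) => since.max(cutoff), // honor a stricter client bound
        None => cutoff,                   // unbounded scrape: start at the cutoff
    }
}

fn main() {
    let config = Config { scrape_max_age_seconds: 7 * 24 * 3600 }; // one week
    // An unbounded REQ effectively becomes "everything from the last week".
    println!("effective since = {}", clamp_since(None, &config));
}
```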
