Save scraper logs in csv #73
base: master
Conversation
The current implementation generates a separate log.csv file for each day, which complicates tasks such as reviewing logs from the past X days, since it requires opening X different files. This approach is similarly cumbersome when fetching logs for a specific scraper, as we might need to do after (or as part of) bitcoinsearch/bitcoinsearch-app#137.

Switching to a single .csv file per scraper would streamline this process. By including an ISO 8601 timestamp as the first column in the .csv file, consistent with the created_at field in the Elasticsearch documents, logs can be more easily filtered and accessed based on both time and scraper.
Force-pushed the …-logs branch from a4bd36f to 2112dad.
Conflicts resolved in: bitcoinops/main.py, common/elasticsearch_utils.py
Create scraper logs for each scraper in an individual csv file