fix: fix crawl pixiv #66

Merged: 2 commits merged into main from fix-crawl-pixiv on Dec 4, 2024

Conversation

@RyouMon (Owner) commented Dec 4, 2024

(FavoritesCrawler310) C:\Users\Wen\OneDrive\Projects\FavoritesCrawler>favors crawl pixiv
2024-12-04 13:20:41 [scrapy.utils.log] INFO: Scrapy 2.12.0 started (bot: favorites_crawler)
2024-12-04 13:20:41 [scrapy.utils.log] INFO: Versions: lxml 5.3.0.0, libxml2 2.11.7, cssselect 1.2.0, parsel 1.9.1, w3lib 2.2.1, Twisted 24.10.0, Python 3.10.15 | packaged by Anaconda, Inc. | (main, Oct  3 2024, 07:22:19) [MSC v.1929 64 bit (AMD64)], pyOpenSSL 24.3.0 (OpenSSL 3.4.0 22 Oct 2024), cryptography 44.0.0, Platform Windows-10-10.0.22631-SP0
2024-12-04 13:20:41 [scrapy.addons] INFO: Enabled addons:
[]
2024-12-04 13:20:41 [scrapy.utils.log] DEBUG: Using reactor: twisted.internet.selectreactor.SelectReactor
2024-12-04 13:20:41 [scrapy.middleware] INFO: Enabled extensions:
['scrapy.extensions.corestats.CoreStats',
 'scrapy.extensions.closespider.CloseSpider',
 'scrapy.extensions.logstats.LogStats']
2024-12-04 13:20:41 [scrapy.crawler] INFO: Overridden settings:
{'BOT_NAME': 'favorites_crawler',
 'CLOSESPIDER_ITEMCOUNT': 5,
 'DOWNLOAD_DELAY': 0.5,
 'DOWNLOAD_WARNSIZE': 0,
 'NEWSPIDER_MODULE': 'favorites_crawler.spiders',
 'SPIDER_MODULES': ['favorites_crawler.spiders'],
 'STATS_DUMP': False,
 'TELNETCONSOLE_ENABLED': False,
 'USER_AGENT': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 '
               '(KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36 '
               'Edg/131.0.0.0'}
Unhandled error in Deferred:
2024-12-04 13:20:41 [twisted] CRITICAL: Unhandled error in Deferred:

Traceback (most recent call last):
  File "C:\Users\Wen\miniconda3\envs\FavoritesCrawler310\lib\site-packages\twisted\internet\defer.py", line 2017, in _inlineCallbacks
    result = context.run(gen.send, result)
  File "C:\Users\Wen\miniconda3\envs\FavoritesCrawler310\lib\site-packages\scrapy\crawler.py", line 152, in crawl
    self.engine = self._create_engine()
  File "C:\Users\Wen\miniconda3\envs\FavoritesCrawler310\lib\site-packages\scrapy\crawler.py", line 166, in _create_engine
    return ExecutionEngine(self, lambda _: self.stop())
  File "C:\Users\Wen\miniconda3\envs\FavoritesCrawler310\lib\site-packages\scrapy\core\engine.py", line 101, in __init__
    self.downloader: Downloader = downloader_cls(crawler)
  File "C:\Users\Wen\miniconda3\envs\FavoritesCrawler310\lib\site-packages\scrapy\core\downloader\__init__.py", line 109, in __init__
    DownloaderMiddlewareManager.from_crawler(crawler)
  File "C:\Users\Wen\miniconda3\envs\FavoritesCrawler310\lib\site-packages\scrapy\middleware.py", line 77, in from_crawler
    return cls._from_settings(crawler.settings, crawler)
  File "C:\Users\Wen\miniconda3\envs\FavoritesCrawler310\lib\site-packages\scrapy\middleware.py", line 88, in _from_settings
    mw = build_from_crawler(mwcls, crawler)
  File "C:\Users\Wen\miniconda3\envs\FavoritesCrawler310\lib\site-packages\scrapy\utils\misc.py", line 201, in build_from_crawler
    instance = objcls(*args, **kwargs)
  File "C:\Users\Wen\OneDrive\Projects\FavoritesCrawler\src\favorites_crawler\middlewares.py", line 11, in __init__
    self.access_token = refresh_pixiv()
  File "C:\Users\Wen\OneDrive\Projects\FavoritesCrawler\src\favorites_crawler\utils\auth.py", line 27, in refresh_pixiv
    config = load_config()
builtins.TypeError: load_config() missing 1 required positional argument: 'home'
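
The failure is a call-signature mismatch: `refresh_pixiv()` in `utils/auth.py` calls `load_config()` without the `home` argument the function now requires, so the downloader middleware cannot be built. Below is a minimal sketch of the kind of change that resolves it; the default path, config filename, and return shapes are assumptions for illustration, not the actual diff in this PR.

```python
# Hypothetical sketch only; the real favorites_crawler signatures and paths may differ.
from pathlib import Path

import yaml

# Assumed default config location, used purely for illustration.
DEFAULT_HOME = Path.home() / '.favorites_crawler'


def load_config(home: Path) -> dict:
    """Load the config stored under the given home directory (sketch)."""
    config_file = Path(home) / 'config.yml'
    if not config_file.exists():
        return {}
    with open(config_file, encoding='utf-8') as f:
        return yaml.safe_load(f) or {}


def refresh_pixiv(home: Path = DEFAULT_HOME) -> str:
    """Refresh the pixiv credentials (sketch of the call-site fix)."""
    # The crash above came from calling load_config() with no argument;
    # passing the required 'home' explicitly removes the TypeError.
    config = load_config(home)
    refresh_token = config.get('pixiv', {}).get('REFRESH_TOKEN', '')
    # In the real project this token would be exchanged for a new access token
    # via pixiv's OAuth endpoint; that part is omitted here.
    return refresh_token
```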


codecov bot commented Dec 4, 2024

Codecov Report

Attention: Patch coverage is 91.89189% with 3 lines in your changes missing coverage. Please review.

Project coverage is 81.83%. Comparing base (f68bcbe) to head (82006d1).
Report is 3 commits behind head on main.

| Files with missing lines | Patch % | Lines |
|---|---|---|
| src/favorites_crawler/commands/crawl.py | 25.00% | 3 Missing ⚠️ |
@@            Coverage Diff             @@
##             main      #66      +/-   ##
==========================================
+ Coverage   80.78%   81.83%   +1.05%     
==========================================
  Files          37       37              
  Lines        1353     1382      +29     
==========================================
+ Hits         1093     1131      +38     
+ Misses        260      251       -9     
| Files with missing lines | Coverage Δ |
|---|---|
| src/favorites_crawler/utils/auth.py | 85.71% <100.00%> (+28.21%) ⬆️ |
| src/favorites_crawler/utils/config.py | 100.00% <100.00%> (ø) |
| tests/test_utils/test_auth.py | 100.00% <100.00%> (ø) |
| src/favorites_crawler/commands/crawl.py | 67.34% <25.00%> (-2.22%) ⬇️ |

@RyouMon merged commit d5fe54c into main on Dec 4, 2024
10 checks passed
@RyouMon deleted the fix-crawl-pixiv branch on December 4, 2024 06:30