danielnaumau/web-crawler

Web crawler

A simple API-based app that takes a list of URLs, crawls them concurrently, and returns the crawled data. The app is hosted at web-crawler.clevelheart.com

API details

Request

Endpoint: /api/crawl
HTTP method: POST
Request body:
 {
   "urls": ["https://google.com", "https://github.com", "https://failed_request.com"]
 }
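As a sketch, the request above could be sent with Python's standard library. The endpoint path and body shape come from the API details; the host default and the helper names (build_payload, crawl) are illustrative assumptions, not part of the app:

```python
import json
import urllib.request

API_ENDPOINT = "/api/crawl"  # endpoint documented above

def build_payload(urls):
    """Serialize a list of URLs into the JSON request body shown above."""
    return json.dumps({"urls": urls}).encode("utf-8")

def crawl(urls, host="http://localhost:8080"):
    """POST the URL list to /api/crawl and return the decoded JSON response.

    `host` is an assumption: point it at a local `sbt run` instance or at
    the hosted deployment.
    """
    req = urllib.request.Request(
        host + API_ENDPOINT,
        data=build_payload(urls),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

For example, crawl(["https://google.com", "https://github.com"]) would return the results/errors structure described in the Response section.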

Response

{
  "results": [
    {
      "url": "https://google.com",
      "data": "..."
    },
    {
      "url": "https://github.com",
      "data": "..."
    }
  ],
  "errors": [
    {
      "msg": "...",
      "url": "https://failed_request.com"
    }
  ]
}
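A minimal sketch of consuming this response shape, splitting successful results from errors; the function name and the sample values are hypothetical:

```python
import json

def split_response(body):
    """Partition a crawl response into {url: data} and {url: error message}."""
    parsed = json.loads(body)
    data_by_url = {r["url"]: r["data"] for r in parsed.get("results", [])}
    errors_by_url = {e["url"]: e["msg"] for e in parsed.get("errors", [])}
    return data_by_url, errors_by_url

# Sample response in the shape documented above (values are made up).
sample = """{
  "results": [{"url": "https://google.com", "data": "<html>...</html>"}],
  "errors": [{"msg": "connection refused", "url": "https://failed_request.com"}]
}"""
data, errors = split_response(sample)
# data   -> {"https://google.com": "<html>...</html>"}
# errors -> {"https://failed_request.com": "connection refused"}
```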

Development

Run sbt run to start the application locally. It will be available at localhost:8080
