Crawl Domains to Find Expired Domains -- 2
$10-15 USD
Paid on delivery
We are looking for a crawler that crawls every page of a website, looking for external links that point to expired domains.
The user should define a list of sites to crawl via a text file. The crawler should discover and crawl all pages of a site logically, rather than depending on a sitemap. Only unique external domains should be logged, to prevent duplicate domain-availability lookups.
The user should also be able to define a list of URLs to skip when checking availability, e.g. [login to view URL] etc.; these domains should be user-defined in a blacklist text file.
Results should be delivered as a CSV file listing the linking domain and the available domain.
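The requirements above can be sketched in plain Python. This is a minimal stdlib-only illustration, not a deliverable: the file names (`sites.txt`, `blacklist.txt`, `results.csv`) are assumptions, and a failed DNS lookup is used only as a rough proxy for "possibly expired" — a real availability check would query WHOIS or a registrar API.

```python
# Sketch of the requested crawler (assumptions: sites.txt lists one start URL
# per line, blacklist.txt lists one domain per line; DNS failure is only a
# rough proxy for availability -- a real check would use WHOIS/registrar APIs).
import csv
import socket
import urllib.parse
import urllib.request
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects href values from <a> tags on a page."""
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl_site(start_url, blacklist, max_pages=500):
    """Breadth-first crawl of one site; returns {external_domain: linking_page}."""
    start_host = urllib.parse.urlparse(start_url).netloc
    seen_pages, queue = set(), [start_url]
    external = {}  # unique external domains only -> no duplicate lookups
    while queue and len(seen_pages) < max_pages:
        url = queue.pop(0)
        if url in seen_pages:
            continue
        seen_pages.add(url)
        try:
            with urllib.request.urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            absolute = urllib.parse.urljoin(url, href)
            host = urllib.parse.urlparse(absolute).netloc.lower()
            if not host or host in blacklist:
                continue
            if host == start_host:
                queue.append(absolute)   # internal link: keep crawling
            elif host not in external:
                external[host] = url     # external domain: log once only
    return external

def domain_unresolvable(domain):
    """Rough availability proxy: True if DNS resolution fails."""
    try:
        socket.gethostbyname(domain)
        return False
    except OSError:
        return True

def run(sites_file="sites.txt", blacklist_file="blacklist.txt",
        out_file="results.csv"):
    with open(blacklist_file) as f:
        blacklist = {line.strip().lower() for line in f if line.strip()}
    with open(out_file, "w", newline="") as out:
        writer = csv.writer(out)
        writer.writerow(["linking_domain", "available_domain"])
        with open(sites_file) as f:
            for site in (line.strip() for line in f if line.strip()):
                for domain, page in crawl_site(site, blacklist).items():
                    if domain_unresolvable(domain):
                        writer.writerow(
                            [urllib.parse.urlparse(page).netloc, domain])
```

Because the crawl follows internal links from the start page rather than reading a sitemap, it satisfies the "not sitemap dependent" requirement; the `external` dict guarantees each external domain is looked up at most once per site.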
Project ref.: #19765484
Project info
6 freelancers have placed an average bid of $28 for this job
Hi, I have gone through your requirement to scrape lots of websites. I am an EXPERT in building scraping tools/scripts, so I can SURELY work on your project. I have 4 YEARS of EXPERIENCE in developing PHP-PYTHO…
Dear Prospective Hiring Manager, thank you for giving me a chance to bid on your project. I am a serious bidder here, I have already worked on a similar project before, and I can deliver as you have mentioned: "I can do th…
I have experience in crawling data using bs4, Scrapy, etc. with Python, and extracting data to XML, JSON, etc. Contact me!
Hi there, JUNA here. I understand that you need a crawler or a SPIDER for scraping expired domains, but my question is: will you provide the list of domains that you need to check for availability? Otherwise, this sho…