Parsing of 5GB dump of clean data (estimated 1 day of work)
$100-400 USD
Paid on delivery
1. I have a 5GB dump of crawled data that needs to be parsed.
2. It's contact data: name, workplace, phone number, fax, etc. 250k contacts in total.
3. The data is very clean, all from ONE source, but it is pretty complex.
4. Each contact has many attributes: 1 workplace, n skills, and n references to other contacts in the database (colleagues, co-workers in the same workplace, co-workers in the same department, etc.).
I need this data parsed. It is currently dumped into MongoDB and we have already written a shell script parsing much of the contact information, but not all. If you want you can continue with this script, or you can start over and create your own.
This should not take longer than a day of work if you know what you are doing.
This is a fair estimate from a developer who has crawled it and started parsing, but now doesn't have the time to complete the job.
There is no rush, so you can work on this over the next 3 weeks.
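To illustrate the task described above, a bidder might normalize each raw MongoDB document into a flat record with a function like the one below. This is only a sketch: the field names (`name`, `workplace`, `skills`, `references`, `contact_id`) are assumptions, since the actual schema of the dump isn't shown in the posting.

```python
def parse_contact(raw: dict) -> dict:
    """Normalize one raw crawled contact into a flat record.

    Field names are hypothetical; adapt them to the real dump's schema.
    """
    return {
        "name": raw.get("name", "").strip(),
        "workplace": raw.get("workplace", "").strip(),  # 1 workplace per contact
        "phone": raw.get("phone"),
        "fax": raw.get("fax"),
        # n skills per contact
        "skills": [s.strip() for s in raw.get("skills", [])],
        # n references to other contacts (colleagues, same department, ...)
        "related_ids": [r["contact_id"] for r in raw.get("references", [])],
    }

# Example of one raw document as it might look in the dump:
raw = {
    "name": " Jane Doe ",
    "workplace": "Acme Hospital",
    "phone": "+1 555 0100",
    "skills": ["radiology", "oncology "],
    "references": [{"contact_id": "abc123", "relation": "colleague"}],
}
print(parse_contact(raw))
```

In practice this function would be applied while iterating a cursor over the MongoDB collection (e.g. via PyMongo), writing the normalized records out to CSV or a second collection.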
Project ID: #9141316
Project info
38 freelancers have bid an average of $238 for this job
I am an expert in delivering customized scripts and look forward to discussing the project needs further.
Hi, please attach sample crawled data and list all the fields to parse out. I can write a Perl script to extract the crawled data into a MySQL database/CSV file.
I am interested and would love to work on your project. I read through the job details extremely carefully and I am absolutely sure that I can do the project very well. I am a 4+ years experienced Web Develo…
I'm a computer science professional with a PhD degree and I have extensive experience in databases, Perl, PHP, and several other programming languages. Please see the reviews on my profile. It would be my pleasure to do th…
I have experience in creating SQL queries, dynamic SQL queries, SSIS packages, stored procedures, and SSRS reports.
Hi, if possible, export your data to a CSV or text file; that will make it easier to parse the data correctly. I will download the file and parse the data using Perl scripts. Best regards, Ilirjan
Years of experience in telecom. As a QA specialist, one of my tasks was data verification; this included regular data-dump parsing, etc.
Hi, I have worked with many big files in low memory. I am also experienced with map-reduce in MongoDB and CouchDB, so this will be an easy task for me. I am ready to start work now. You need to pay only if you are 100% sati…
Hi, I can do this with a simple regex -> output to a text file -> shell script for DB import. Regards, Prithvi
Hello! I can certainly do that :) I've been working with MongoDB for 3 years and I have a lot of experience with stream-based data migration, i.e. migrating millions of rows of data from one data source…