
573877 Sports Data Feed


Closed
Posted over 11 years ago


Paid on completion
Project Overview: I currently have a desktop application (gameday payoff) that requires a data feed of lines and scores from 6 major sports. The feed from the server to the clients (the gameday payoff app) already exists and is contained within a SQL db that collects the data from outside sources and then packages it up to be sent to the clients. I need you to accomplish 2 things: a better outside source for my SQL db, and an external JSON feed of this data for another project I have or for other customers.

General Things to Know: There are 6 sports for which I need data collected: pro football (NFL), college football (NCAAF), pro basketball (NBA), college basketball (NCAAB), pro hockey (NHL), and pro baseball (MLB). Unfortunately, only MLB is being played right now, but I do have page-source captures of the other sports. I need the lines, scores, and game-time data collected for each game. The data will be collected from feeds and by scraping sites. Upon collection, rules will be in place to validate the data; approved data will be sent to a "staging area". From the staging area, the data will be sent to my SQL db so it can be processed and sent on to the clients. You do not have to worry about the process of sending the data on to my clients; that is already in place. HOWEVER, you must use the naming conventions currently in my SQL db for the data you collect. I am not going to have separate naming conventions for the staging area that feeds my current process. For teams, you will also reference an extensive table of teams and their aliases, since some sites use slightly different names for the same team. Also, from the staging area, the approved data will be available as a JSON feed.
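The alias-table lookup described above could be sketched roughly as follows. This is a minimal sketch under assumptions: the table contents, team names, and the `normalize_team` helper are hypothetical, since the real alias table lives in the poster's SQL db and must follow its existing naming conventions.

```python
# Minimal sketch of alias-based team normalization. In the real project the
# alias data lives in a SQL table; here it is a plain dict for illustration.

# canonical name -> aliases seen on the various sites/feeds (hypothetical)
TEAM_ALIASES = {
    "NY Yankees": {"New York Yankees", "N.Y. Yankees", "NYY"},
    "LA Dodgers": {"Los Angeles Dodgers", "L.A. Dodgers", "LAD"},
}

# invert once into a flat lookup: any known alias -> canonical name
_LOOKUP = {
    alias.lower(): canonical
    for canonical, aliases in TEAM_ALIASES.items()
    for alias in aliases | {canonical}
}

def normalize_team(raw_name: str):
    """Return the canonical team name, or None so the caller can report
    the unknown name through the missing-teams URL mentioned below."""
    return _LOOKUP.get(raw_name.strip().lower())
```

A scraper would call `normalize_team` on every team string it pulls; a `None` result is exactly the "team not in our db" case the posting asks to surface.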
Games and Lines Part of Project: There will be 3 sites/feeds used to gather initial data. The first one, wsex, is already being scraped and collected; all you need to do with that one is get the data we are already collecting into the staging area. The other 2 are data feeds; all you need to do with them is filter/collect the specific data that I need. The feeds/sites are as follows:
1. [login to view URL] (also, the other 5 sports)
2. [login to view URL] (juice needs to be converted from decimal to American)
3. [login to view URL]
Data to be collected:
1. Date and time
2. Both teams
3. Rotation numbers (wsex does not have these)
4. Home team spread
5. Away team spread
6. Home team spread juice line
7. Away team spread juice line
8. Home team total
9. Away team total
10. Home team total juice line
11. Away team total juice line
12. Home money line
13. Away money line
Rules for Staging Area:
1. Always start with wsex, then go to bookmaker, and then betonline.
2. For every game on each of the 3 sites/feeds, the teams, date, and time must be confirmed by 2 of the 3 sites/feeds for the game to become approved and make it into the final phase of the staging area.
3. Once a game has been approved, we need to fill the lines, in the order wsex, then bookmaker, then betonline. My strong preference is to use wsex, but if a line is missing from them, we move on. However, we are constantly scraping and updating within this order of line availability.
4. A notification must be sent to me if 1 day has passed without data being received for any of the collected fields on any of the 3 sites (for example, wsex has gone a day without collecting any money lines). This might happen because a site has changed, so I must know about it.
Scores Part of Project: Once a game is official as described above, we need to start monitoring the status of the game. If it has not started yet, it's open. If it is currently going on, it's in progress.
If it is finished, it's closed. To track this, and other vital information, we must obtain scores. IMPORTANT! Don't be fooled into thinking you are scraping a site in real time when, in fact, the displayed score is rendered by JavaScript and the page source does not contain the score yet. There is a delay before the script-rendered score makes it into the source to be scraped. You must either stay away from these sites before you waste time writing tons of code, or write code that grabs the JavaScript-rendered data before it ever makes it into the page source. There will be 3 sites/feeds used to get scores. ESPN is already being scraped and collected; all you need to do with that one is route the currently collected data where it needs to go. The other 2 still need to be scraped and collected. This is the order of checking:
1. [login to view URL] (also, the other 5 sports. I have some saved pages; ask me)
2. [login to view URL] (also, the other 5 sports)
3. [login to view URL] (also, the other 5 sports. I have some saved pages; ask me)
Data to be collected (request access to a document I have for more information):
1. Status (Open, In Progress, Closed)
2. Interval (depending on the sport: Quarter, Period, Halftime, or Inning. IMPORTANT: extra innings in baseball are an issue, because CBS does not show all innings at the same time; they shift innings to make space for extra innings.)
3. Time remaining for each interval (except MLB)
4. Score for each interval
5. Total score
Rules for Scores:
1. Once the sites are scraped for scores, here's how they need to be used. For games that have been approved into the staging area, we must reference the starting time for each game. Using the starting time, we must monitor and make sure that the data is showing up from our first-choice source, perhaps allowing 5 minutes after a game is supposed to start. If the data is not there, move to the 2nd option, and so on. Continue to monitor games as they go through their statuses (open, in progress, closed).
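The preference-ordered fallback in rule 1 could be sketched like this. It is a sketch under assumptions: the source names, the `pick_source` helper, and the shape of `last_seen` are hypothetical; only the 5-minute staleness window and the "always prefer the first-choice source" behavior come from the posting.

```python
import time

# Preference order for score sources; the two placeholder names stand in
# for the two [login to view URL] sites, which are not given here.
SOURCE_ORDER = ["espn", "source2", "source3"]
STALE_AFTER = 5 * 60  # fall back after 5 minutes without data

def pick_source(last_seen, now=None):
    """Pick the highest-preference source that has produced data for a game
    within the last 5 minutes. last_seen maps source name -> unix timestamp
    of its most recent data for that game. Because the scan always runs in
    preference order, a 1st-choice source that comes back automatically
    wins again on the next check; different games can legitimately end up
    on different sources at the same time."""
    if now is None:
        now = time.time()
    for source in SOURCE_ORDER:
        ts = last_seen.get(source)
        if ts is not None and now - ts <= STALE_AFTER:
            return source
    return None  # no source is live for this game; keep polling all of them
```

Running `pick_source` once per game per polling cycle reproduces the posting's behavior of drifting down the list when a source goes quiet and drifting back up when it recovers.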
For example, if a game is in progress and no data has come from a particular site for 5 minutes, move to the next site. Keep checking back, though, because we want to maintain our order of preference in case our 1st-choice source comes back up. Using this method, there could be times when 2 different games are receiving data from 2 different sources.
2. A notification must be sent if 1 day has passed without data being received for any of the collected fields on any of the 3 sites (for example, ESPN has gone a day without collecting any intervals). This might happen because a site has changed, so I must know about it.
GENERAL STUFF:
1. I must have a way of knowing when a team is not in our db. Obviously, if it is not, it won't make it through our system. We currently have a link for this ([login to view URL]). Pay attention to the information that is gathered there; we need all of it because it points us to where the problem is. When a bad team name shows up here, I track down what team it is and make an alias for it. Ideally, you would just add your sites/feeds to this URL.
2. I need you to create a URL for me that shows everything being scraped, and then a URL of the data that has passed the rules. This would be the same data that exits the staging area, feeds my gameday payoff process, and supplies the required external JSON feed. An example of such a URL is here: [login to view URL]. As a matter of fact, you are more than welcome to take the work from that and reuse it for your requirement.
3. You must break out the sports. There is a table in the db for this.
4. We must keep a history of the pages actually being scraped, so we can track down issues that might arise. Settings such as how often to scrape, and which sports, would go in the config file. Eventually, we might even write a script to delete folders older than a week or so, because the data piles up pretty quickly.
Currently, you can find these folders on the server at C:\feed_fetcher.
Milestones and Compensation: This should take no longer than 40 hours to complete. I will pay $900 for the entire job. Upon delivery of the following, I will pay $450:
1. A URL that shows everything being scraped
2. A URL after the rules are applied
3. Proof that you are using my naming conventions
4. A URL for missing teams
The remaining $450 will be paid when the following is delivered:
1. Your work from the staging area is fed into the current gameday payoff process. If you have done the work correctly and adopted the right naming conventions, this will be a very simple, minimal-time task. I will know it is complete when my clients are getting the correct data from this project.
2. Proof of, and instructions for, using the JSON feed from the data in the staging area.
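For the bookmaker feed, whose juice arrives as decimal odds, the conversion to American odds mentioned above is a standard formula. A minimal sketch, assuming simple integer rounding; the rounding convention may need to match whatever the existing SQL db stores:

```python
def decimal_to_american(decimal_odds: float) -> int:
    """Convert decimal (European) odds to American odds.
    Decimal odds of 2.00 or higher are underdog prices (positive American);
    below 2.00 they are favorite prices (negative American)."""
    if decimal_odds <= 1.0:
        raise ValueError("decimal odds must be greater than 1.0")
    if decimal_odds >= 2.0:
        return round((decimal_odds - 1) * 100)
    return round(-100 / (decimal_odds - 1))
```

For example, a decimal price of 1.91 converts to roughly -110, the typical juice on a spread bet, and 2.50 converts to +150.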
Project ID: 2319851

About the project

7 proposals
Remote project
Active 12 years ago

7 freelancers are bidding on average $1,593 USD for this job
Hello, we have gone through your project, 573877 Sports Data Feed, and would like to convey that we have done similar kinds of projects before. We can address any concerns you might have about moving forward with the project. Regards, Krish
$1,000 USD in 20 days
5.0 (32 reviews)
8.0
LET'S DO THIS!!!
$900 USD in 14 days
4.9 (209 reviews)
7.5
I think the scope of this project is much larger than you anticipated, the conforming bids notwithstanding. I have wide-ranging experience and do top-quality work. Let me make your project happen for you.
$2,500 USD in 45 days
5.0 (1 review)
5.5
Hi, I have done the same kind of data-feed work for a gambling website, so I have good experience with this. Could you please explain more about: "I need a better outside source for my SQL db and I need this source to also be able to send an external JSON feed of this data for another project I have or other customers."
$2,500 USD in 30 days
0.0 (0 reviews)
0.0
Hi, I have 5+ years of experience in .NET technologies. I have worked on ASP.NET, C#, AJAX, JavaScript, jQuery, WCF, and MS SQL Server, across .NET Framework 1.1, 2.0, 3.5, and 4.0. I can write well-optimized SQL queries and database objects. I have worked on SOAP-based as well as REST-based web services. I recently cleared the Microsoft certification for WCF, and I am a Microsoft Certified Professional (MCP) and Microsoft Certified Technology Specialist (MCTS). I can work 20-25 hours per week. I am looking for good, challenging work. I am very punctual and dedicated, and I can ensure excellent quality of work and timely delivery. Thanks, Puru
$2,100 USD in 40 days
0.0 (0 reviews)
0.0
I can collect your requirements and get this done.
$1,700 USD in 30 days
0.0 (0 reviews)
0.0
I know the betting system very well. I used to work on wagering, and I know everything about lines, scores, etc. I also need the job; try me, you won't be disappointed.
$450 USD in 7 days
0.0 (0 reviews)
0.0

About the client

United States
0.0
0
Payment method verified
Member since Dec 24, 2010

Client verification
