I need a web scraper written for the following url:
[login to view URL]
Login is required. Login credentials for Fox Lumber available loads:
username: steve@[login to view URL]
All information needed is available on the main page. The number of rows will vary.
The output should be a pipe (|) delimited file with the following column mappings:
origin_city --> data located in the "Loc City" column
origin_state --> data located in the "Loc State" column
ship_date --> data located in the "Ready" column, converted to the YYYY-MM-DD format;
if the listed date is in the past, use the current day's date, also in the YYYY-MM-DD format
destination_city --> data located in the "Dest City" column
destination_state --> data located in the "Dest State" column
receive_date --> leave blank
trailer_type --> put the text "Flatbed" for all
load_size --> put the text "Full", if the "Load Comments" column contains the word "Partial" put the text "Partial"
weight --> leave blank
length --> Leave blank
width --> leave blank
height --> leave blank
trip_miles --> leave blank
pay_rate --> leave blank
contact_phone --> put the text "406-363-5140"
contact_name --> data located in the "Dispatcher" column, with the text "Dispatcher" added before the data in the column
tarp_required --> leave blank
comment --> data located in the "Load Comments" column; if the "Load Comments" column contains the word "van", add the text "van"
to the trailer_type data, and if the "Load Comments" column contains the word "maxi", add the text "maxi" to the trailer_type data
load_number --> leave blank
commodity --> leave blank
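The mapping rules above can be sketched as follows. This is a Python stand-in for illustration only (the actual deliverable is a Perl script); the source date format ("%m/%d/%Y") and the exact scraped column names are assumptions based on the column labels in this description.

```python
from datetime import date, datetime

def map_row(row, today=None):
    """Map one scraped table row (a dict keyed by the site's column
    headers) to the output fields described above. Sketch only; the
    source date format "%m/%d/%Y" is an assumption."""
    today = today or date.today()

    # ship_date: "Ready" column normalized to YYYY-MM-DD;
    # past dates are replaced with the current day's date.
    ready = datetime.strptime(row["Ready"], "%m/%d/%Y").date()
    ship_date = max(ready, today).isoformat()

    comments = row.get("Load Comments", "")
    lowered = comments.lower()

    # trailer_type: always "Flatbed", with "van"/"maxi" appended
    # when those words appear in Load Comments.
    trailer_type = "Flatbed"
    if "van" in lowered:
        trailer_type += " van"
    if "maxi" in lowered:
        trailer_type += " maxi"

    # load_size: "Partial" if Load Comments says so, else "Full".
    load_size = "Partial" if "partial" in lowered else "Full"

    return {
        "origin_city": row["Loc City"],
        "origin_state": row["Loc State"],
        "ship_date": ship_date,
        "destination_city": row["Dest City"],
        "destination_state": row["Dest State"],
        "receive_date": "",
        "trailer_type": trailer_type,
        "load_size": load_size,
        "weight": "", "length": "", "width": "", "height": "",
        "trip_miles": "", "pay_rate": "",
        "contact_phone": "406-363-5140",
        "contact_name": "Dispatcher " + row["Dispatcher"],
        "tarp_required": "",
        "comment": comments,
        "load_number": "", "commodity": "",
    }
```

The same logic translates directly to Perl (a hash per row, with the date handling done via a module such as Time::Piece).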
The first line of the output should contain all of the column headers.
Any field that contains no data should be left blank.
Please do not use words like "null" or "blank" in blank columns.
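Rendering the output file per the rules above might look like this, again as a Python sketch rather than the Perl deliverable: headers on the first line, one pipe-delimited line per row, and empty fields written as empty strings rather than "null" or "blank".

```python
# Output column headers in the required order.
HEADERS = ["origin_city", "origin_state", "ship_date",
           "destination_city", "destination_state", "receive_date",
           "trailer_type", "load_size", "weight", "length", "width",
           "height", "trip_miles", "pay_rate", "contact_phone",
           "contact_name", "tarp_required", "comment", "load_number",
           "commodity"]

def to_pipe_delimited(rows):
    """Render mapped rows (dicts keyed by HEADERS) as a
    pipe-delimited string, with the header line first.
    Missing fields become empty strings, never "null"."""
    lines = ["|".join(HEADERS)]
    for row in rows:
        lines.append("|".join(row.get(h, "") for h in HEADERS))
    return "\n".join(lines) + "\n"
```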
Below is a sample output of the first 5 columns using sample data:
The deliverable will be a Perl .pl file that must run on
Ubuntu Linux and must use Modern::Perl. The Perl .pl file
should be called '[login to view URL]' and the output file should be
called '[login to view URL]'
It will be scheduled in cron to run unattended every 15 minutes.
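The every-15-minutes schedule could be a crontab entry like the one below. The script name and paths are placeholders, since the actual filenames are hidden in this posting:

```crontab
# Run the scraper unattended every 15 minutes (placeholder paths).
*/15 * * * * /usr/bin/perl /home/user/scraper.pl
```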
Please specify what language/OS/modules you plan to use.
Also, please include the word "raccoon" in your bid so I know that
you read this description.
Thank you for the invitation! I read about 'raccoon' and I can provide you with a Perl scraper for the above website in less than a day. I'll use WWW::Mechanize, HTML::TreeBuilder::LibXML, etc.
7 freelancers are bidding an average of $155 for this job
Hello, this is a simple scraping job that can be easily implemented using Python and the Beautiful Soup library. Please contact me for the details and a fast solution.