I need a script that converts an extremely large text-file list of keywords into a
[login to view URL] file, using the correct Google XML sitemap format. The features I need are below:
1) The ability to select and read from multiple text files of keywords, so I don't have to
combine them into one myself.
2) Multiple threads, using the maximum count that, in your experience, a computer can handle,
because the text file is so large. I want this task finished as fast as possible.
3) After every 50,000 keywords, a new sitemap file should be created. Example:
[login to view URL]
[login to view URL]
[login to view URL], and so on until job is complete.
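A minimal sketch of requirements 1–3 might look like the following. This is not the final script: the output filenames (`sitemap1.xml`, `sitemap2.xml`, …) are assumptions, since the real names are redacted above, and the writer here is a stand-in that just dumps keywords one per line — the real script would emit the sitemap XML shown in the output example further down.

```python
import itertools
from concurrent.futures import ThreadPoolExecutor

SITEMAP_LIMIT = 50_000  # Google's per-sitemap URL limit, per requirement 3


def read_keywords(paths):
    """Yield keywords from several text files, one per line, skipping blanks."""
    for path in paths:
        with open(path, encoding="utf-8") as f:
            for line in f:
                keyword = line.strip()
                if keyword:
                    yield keyword


def chunked(iterable, size):
    """Yield successive lists of at most `size` items."""
    it = iter(iterable)
    while True:
        chunk = list(itertools.islice(it, size))
        if not chunk:
            return
        yield chunk


def write_chunk(index, keywords):
    # Stand-in writer: the real script would emit sitemap XML here.
    # The filename pattern is an assumption (the real names are redacted).
    with open(f"sitemap{index}.xml", "w", encoding="utf-8") as f:
        f.write("\n".join(keywords))


def build_sitemaps(input_paths, workers=4):
    """Read all input files, split into 50,000-keyword chunks,
    and hand each chunk to a worker thread for writing."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        chunks = chunked(read_keywords(input_paths), SITEMAP_LIMIT)
        for i, chunk in enumerate(chunks, start=1):
            pool.submit(write_chunk, i, chunk)
```

Since writing the files is I/O-bound, a thread pool is a reasonable fit here; the worker count would be tuned on the target machine rather than hard-coded.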
In the script, I would like to be able to set the options below:
1) Last modification date and time
2) I should be able to type the URL I want to use. Example: [login to view URL]
3) Input file(s)
4) Output
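The four options above could be exposed as a simple command-line interface; this is only a sketch, and all flag names here are suggestions rather than part of the request:

```python
import argparse


def parse_args(argv=None):
    """Parse the four options the request lists: lastmod, URL prefix,
    input file(s), and output location. Flag names are assumptions."""
    p = argparse.ArgumentParser(
        description="Build Google XML sitemaps from keyword lists."
    )
    p.add_argument("inputs", nargs="+", help="one or more keyword text files")
    p.add_argument("--base-url", required=True,
                   help="URL prefix typed by the user, placed before each keyword")
    p.add_argument("--output-dir", default=".",
                   help="directory where the sitemap files are written")
    p.add_argument("--lastmod", default=None,
                   help="last-modification timestamp (defaults to now)")
    return p.parse_args(argv)
```

Run as, for example, `script.py list1.txt list2.txt --base-url <your URL>`.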
Here is an example of the input keyword text file:
keyword1
Key word 2
key WORD3
keyWORD 4
Here is an example of how the finished output [login to view URL] should look:
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="[login to view URL]">
<url>
<loc>[login to view URL]</loc>
<lastmod>2011-01-27T23:55:42+01:00</lastmod>
<changefreq>daily</changefreq>
<priority>0.5</priority>
</url>
<url>
<loc>[login to view URL] word 2</loc>
<lastmod>2011-01-27T23:55:42+01:00</lastmod>
<changefreq>daily</changefreq>
<priority>0.5</priority>
</url>
<url>
<loc>[login to view URL] WORD3</loc>
<lastmod>2011-01-27T23:55:42+01:00</lastmod>
<changefreq>daily</changefreq>
<priority>0.5</priority>
</url>
<url>
<loc>[login to view URL] 4</loc>
<lastmod>2011-01-27T23:55:42+01:00</lastmod>
<changefreq>daily</changefreq>
<priority>0.5</priority>
</url>
</urlset>
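One sitemap file in the format shown above could be written like this. Note the assumptions: the `base_url` value is typed in by the user (the real prefix is redacted above), and the xmlns is assumed to be the standard sitemap protocol namespace, since the one in the example is also redacted. Keywords containing spaces, like "Key word 2", are percent-encoded so that `<loc>` holds a valid URL.

```python
from datetime import datetime, timezone
from urllib.parse import quote
from xml.sax.saxutils import escape


def write_sitemap(path, base_url, keywords, lastmod=None):
    """Write one sitemap file matching the output example above."""
    if lastmod is None:
        # ISO 8601 with timezone offset, e.g. 2011-01-27T23:55:42+01:00
        lastmod = (datetime.now(timezone.utc)
                   .astimezone()
                   .isoformat(timespec="seconds"))
    with open(path, "w", encoding="utf-8") as f:
        f.write('<?xml version="1.0" encoding="UTF-8"?>\n')
        # Standard sitemap namespace; the xmlns in the example is redacted,
        # so this is an assumption.
        f.write('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n')
        for kw in keywords:
            # Percent-encode the keyword (spaces -> %20), then XML-escape
            # the whole URL before placing it in <loc>.
            loc = escape(base_url + quote(kw))
            f.write("<url>\n")
            f.write(f"<loc>{loc}</loc>\n")
            f.write(f"<lastmod>{lastmod}</lastmod>\n")
            f.write("<changefreq>daily</changefreq>\n")
            f.write("<priority>0.5</priority>\n")
            f.write("</url>\n")
        f.write("</urlset>\n")
```

The `changefreq` and `priority` values are fixed at `daily` and `0.5` to match the example; they could be made options if needed.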
------
Send me a private message for more details.