I have a few simple tools written in Python. The problem is that they are very slow, and I need a decent performance improvement for them.
Tool 1: [url removed, login to view]!M00BVCxS!n4UbvDvv-z4PMTE9iLcdrAWBH3F_tVgB5KGL_AmufcQ
Consists of a few modules:
- [url removed, login to view] - add multithreading - improve the algorithm for better performance if possible - maybe using an indexed database instead of a file as the data source would help?
- [url removed, login to view] - add multithreading if possible - make the algorithm as fast as possible - should work with files of around 50GB, but must not use more than 75% of the computer's RAM
- [url removed, login to view] - already multithreaded, but still very slow! This one needs a decent improvement, because the LSA* scripts are the most important to me.
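Since I can't share the module code here, a minimal sketch of the kind of pattern I have in mind for the 50GB requirement: read the file in fixed-size chunks and hand them to a small worker pool, so peak memory stays near `workers * chunk_size` no matter how large the file is. The function name, chunk size, and the newline-counting work are all placeholders, not code from the actual tools.

```python
from concurrent.futures import ThreadPoolExecutor


def count_lines_chunked(path, chunk_size=8 * 1024 * 1024, workers=4):
    """Count newlines in a large file without loading it into RAM.

    Chunks are read sequentially and processed by a thread pool; at
    most `workers` chunks are in flight at once, so peak memory is
    roughly workers * chunk_size. (Hypothetical sketch.)
    """
    totals = []
    with open(path, "rb") as f, ThreadPoolExecutor(max_workers=workers) as pool:
        futures = []
        while True:
            chunk = f.read(chunk_size)
            if not chunk:
                break
            futures.append(pool.submit(chunk.count, b"\n"))
            if len(futures) >= workers:
                # Bound the number of resident chunks before reading more.
                totals.append(futures.pop(0).result())
        totals.extend(fut.result() for fut in futures)
    return sum(totals)
```

The same chunked layout should also let the real per-chunk work be swapped in for the placeholder `chunk.count` call.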
Tool 2: [url removed, login to view]!8wEwAZQS!s8EAQK27Lh1IiA5XMbjGv1oISjbmSyOXKwIK5Phbatw removes duplicates from parallel sets of texts. It works now, but the problem is that I want to process 300GB of data and the tool tries to load everything into memory; that needs to be solved. Introducing multithreading is also needed. Other algorithmic improvements are not required, but welcome if possible.
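To illustrate the memory problem: the pairs can be deduplicated in a single streaming pass, keeping only a short digest per unique pair in memory instead of the texts themselves. This is a hypothetical sketch of that approach, not the tool's actual code; the function and file names are assumptions.

```python
import hashlib


def dedup_parallel(src_in, tgt_in, src_out, tgt_out):
    """Remove duplicate sentence pairs from a parallel corpus, streaming
    line by line. Only a 16-byte MD5 digest per unique pair is held in
    memory, so the corpus itself never has to fit in RAM.
    (Hypothetical sketch of one way to avoid loading 300GB at once.)
    """
    seen = set()
    kept = 0
    with open(src_in, encoding="utf-8") as fs, \
         open(tgt_in, encoding="utf-8") as ft, \
         open(src_out, "w", encoding="utf-8") as out_s, \
         open(tgt_out, "w", encoding="utf-8") as out_t:
        for s, t in zip(fs, ft):
            # Hash the pair together so only identical (src, tgt) pairs collide.
            digest = hashlib.md5((s + "\t" + t).encode("utf-8")).digest()
            if digest in seen:
                continue
            seen.add(digest)
            out_s.write(s)
            out_t.write(t)
            kept += 1
    return kept
```

At 16 bytes per unique pair the digest set stays small even for very large corpora, which is the property the current all-in-memory version is missing.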
Tool3: [url removed, login to view]!E0tkUKDY!4CxlRzz3omo2ux3CpUjtsqls0wWw_QLkPbEGgHdG4uI
It works well, but is too slow. I would need multithreading added (but the tool should not consume more than 75% of the PC's RAM). Other algorithmic improvements are, in my opinion, required if possible.
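For the "multithreading without blowing the RAM limit" requirement across these tools, one common skeleton is a bounded work queue: the producer blocks once a fixed number of items are pending, so adding workers never buffers unbounded data in memory. This is a generic sketch under that assumption; `produce_items` and `handle_item` stand in for each tool's real I/O and per-item work.

```python
import queue
import threading


def run_bounded(produce_items, handle_item, workers=4, max_pending=100):
    """Fan work out to `workers` threads through a bounded queue.

    The queue holds at most `max_pending` items, so the producer blocks
    instead of buffering unbounded data in RAM. (Hypothetical skeleton;
    plug the tool's real per-item work into `handle_item`.)
    """
    q = queue.Queue(maxsize=max_pending)
    results = []
    lock = threading.Lock()

    def worker():
        while True:
            item = q.get()
            if item is None:   # sentinel: shut this worker down
                return
            out = handle_item(item)
            with lock:
                results.append(out)

    threads = [threading.Thread(target=worker) for _ in range(workers)]
    for t in threads:
        t.start()
    for item in produce_items():
        q.put(item)            # blocks once max_pending items are queued
    for _ in threads:
        q.put(None)            # one sentinel per worker
    for t in threads:
        t.join()
    return results
```

For CPU-bound work, swapping the threads for `multiprocessing` workers with the same bounded-queue shape would sidestep the GIL while keeping the memory cap.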
14 freelancers are bidding on average $205 for this job
Hello, if you choose me, I will work on a solution. :) Please look at my profile for my work and experience. Looking forward to hearing back from you with the source code of all the tools. Thanks. Regards, Sandip