Root Level Technology is looking for an experienced and talented Data Engineer for this project. You will help build ETL scripts that manipulate and prepare large volumes of data. This position will be partly client facing, and travel may be required.
Daily delivery of work inside the PM tool
Adhering to development best practices
Attending all sprint ceremonies in a consistent and timely manner
Meeting delivery deadlines on assigned work
5+ years managing big data and a strong understanding of how to normalise and deduplicate large datasets using different models.
Hands on experience with AWS
Deep understanding of address normalisation
Prior experience working with similar datasets (addresses/names/email) is required.
Experience with the following technologies:
Spark & Pyspark
EMR or Dataproc
Specifically, experience with large schemas and unioning datasets
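To illustrate the kind of work described above, here is a minimal sketch of address normalisation and deduplication. It is plain Python for brevity; in practice this logic would run as a PySpark job on EMR or Dataproc. The abbreviation table and normalisation rules are hypothetical examples, not a specification.

```python
import re

# Hypothetical abbreviation map -- real address normalisation uses far
# richer rule sets (or a dedicated library).
ABBREVIATIONS = {"st": "street", "ave": "avenue", "rd": "road", "apt": "apartment"}

def normalise_address(raw: str) -> str:
    """Lowercase, strip punctuation, and expand common abbreviations."""
    tokens = re.sub(r"[^\w\s]", " ", raw.lower()).split()
    return " ".join(ABBREVIATIONS.get(t, t) for t in tokens)

def deduplicate(addresses: list[str]) -> list[str]:
    """Keep the first record seen for each normalised address."""
    seen: set[str] = set()
    unique = []
    for addr in addresses:
        key = normalise_address(addr)
        if key not in seen:
            seen.add(key)
            unique.append(addr)
    return unique

records = ["123 Main St.", "123 main street", "45 Oak Ave"]
print(deduplicate(records))  # the two "Main Street" variants collapse to one
```

In a PySpark setting the same idea maps to a normalisation UDF followed by `dropDuplicates` on the normalised key, with `unionByName(..., allowMissingColumns=True)` handling source datasets whose schemas differ.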
The Ideal Candidate
Deals well with ambiguous or undefined problems; ability to think abstractly and guide others
Comfortable performing requirements analysis, interfacing with stakeholders of various levels and documenting solutions
Excellent interpersonal skills
Demonstrable knowledge of DevOps in an Enterprise setting
Energetic team player who works well across boundaries and readily adapts to change and enjoys rapid development
Confident in their skills and abilities, and willing to share their knowledge while learning from others.
14 freelancers are bidding on average $21/hour for this job
I am a professional Data Scientist from Scotland, currently working with financial clients. I am available to discuss my previous experience relevant to this project; let me know when you are available to chat.
Hi, I am Raju. I have been working on creating ETL modules using Azure Data Factory, extracting and modelling data in Databricks with the application of business logic.
I have 4 years of experience developing ETL jobs using big data technologies on clouds like AWS and Azure, so I think I can help you with every aspect.