    2,000 nutch hadoop jobs found, prices in EUR

    Role/Project Description: Full Stack Java Developer (VolPay platform engineers). Experience: 5+ years. Mandatory skills: Core Java, advanced Java (Spring, Spring Boot, Hibernate/JPA; must have) • Strong experience in Core Java and J2EE • Strong experience in...Experience in web security standards relating to APIs (OAuth, SSL, CORS, JWT, etc.) is a plus • Experience in object-caching technologies (Redis/Hazelcast or any other) is a plus • Strong understanding of SOA methodologies and service-oriented architectures • Full life-cycle product development experience is a must • Excellent problem-solving and logical skills • Exposure to Big Data technologies such as Hadoop and Apache Kafka is mandatory • Excellent interpersonal communication, stro...

    €10 / hr (Avg Bid)
    11 bids

    Hadoop MapReduce program implementation of matrix multiplication and subtraction in Python
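One plausible shape for a task like this is a Hadoop Streaming job with the mapper and reducer in a single Python script. The sketch below covers only the element-wise subtraction half; the `A,row,col,value` input format, the file name, and the invocation shown in the docstring are my assumptions, not details from the posting.

```python
#!/usr/bin/env python3
"""Hadoop Streaming sketch: element-wise matrix subtraction A - B.

Assumed (hypothetical) input format, one matrix element per line:
    A,i,j,value   or   B,i,j,value
Run the same file as mapper and reducer, e.g.:
    hadoop jar hadoop-streaming.jar -files matrix_sub.py \
        -mapper "matrix_sub.py map" -reducer "matrix_sub.py reduce" \
        -input matrices -output diff
"""
import sys


def mapper():
    # Key each element by its (row, col) position so the A and B values
    # for the same cell meet in one reducer key group.
    for line in sys.stdin:
        name, i, j, value = line.strip().split(",")
        print(f"{i},{j}\t{name},{value}")


def reducer():
    # Streaming sorts by key, so all values for one (row, col) arrive together.
    def emit(key, a, b):
        if key is not None:
            print(f"{key},{a - b}")

    current, a_val, b_val = None, 0.0, 0.0
    for line in sys.stdin:
        key, payload = line.rstrip("\n").split("\t")
        name, value = payload.split(",")
        if key != current:
            emit(current, a_val, b_val)
            current, a_val, b_val = key, 0.0, 0.0
        if name == "A":
            a_val = float(value)
        else:
            b_val = float(value)
    emit(current, a_val, b_val)


if __name__ == "__main__":
    mapper() if sys.argv[1:2] == ["map"] else reducer()
```

The multiplication half would follow the same pattern, except that each element of A and B is keyed by every output cell it contributes to and the reducer sums the products.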

    €95 (Avg Bid)
    19 bids

    Experience with Big Data and the cloud platform services: Apache Hadoop, Apache Hive. Hive Tez migration experience and optimization for Hive code. Basic knowledge of ETL and data pipelines using Python, shell scripts, Ctrl-M, Apache Airflow; building and populating data warehouses, and querying with BI tools. Basic knowledge of RDBMS fundamentals: design and creation of databases, schemas, tables; DB administration, security, and working with MySQL and IBM Db2. Basic knowledge of SQL: query language, database functions, stored procedures, working with multiple tables, joins, and transactions. Should be able to work at least 6 hrs

    €1041 (Avg Bid)
    35 bids

    Require a Big Data Engineer with a minimum of 4 years of experience in Big Data and ML. Hands-on experience is a must in Big Data architecture, ETL, and ML. Must have at least one complete project-cycle experience, i.e. from initiation to delivery. Must have experience working with a data lake of at least 100 TB. Exposure to NoSQL technologies and Hadoop-based analytics is preferred. Hands-on experience is required in Hadoop, Pig, Hive, Storm, Kafka, Spark, and other technologies. This is a permanent work-from-home opportunity with regular Indian business timings. Must be very fluent in English speaking and comprehension, as the clients are American. Only immediate joiners with a maximum 15-day notice period need apply. Applicants must provide evidence of previous work done in GitHub, BitBucket along with the app...

    €8 / hr (Avg Bid)
    1 bid

    Write a MapReduce program in Python to implement BFS. A shell script is also needed, according to the detailed instructions in the uploaded file.
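Parallel BFS on MapReduce is normally run as repeated passes over the node list until the frontier is empty. The rough one-iteration sketch below targets Hadoop Streaming; the tab-separated record layout, the node-state names, and the outer driver loop (not shown) are my assumptions, since the real requirements live in the poster's uploaded file.

```python
#!/usr/bin/env python3
"""Sketch of one parallel-BFS iteration for Hadoop Streaming.

Assumed record format (tab-separated):
    node_id \t comma-separated-neighbours \t distance \t state
where state is UNVISITED, FRONTIER or DONE and distance is an integer or INF.
A driver script (not shown) would rerun the job until no FRONTIER nodes remain.
"""
import sys

ORDER = {"UNVISITED": 0, "FRONTIER": 1, "DONE": 2}


def mapper():
    for line in sys.stdin:
        node, neighbours, dist, state = line.rstrip("\n").split("\t")
        if state == "FRONTIER":
            # Expand the frontier: every neighbour becomes a candidate at dist + 1.
            for n in filter(None, neighbours.split(",")):
                print(f"{n}\t\t{int(dist) + 1}\tFRONTIER")
            state = "DONE"
        # Always re-emit the node itself so its adjacency list survives the pass.
        print(f"{node}\t{neighbours}\t{dist}\t{state}")


def reducer():
    def emit(node, best):
        if node is not None:
            print(f"{node}\t{best[0]}\t{best[1]}\t{best[2]}")

    def as_num(d):
        return float("inf") if d == "INF" else int(d)

    current, best = None, None
    for line in sys.stdin:
        node, neighbours, dist, state = line.rstrip("\n").split("\t")
        if node != current:
            emit(current, best)
            current, best = node, ["", "INF", "UNVISITED"]
        if neighbours:
            best[0] = neighbours                  # keep the adjacency list
        if as_num(dist) < as_num(best[1]):
            best[1] = dist                        # keep the shortest distance seen
        if ORDER[state] > ORDER[best[2]]:
            best[2] = state                       # keep the most advanced state
    emit(current, best)


if __name__ == "__main__":
    mapper() if sys.argv[1:2] == ["map"] else reducer()
```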

    €128 (Avg Bid)
    24 bids
    Hadoop /NN (Ended)

    Use Hadoop to process big data, then use that data in a Jupyter notebook to create a neural-network machine learning model
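A minimal sketch of the notebook half, assuming the Hadoop job's output has already been exported from HDFS as a CSV (for example with `hdfs dfs -getmerge`); the file name, the `label` column, and the choice of scikit-learn's MLPClassifier as the neural network are all my assumptions.

```python
# Notebook side: train a small neural network on data Hadoop has already
# processed and exported from HDFS, e.g. via
#   hdfs dfs -getmerge /output/features features.csv
# The file name and the "label" column are hypothetical.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.metrics import accuracy_score

df = pd.read_csv("features.csv")
X = df.drop(columns=["label"])
y = df["label"]

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = MLPClassifier(hidden_layer_sizes=(64, 32), max_iter=300, random_state=42)
model.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, model.predict(X_test)))
```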

    €26 / hr (Avg Bid)
    6 bids

    Anyone with knowledge of the following technologies can bid: programming languages Scala (must) and Python; hands-on experience with Spark; hands-on experience with the Hadoop ecosystem, Hive, Sqoop, SQL queries, Unix; cloud experience on Cloudera or AWS; Oozie workflows; experience creating CI/CD pipelines; unit/JUnit testing, integration or end-to-end testing; Kafka

    €266 (Avg Bid)
    8 bids

    Require a Big Data Engineer with a minimum of 4 years of experience in Big Data and ML. Hands-on experience is a must in Big Data architecture, ETL, and ML. Must have at least one complete project-cycle experience, i.e. from initiation to delivery. Must have experience working with a data lake of at least 100 TB. Exposure to NoSQL technologies and Hadoop-based analytics is preferred. Hands-on experience is required in Hadoop, Pig, Hive, Storm, Kafka, Spark, and other technologies. This is a permanent work-from-home role with regular Indian business timings. Must be very fluent in English speaking and comprehension, as the clients are American. Only immediate joiners with a maximum 15-day notice period need apply. Application without a detailed resume along with evidence of previous work in GitHub, BitBucket etc will b...

    €6 - €8 / hr
    Urgent, Sealed
    2 bids

    Responsibilities: ADF/Azure Data Engineer Job Description • You will produce...procedures and batch processes; • Develop, implement, and support interfaces that connect our websites, back-end systems, and various third-party cloud solutions. Requirements: • Experience with private and public cloud architectures, their pros/cons, and migration considerations; • Cloud migration methodologies and processes, including tools like Azure Data Factory, Event Hub, etc.; • Experience in Apache NiFi, Hadoop (HDFS), Data Lake migration; • Support in creating ADF pipelines, data flows, and datasets similar to NiFi; • Processing and verification of the sample raw data and enriched data set; • Microsoft Azure certifications are a plus; • Excellent problem-solving, anal...

    €1011 (Avg Bid)
    3 bids

    Responsibilities: ADF/Azure Data Engineer Job Description • You will produce...procedures and batch processes; • Develop, implement, and support interfaces that connect our websites, back-end systems, and various third-party cloud solutions. Requirements: • Experience with private and public cloud architectures, their pros/cons, and migration considerations; • Cloud migration methodologies and processes, including tools like Azure Data Factory, Event Hub, etc.; • Experience in Apache NiFi, Hadoop (HDFS), Data Lake migration; • Support in creating ADF pipelines, data flows, and datasets similar to NiFi; • Processing and verification of the sample raw data and enriched data set; • Microsoft Azure certifications are a plus; • Excellent problem-solving, anal...

    €895 (Avg Bid)
    2 bids

    ...students. RESPONSIBILITIES • Creating course content. • Setting classes and the timetable. • Coming up with assessments and providing clear instructions for class activities. • Coordinating assignments and grades. ABOUT YOU • You must be experienced in Data Science, Machine Learning, and Artificial Intelligence and proficient in tools such as Python, SPSS, Power BI, R, Tableau, STATA, EViews, Hadoop, Spark, etc. • You must be proficient in the English language. • You must have a flair for teaching. • Instructors applying to teach on the platform must have the following training equipment: i. broadband internet at 26 Mbps; ii. HD webcam; iii. digital writing pad; iv. external microphone; v. external headset/earpiece. Finally, you’re passionate ...

    €9 / hr (Avg Bid)
    12 bids

    ...left up to you how you pick the necessary features and build the training that creates matching courses for job profiles. These are the suggested steps you should follow: Step 1: Set up a Hadoop cluster where the data sets are stored on the set of Hadoop data nodes. Step 2: Implement a content-based recommendation system using MapReduce, i.e. given a job description you should be able to suggest a set of applicable courses. Step 3: Execute the training step of your MapReduce program using the data set stored in the cluster. You can use a subset of the data depending on the system capacity of your Hadoop cluster. You have to use an appropriate subset of features in the data set for effective training. Step 4: Test your recommendation system using a set of request...
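For Step 2, one bare-bones scoring approach is a word-overlap MapReduce job over the course records. In the sketch below, the `course_id \t description` record layout, shipping the query job description to mappers as a local file, and the top-10 cutoff are illustrative assumptions only.

```python
#!/usr/bin/env python3
"""Sketch of the course-scoring MapReduce step (Step 2) for Hadoop Streaming.

Assumptions (mine, not the poster's): course records arrive as
    course_id \t course description text
and the query job description has been shipped to every mapper as a local
file 'job_query.txt' (e.g. via the -files option). Courses are scored by
word overlap with the query; a single reducer keeps the top 10.
"""
import heapq
import re
import sys

WORD = re.compile(r"[a-z]+")


def tokens(text):
    return set(WORD.findall(text.lower()))


def mapper():
    query = tokens(open("job_query.txt").read())
    for line in sys.stdin:
        course_id, description = line.rstrip("\n").split("\t", 1)
        overlap = len(query & tokens(description))
        if overlap:
            # Constant key so one reducer sees every score.
            print(f"all\t{overlap}\t{course_id}")


def reducer():
    scored = []
    for line in sys.stdin:
        _, overlap, course_id = line.rstrip("\n").split("\t")
        scored.append((int(overlap), course_id))
    for overlap, course_id in heapq.nlargest(10, scored):
        print(f"{course_id}\t{overlap}")


if __name__ == "__main__":
    mapper() if sys.argv[1:2] == ["map"] else reducer()
```

Raw word overlap is a deliberately crude score; swapping in TF-IDF weights would be the obvious next step for the training phase.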

    €19 (Avg Bid)
    3 bids
    Project Writing (Ended)

    I'm looking to extend and rephrase an existing technical paper. The paper should be at least 9 pages from abstract to references (using a referencing tool); supporting figures, graphics, etc. are expected. Plagiarism must be below 10%. The candidate should be familiar with IT terminology, big data, and Hadoop (or the relevant field), and the writing must be grammatically error-free.

    €83 (Avg Bid)
    49 bids

    Hi, please see the files attached. Basic Scala: 2 programs and 1 small project (Hadoop, Spark, AWS, mill config)

    €13 (Avg Bid)
    1 bid

    I am looking for someone who has experience with Hadoop, Google Cloud & BigQuery, HDFS, Sqoop, Hive, Python, and Linux. It will be a long-term project with at least 4-5 hours of support required per day. Please only reach out to me if you have good experience with the languages and systems listed above.

    €517 (Avg Bid)
    24 bids

    Design and develop a Distributed Recommendation System on Hadoop You are given 2 CSV data sets: (a) A course dataset containing details of courses offered (b) A job description dataset containing a list of job descriptions (Note: Each field of a job description record is demarcated by " ") You have to design and implement a distributed recommendation system using the data sets, which will recommend the best courses for up-skilling based on a given job description. You can use the data set to train the system and pick some job descriptions not in the training set to test. It is left up to you how you pick necessary features and build the training that creates matching courses for job profiles.
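Before distributing anything, the matching logic itself can be prototyped on a single machine. The sketch below uses TF-IDF vectors and cosine similarity; the CSV file names and the `title`/`description` column names are hypothetical, not taken from the posting.

```python
# Local prototype of the course-vs-job matching logic (TF-IDF + cosine
# similarity), useful for choosing features before moving it onto Hadoop.
# File names and column names ("title", "description") are assumptions.
import pandas as pd
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

courses = pd.read_csv("courses.csv")            # hypothetical course dataset
jobs = pd.read_csv("job_descriptions.csv")      # hypothetical job dataset

vectorizer = TfidfVectorizer(stop_words="english", max_features=20000)
course_vecs = vectorizer.fit_transform(courses["description"].fillna(""))


def recommend(job_text, top_n=5):
    """Return the titles of the top-N courses for one job description."""
    job_vec = vectorizer.transform([job_text])
    scores = cosine_similarity(job_vec, course_vecs).ravel()
    best = scores.argsort()[::-1][:top_n]
    return courses.iloc[best]["title"].tolist()


print(recommend(jobs["description"].iloc[0]))
```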

    €92 (Avg Bid)
    5 bids

    Big Data Engineer with expertise in the Hadoop ecosystem, including GCP, AWS, Snowflake, and GCP BigQuery. Extensive experience deploying cloud-based applications using Amazon Web Services such as Amazon EC2, S3, RDS, IAM, Auto Scaling, CloudWatch, SNS, Athena, Glue, Kinesis, Lambda, EMR, Redshift, and DynamoDB. Worked on ETL migration services by developing and deploying AWS Lambda functions to generate a serverless data pipeline that can be written to the Glue Catalog and queried from Athena. Proven expertise in deploying major software solutions for various high-end clients, meeting business requirements such as big data processing, ingestion, analytics, and cloud migration from on-prem to AWS using AWS EMR, S3, and DynamoDB. Hands-on experience in GCP, BigQuery,...

    €23 / hr (Avg Bid)
    13 bids

    We are looking for Hadoop developers with Hive, Scala, and Spark

    €106 (Avg Bid)
    8 bids

    We are hiring Hadoop developers good at Hive, Spark, and Scala.

    €8 / hr (Avg Bid)
    Featured
    2 bids

    You are given 2 CSV data sets: (a) A course dataset containing details of courses offered (b) A job description dataset containing a list of job descriptions (Note: Each field of a job description record is demarcated by " ") You have to design and implement a distributed recommendation system using the data sets, which will recommend the best courses for up-skilling based on a given job description. You can use the data set to train the system and pick some job descriptions not in the training set to test. It is left up to you how you pick necessary features and build the training that creates matching courses for job profiles.

    €100 (Avg Bid)
    3 bids

    The write-up should include the main problem, which can be subdivided into 3 or 4 subproblems. If I'm satisfied, we will discuss the implementation further.

    €221 (Avg Bid)
    4 bids

    Problem Statement: Design a scalable pipeline using Spark to read customer reviews from an S3 bucket and store them in HDFS. Schedule the pipeline to run iteratively every hour. Create a folder in the S3 bucket where customer reviews in JSON format can be uploaded. The scheduled big data pipeline will be triggered manually or automatically to read data from the S3 bucket and dump it into HDFS. Use Spark machine learning to perform sentiment analysis on the customer reviews stored in HDFS. Data: you can use any customer review data from online sources such as UCI
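A minimal PySpark sketch of the two halves described here: copying the JSON reviews from S3 into HDFS, then fitting a simple sentiment model on whatever has accumulated. The bucket name, HDFS paths, and the `review`/`label` field names are assumptions (labelled data would have to come from the chosen review dataset).

```python
# PySpark sketch: S3 -> HDFS ingestion plus a simple sentiment pipeline.
# Bucket, paths and the "review"/"label" JSON fields are hypothetical.
from pyspark.sql import SparkSession
from pyspark.ml import Pipeline
from pyspark.ml.feature import Tokenizer, HashingTF, IDF
from pyspark.ml.classification import LogisticRegression

spark = SparkSession.builder.appName("review-pipeline").getOrCreate()

# 1. Read the JSON reviews dropped into the S3 folder and dump them into HDFS.
reviews = spark.read.json("s3a://my-bucket/incoming-reviews/")
reviews.write.mode("append").parquet("hdfs:///data/reviews/")

# 2. Train a simple sentiment classifier on the reviews accumulated in HDFS.
data = spark.read.parquet("hdfs:///data/reviews/")
pipeline = Pipeline(stages=[
    Tokenizer(inputCol="review", outputCol="words"),
    HashingTF(inputCol="words", outputCol="tf"),
    IDF(inputCol="tf", outputCol="features"),
    LogisticRegression(labelCol="label", featuresCol="features"),
])
model = pipeline.fit(data)
model.transform(data).select("review", "prediction").show(5, truncate=False)
```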

    €13 (Avg Bid)
    3 bids

    Hi Sri Varadan Designers, I noticed your profile and would like to offer you my project. We can discuss any details over chat. I have a task to do in MapReduce on Hadoop

    €15 (Avg Bid)
    1 bid

    ...your candidature suitable for the below opening with our esteemed client. Hiring for a Spark developer role with a US-based IT client. Required experience: 4+ years, with at least two years of Spark experience. Salary: 40 lacs. Required qualification: B.E. Location: Hyderabad | Remote till COVID ends. Timings: 6:30 PM - 2:00 AM IST. Job Description: 1) 4-5 years of hands-on experience with components of the Hadoop ecosystem such as HDFS, Hive, Spark, Sqoop, MapReduce, and YARN 2) Hands-on experience with data integration projects on AWS and database platforms (Redshift, Athena, Aurora) 3) Experience in AWS services like S3, IAM, EMR, EC2, AWS Glue 4) Experience in Azure services 5) Experience in Kafka, streaming processes, and real-time jobs 6) Experience working with clients and requirement gatherin...

    €11 / hr (Avg Bid)
    6 bids

    Hi Amr S., I noticed your profile and would like to offer you my project. We can discuss any details over chat. I have some tasks to do in Hadoop and MapReduce

    €31 (Avg Bid)
    1 bid

    I need an Observium specialist to build a custom dashboard for: 1. Port status monitoring for servers 2. Database health monitoring (MariaDB, MongoDB, Hadoop) 3. Server up/down status 4. Memory, hard disk, and resource utilisation

    €52 (Avg Bid)
    2 bids
    Data Engineer (Ended)

    ...backdrop of appropriate controls. ● Implement good development and testing standards to ensure the quality of deliverables ● Rapidly understand and translate clients’ business challenges and concerns into a solution-oriented discussion. Must have: ● At least 6+ years of total IT experience ● At least 4+ years of experience in design and development using the Hadoop technology stack and programming languages ● Hands-on experience in 2 or more areas: o Hadoop, HDFS, MR o Spark Streaming, Spark SQL, Spark ML o Kafka/Flume o Apache NiFi o Worked with Hortonworks Data Platform o Hive / Pig / Sqoop o NoSQL databases: HBase/Cassandra/Neo4j/MongoDB o Visualisation & reporting frameworks like D3.js, Zeppelin, Grafana, Kibana, Tableau, Pentaho o Scrapy for crawling websites o Good...

    €26 / hr (Avg Bid)
    6 bids
    Hadoop Assignment (Ended)

    ...left up to you how you pick the necessary features and build the training that creates matching courses for job profiles. These are the suggested steps you should follow: Step 1: Set up a Hadoop cluster where the data sets are stored on the set of Hadoop data nodes. Step 2: Implement a content-based recommendation system using MapReduce, i.e. given a job description you should be able to suggest a set of applicable courses. Step 3: Execute the training step of your MapReduce program using the data set stored in the cluster. You can use a subset of the data depending on the system capacity of your Hadoop cluster. You have to use an appropriate subset of features in the data set for effective training. Step 4: Test your recommendation system using a set of request...

    €122 (Avg Bid)
    5 bids

    ...same metrics to show which is the better method. OR ii) An improvement on the methodology used in (a) that will produce a better result. 2. Find a suitable paper on replication of data in the Hadoop MapReduce framework. a) Implement the methodology used in the paper. b) i) Write a program to split the identified intermediate results from (1 b(i)) appropriately into 64 MB/128 MB blocks and compare with 2(a) using the same metrics to show which is the better method. OR ii) An improvement on the methodology used in 2(a) that will produce a better result. 3. Find a suitable paper on allocation strategies of data/tasks to nodes in the Hadoop MapReduce framework. a) Implement the methodology used in the paper. b) i) Write a program to reallocate the splits from (2(b)(i)) above to nodes by considering the capabi...

    €153 (Avg Bid)
    4 bids

    Problem Statement: Design a scalable pipeline using Spark to read customer reviews from an S3 bucket and store them in HDFS. Schedule the pipeline to run iteratively every hour. Create a folder in the S3 bucket where customer reviews in JSON format can be uploaded. The scheduled big data pipeline will be triggered manually or automatically to read data from the S3 bucket and dump it into HDFS. Use Spark machine learning to perform sentiment analysis on the customer reviews stored in HDFS. Data: you can use any customer review data from online sources such as UCI

    €17 - €36
    0 bids
    scala hadoop (Ended)

    Problem Statement: Design a scalable pipeline using Spark to read customer reviews from an S3 bucket and store them in HDFS. Schedule the pipeline to run iteratively every hour. Create a folder in the S3 bucket where customer reviews in JSON format can be uploaded. The scheduled big data pipeline will be triggered manually or automatically to read data from the S3 bucket and dump it into HDFS. Use Spark machine learning to perform sentiment analysis on the customer reviews stored in HDFS. Data: you can use any customer review data from online sources such as UCI
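The hourly trigger asked for here could come from any scheduler; one common choice is an Airflow DAG that calls spark-submit every hour, sketched below. The DAG id, the script path, and the use of Airflow itself are my assumptions, since the posting only requires that the pipeline run each hour (cron or Oozie would work just as well).

```python
# Airflow DAG sketch: run the Spark review pipeline once an hour.
# The DAG id and the spark-submit script path are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="review_pipeline_hourly",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@hourly",
    catchup=False,
) as dag:
    run_pipeline = BashOperator(
        task_id="spark_submit_review_pipeline",
        bash_command="spark-submit /opt/jobs/review_pipeline.py",
    )
```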

    €7 - €17
    0 bids

    Daily 8 Ho...Security, system configuration, database security, etc. OS builds (Red Hat, Debian/Ubuntu). CA hierarchy, encryption, and key management systems. Vendor RFP process, providing a detailed set of technical capabilities and scoring vendor capabilities. Maintain VRA to automate deployment of the PaaS catalog. High-level and low-level designs. DBaaS (MongoDB, MySQL multi-master, PostgreSQL, Couchbase), Hadoop, service orchestration (Ansible, Terraform), observability (Zabbix, ELK, Grafana, Prometheus, FluentD), API GW (OpenResty, NGINX, KrakenD), SSL, DNS, HAProxy, Istio, Consul, Vault, CyberArk, Envoy, OAuth, JWT, Nexus Sonatype repository, messaging (Kafka, RabbitMQ, NATS), gRPC. VMware vRA blueprint development on top of VMware and K8s platforms. Advanced knowledge of PaaS...

    €732 (Avg Bid)
    15 bids

    Proficiency in SQL writing, SQL concepts, data modelling techniques, and data engineering concepts is a must. Hands-on experience in ETL processes and performance optimization techniques is a must. The candidate should have taken part in architecture design and discussion. Minimum of 2 years of experience working with batch-processing/real-time systems using various technologies like Databricks, HDFS, Redshift, Hadoop, Elastic MapReduce on AWS, Apache Spark, Hive/Impala, Pig, Kafka, Kinesis, Elasticsearch, and NoSQL databases. Minimum of 2 years of experience working in data warehouse or data lake projects in a role beyond just data consumption. Minimum of 2 years of extensive working knowledge of building scalable solutions in AWS. Equivalent level of experience in Azure or Google Cl...

    €1756 (Avg Bid)
    16 bids

    Data Engineers, 6+ yrs: At least 6+ years of total IT experience ● At least 4+ years of experience in design and development using the Hadoop technology stack and programming languages ● Hands-on experience in 2 or more areas: o Hadoop, HDFS, MR o Spark Streaming, Spark SQL, Spark ML o Kafka/Flume o Apache NiFi o Worked with Hortonworks Data Platform o Hive / Pig / Sqoop o NoSQL databases: HBase/Cassandra/Neo4j/MongoDB o Visualisation & reporting frameworks like D3.js, Zeppelin, Grafana, Kibana, Tableau, Pentaho o Scrapy for crawling websites o Good to have knowledge of Elasticsearch o Good to have an understanding of Google Analytics data streaming o Data security (Kerberos/OpenLDAP/Knox/Ranger) ● Should have a very good overview of the current landscape and ab...

    €1805 (Avg Bid)
    2 bids

    We are looking for a Hadoop expert for our organization.

    €7 / hr (Avg Bid)
    6 bids

    I need 3 resumes of candidates with 2 years, 3 years, and 5 years of experience as a Hadoop/Big Data/data engineer

    €14 (Avg Bid)
    38 bids

    ...looking for a tech co-founder, but if a contractor fits I'm open to it. If you are interested in being a co-founder let me know and we can discuss details. Need to be in the U.S. Example tech stack may include: Programming languages: HTML5/CSS3 Frameworks: Node.js, React Databases: MySQL Cloud platforms: Amazon EC2, Amazon S3 Analytics: Google Mobile App Analytics, Flurry Analytics, Hadoop, Hive, MixPanel, Localytics, Mode, Parquet, Pig, Presto, Spark CDN services: Amazon CloudFront Streaming protocols: RTMP, Adobe HTTP Dynamic Streaming, Apple HTTP Live Streaming, M2TS, MPEG-DASH, Microsoft Smooth Streaming, WebRTC, RTSP/RTP Media formats: MKV, MP4, AVCHD, AVI, DMW, MOV, FLV, WMV, SWF Codecs: H.264/AVC, FFmpeg, XviD Media containers: MP4, FLV Geolocation:...

    €47 / hr (Avg Bid)
    NDA
    40 bids

    ...plan for "non hyper cloud" deployments. OpenStack, ProxMox, Kubernetes. All are on the table but the most "appropriate" one must be selected considering the architecture and CI/CD capabilities. - Build and maintain "on prem" alternatives of the AWS structure. This will include hardware planing (server) but also deployment of several VMs (or containers at some point) with techs including php+nginx, hadoop with hbase (and phoenix), sql database (probably mysql) and CEPH object storage. - Be the technical champion that solves "problems" for the developers to take their code to live production. Some background - We have a part-time DevOps on board who has done a good job so far but he is unable to coop with the workload. For this role, it is e...

    €16 / hr (Avg Bid)
    17 bids

    Hi, I would like to get some help on the Hadoop stack: Python (design patterns), PySpark, and SQL. If anyone has knowledge of this stack, please let me know

    €122 / hr (Avg Bid)
    30 bids

    We are developing a bioinformatics platform. Aside from the web application written in PHP, the core technology stack is * php * nodejs * javascript * jquery and we are looking for one or two good full-stack developers to help our existing teams. * Experience with big data (Hadoop) * Experience with data upload operations (we need surgical precision on this topic as we are working with large datasets) is definitely a plus. This is a long-term position after a decent trial period.

    €15 / hr (Avg Bid)
    103 bids

    Implementation in Hadoop compiler

    €69 (Avg Bid)
    3 bids
    Apache Hadoop (Ended)

    Implementation in Hadoop compiler

    €138 (Avg Bid)
    4 bids

    PaaS l...Security, system configuration, database security, etc. OS builds (Red Hat, Debian/Ubuntu). CA hierarchy, encryption, and key management systems. Vendor RFP process, providing a detailed set of technical capabilities and scoring vendor capabilities. Maintain VRA to automate deployment of the PaaS catalog. High-level and low-level designs. DBaaS (MongoDB, MySQL multi-master, PostgreSQL, Couchbase), Hadoop, service orchestration (Ansible, Terraform), observability (Zabbix, ELK, Grafana, Prometheus, FluentD), API GW (OpenResty, NGINX, KrakenD), SSL, DNS, HAProxy, Istio, Consul, Vault, CyberArk, Envoy, OAuth, JWT, Nexus Sonatype repository, messaging (Kafka, RabbitMQ, NATS), gRPC. VMware vRA blueprint development on top of VMware and K8s platforms. Advanced knowledge of PaaS...

    €611 (Avg Bid)
    8 bids

    Need someone to help me with Hadoop and support me in completing an assignment in Hive, thanks

    €20 (Avg Bid)
    11 bids

    The purpose of this project is to develop a working prototype of a network monitoring and reporting platform that receives network health, status, and traffic data from several network infrastructure monitoring sources, and produces an aggregate of network status data for processing by a data analytics engine. This prototype will be known as NetWatch. The NetWatch solution will utilize data processing and analytics services via the Hadoop infrastructure, and the data reporting features of HBase or the MySQL/Datameer tool. The prototype will be used by the Network A&E team to determine its viability as a working engine for network status ...

    €7 - €17
    0 bids

    You are given a large collection of (English) text documents (as files). For each document, compute the top 20 keywords by relevance scores. For a keyword w and a document d the relevance score is given by T(w,d)/D(w,d) : • where T (w,d) = count(w,d)^0.5 where ^ denotes exponentiation and • count(w,d) is the number of occurrences of w in d and • D(w,d) is the fraction of documents in the collection in which w occurs (i.e. x/N if w occurs in x documents out of N, the total number of documents in the collection). Also compute the intersection of all the top-20 keywords.
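The scoring rule maps directly onto a two-pass computation: document frequencies first, then per-document scores. A single-machine Python rendering is below as a reference before scaling it out (e.g. with MapReduce); the `documents/*.txt` layout and the simple word tokenisation are my assumptions.

```python
# Reference implementation of the relevance score T(w,d)/D(w,d) described above.
# Directory layout and tokenisation are assumptions.
import math
import re
from collections import Counter
from pathlib import Path

WORD = re.compile(r"[A-Za-z]+")

docs = {p.name: WORD.findall(p.read_text(errors="ignore").lower())
        for p in Path("documents").glob("*.txt")}      # hypothetical folder
N = len(docs)

# D(w) = fraction of documents in which w occurs.
doc_freq = Counter()
for words in docs.values():
    doc_freq.update(set(words))

top20 = {}
for name, words in docs.items():
    counts = Counter(words)
    # relevance(w, d) = count(w, d)**0.5 / (x / N)
    scores = {w: math.sqrt(c) / (doc_freq[w] / N) for w, c in counts.items()}
    top20[name] = sorted(scores, key=scores.get, reverse=True)[:20]

# Intersection of all the per-document top-20 keyword sets.
common = set.intersection(*(set(v) for v in top20.values())) if top20 else set()
print(common)
```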

    €7 - €17
    0 bids

    I need a full-stack app developer for an app with the following tech stack: • Programming languages: JavaScript, Ruby, Java, Sass • Frameworks: Rails • Libraries: React • Databases: MySQL, Amazon RDS, Redis, MongoDB • Servers and cloud computing services: NGINX, Amazon S3, Amazon EC2, Amazon EBS, Amazon ElastiCache • Big data processing: Hadoop, Presto, Airpal, Druid • Workflow management: Airflow

    €366 (Avg Bid)
    31 bids

    I am pursuing a big data program from NIT and need support to complete a few assignments and a capstone project; the candidate must know Java, Scala, Hadoop, Hive, and Spark

    €115 (Avg Bid)
    10 bids

    - Back up the HBase database on internal infrastructure

    €15 / hr (Avg Bid)
    3 bids

    We are looking for a machine learning engineer who must have the following experience: 1. Python coding: 7+ years of experience 2. Machine learning: 5+ years of experience (scikit-learn, TensorFlow, Caffe, MXNet, Keras, XGBoost) 3. AI/deep learning: 5+ years of experience 4. Cloud computing: AWS, S3, EC2, EMR, SageMaker, ECS, Lambda, IAM 5. Distributed computing technology: Hadoop, Spark, HBase, Hive/Impala, or any similar technology. Should be an independent developer, NO CONSULTING COMPANY. There will be a series of technical interviews about Python coding, machine learning, AI, and cloud computing. The candidate must have excellent Python coding skills and be able to answer challenging Python questions during the interview

    €52 / hr (Avg Bid)
    13 bids