This job posting has been archived and is no longer available.
Open positions can be found under Projects.
Architect Big Data
Posted by Next Ventures Ltd
Required skills: SQL, JavaScript, Python, Java
Project description
7+ years of experience in IT platform implementation in a highly technical and analytical role.
5+ years' experience in Big Data platform implementation, including 3+ years of hands-on experience implementing and performance-tuning Hadoop/Spark deployments.
Demonstrated ability to think strategically about business, product, and technical challenges in an enterprise environment. Track record of thought leadership and innovation around Big Data.
Strong understanding of ETL processes and data flow architectures and tools
Designed and built a scalable big data infrastructure that has been in use for several years
Experience designing ETL processes and data flow architectures and tools
Experience designing architectures that must operate in a highly regulated industry (finance domain).
Experience designing architectures that can incorporate data from multiple data sources
Experience educating other team members on a technology stack
Highly technical and analytical, possessing 5 or more years of IT platform implementation experience.
Deep understanding of Apache Hadoop 2 and the Hadoop ecosystem. Experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, Zookeeper, HCatalog, Solr, Avro).
Familiarity with one or more SQL-on-Hadoop technologies (Hive, Pig, Impala, Spark SQL, Presto).
Experience developing software in one or more programming languages (Java, JavaScript, Python, etc.).
Team player
Project details
Required qualifications
-
Category:
IT Development, Web Development