This job posting has been archived and is no longer available.
Open positions can be found under Projects.
Big Data Architect - Zurich - Switzerland
Posted by Empiric Solutions
Required skills: SQL, JavaScript, Python, Java
Project Description
Empiric Solutions is currently seeking an experienced Big Data Architect for a blue-chip client in the banking industry, based in Zurich, for an initial 6-12 month contract.
Skills and Experience:
- 7+ years of IT platform implementation experience in a highly technical and analytical role.
- 5+ years' experience implementing Big Data platforms, including 3+ years of hands-on experience implementing and performance-tuning Hadoop/Spark deployments.
- Demonstrated ability to think strategically about business, product, and technical challenges in an enterprise environment; a track record of thought leadership and innovation around Big Data.
- Strong understanding of, and experience designing, ETL processes, data flow architectures, and tools.
- Has designed and built a scalable big data infrastructure that has been in production use for several years.
- Experience designing architectures for a highly regulated industry (finance domain).
- Experience designing architectures that incorporate data from multiple data sources.
- Experience educating other team members on a technology stack.
- Deep understanding of Apache Hadoop 2 and the Hadoop ecosystem; experience with one or more relevant tools (Sqoop, Flume, Kafka, Oozie, Hue, ZooKeeper, HCatalog, Solr, Avro).
- Familiarity with one or more SQL-on-Hadoop technologies (Hive, Pig, Impala, Spark SQL, Presto).
- Experience developing software in one or more programming languages (Java, JavaScript, Python, etc.).
- A team player.
Please apply with your most recent CV to be considered for this role or call Naz for a confidential chat.
Project Details
Required Qualifications
-
Category:
IT Development, Web Development