This job posting is archived and no longer available.
Open job postings can be found under Projekte.
Big Data Senior Developer/Big Data Senior Architect - Zurich, Switzerland
Posted by Infoplus Technologies UK Ltd
Required skills: Java
Project description
ROLE: BIG DATA SR DEVELOPER
MANDATORY SKILLS
- Good understanding of Spark Streaming, Kafka, Flume, HBase and Solr concepts
- Hands-on development experience with Spark Streaming, HBase and Kafka
- Must have used a range of Spark Streaming functions
- Good understanding of memory management within Spark Streaming jobs
- Must have hands-on experience writing Java API code to read from and persist data to HBase
- Strong experience in Core Java
- Experience in Spark Streaming and HBase performance tuning
- Experience working with the Cloudera Hadoop distribution
ADDITIONAL SKILLS
- Experience in Finance domain
- Exposure to WebSphere MQ
BIG DATA SR ARCHITECT - ZURICH
ROLE: BIG DATA SR ARCHITECT
MANDATORY SKILLS
- Good understanding of architectural concepts for Spark Streaming, Kafka, Flume, HBase and Solr
- Proficient in defining technical architectures for real-time data processing
- Proficient in designing, developing and deploying scalable big data analytics solutions based on Spark Streaming and HBase
- Strong technical knowledge and programming experience in real-time streaming with Spark Streaming, Flume and Kafka
- Prior experience working with Spark stateful transformations
- Hands-on experience in development (Spark/HBase/Kafka/Flume), capacity planning, deployment and troubleshooting
- Experience in Spark Streaming and HBase performance tuning
- Should have led full life-cycle development of a Hadoop/Spark implementation for high-volume data
- Experience working with the Cloudera Hadoop distribution
- Strong experience in Core Java
ADDITIONAL SKILLS
- Experience in Finance domain
- Exposure to WebSphere MQ
- Experience in installing and configuring Hadoop on AWS
- Experience in data quality, exception management, reconciliation and data migration
Project details
Required qualifications
-
Category:
IT Development