This job posting is archived and no longer available.
Open positions can be found under Projects.
Spark Developer - Java/Scala/Big Data Analytics/Hadoop
Posted by RM IT Professional Resources AG
Required skills: Java, Engineer, Python, Client
Project description
SPARK DEVELOPER/ENGINEER - JAVA/SCALA/BIG DATA - with profound analytical skills wanted for our Basel-based client.
YOUR EXPERIENCE/SKILLS:
- Proven experience in implementing solutions to process large amounts of data in a HADOOP ECOSYSTEM utilising Apache Spark; therefore, in-depth experience of Hadoop technologies such as MAPREDUCE, HDFS AND HBASE is mandatory
- Experience IMPLEMENTING GENERIC COMPONENTS for the ingestion, validation, and structuring of disparate data sources into a Big Data platform
- Excellent PROGRAMMING AND SCRIPTING SKILLS IN JAVA, C/C++, SCALA, BASH, R AND PYTHON
- Experience with MASTER DATA MANAGEMENT (MDM) including ontology curation and data cleansing in addition to common SDLC tools and practices for AGILE, including Continuous Delivery
- Skills in exploratory statistics, enabling effective collaboration with data analysts and statisticians
- Languages: fluent English both written and spoken
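To illustrate the "generic components for the ingestion, validation, and structuring of disparate data sources" mentioned above, here is a minimal, purely illustrative sketch in plain Python (one of the listed languages). All record and field names are hypothetical; in the actual role this logic would typically live in a Spark job operating on DataFrames rather than on in-memory lists.

```python
from dataclasses import dataclass

@dataclass
class Record:
    """A hypothetical raw record from one of several disparate sources."""
    source: str
    payload: dict

def is_valid(record: Record, required_fields: list[str]) -> bool:
    """A record is valid if every required field is present and non-empty."""
    return all(record.payload.get(f) not in (None, "") for f in required_fields)

def partition(records: list[Record], required_fields: list[str]):
    """Split raw records into (valid, rejected) so malformed rows
    never reach the downstream Big Data platform."""
    valid, rejected = [], []
    for r in records:
        (valid if is_valid(r, required_fields) else rejected).append(r)
    return valid, rejected

raw = [
    Record("crm", {"id": "1", "name": "Acme"}),
    Record("erp", {"id": "2", "name": ""}),  # fails validation: empty name
]
valid, rejected = partition(raw, ["id", "name"])
```

In a Spark setting the same split would usually be expressed as two DataFrame filters over a shared validation predicate, keeping the validation rules in one reusable, source-agnostic component.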
YOUR TASKS:
- Interacting with the architecture team to refine requirements
- Cooperating with the project team to implement solutions on the existing Hadoop platform
- Assisting the Platform Engineer to ensure that dependent components are provisioned
START: ASAP
DURATION: 3-6 months+
LOCATION: Basel, Switzerland
REF.NR.: BH 11362
Does this sound like an interesting and challenging opportunity to you? Then take the next step by sending us your CV as a Word document and a contact telephone number.
DUE TO WORK PERMIT RESTRICTIONS WE CAN UNFORTUNATELY ONLY CONSIDER APPLICATIONS FROM EU OR SWISS CITIZENS AS WELL AS CURRENT WORK-PERMIT HOLDERS FOR SWITZERLAND.
GOING THE EXTRA MILE
NEW TO SWITZERLAND? In case of successful placement, we support you with:
- All administrative questions
- Finding an apartment
- Health and social insurance
- Work permit and much more
Project details
Required qualifications
-
Category:
IT development, Engineering/Technology