This job offer is archived and no longer available.
Vacant job offers can be found under Projects.
Big Data Engineer - Analytics - Hadoop - Data Warehouse - Banking - Sw
Posted by Coopers Group GmbH
Desired skills: Engineer, Apache, Engineering, Python
Project description
BIG DATA ENGINEER WITH ANALYTICS required to work on a project within a major bank in Switzerland. Our client is looking for an experienced implementer of productive, business-rule- and metadata-driven Big Data solutions using Apache Hadoop and Spark.
RESPONSIBILITIES
- Interact with the architecture team to refine requirements
- Work within the project team to implement solutions on the existing Hadoop platform
- Work with Platform Engineers to ensure dependent components are provisioned
- Provide input in defining the design/architecture of the envisaged solution
- Implement business rules for streamlining data feed(s)
- Implement a rule-based framework to abstract complex technical implementations into reusable, generic components
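To illustrate the last responsibility, the following is a minimal, hypothetical sketch of what a rule-based validation component for a data feed might look like. All names (`Rule`, `apply_rules`, the example rules) are illustrative assumptions, not part of the client's actual framework:

```python
# Hypothetical sketch: declarative, reusable rules applied to a data feed.
# Rules are plain data, so new checks can be added without touching the engine.
from dataclasses import dataclass
from typing import Any, Callable, Dict, List, Tuple

@dataclass
class Rule:
    name: str
    check: Callable[[Dict[str, Any]], bool]  # True if the record passes

def apply_rules(records: List[Dict[str, Any]],
                rules: List[Rule]) -> Tuple[list, list]:
    """Split records into (valid, rejected); rejects carry failed rule names."""
    valid, rejected = [], []
    for rec in records:
        failed = [r.name for r in rules if not r.check(rec)]
        (rejected if failed else valid).append((rec, failed))
    return valid, rejected

# Illustrative rules and feed
rules = [
    Rule("has_id", lambda r: "id" in r),
    Rule("amount_positive", lambda r: r.get("amount", 0) > 0),
]
feed = [{"id": 1, "amount": 10.0}, {"amount": -5.0}]
valid, rejected = apply_rules(feed, rules)
```

The same pattern scales to a distributed setting by mapping `apply_rules` over partitions of the feed.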
REQUIRED
- Proven experience implementing solutions to process large amounts of data in a Hadoop ecosystem, utilising Apache Spark
- Experience implementing generic components for the ingestion, validation, and structuring of disparate data sources into a distributed data platform
- Experience implementing complex event processing patterns
- Strong programming and scripting skills; eg Java, Scala, Python, R
- Strong understanding of Hadoop technologies; eg MapReduce, HDFS, HBase, Hive, Sqoop, Flume, Kafka
- Software Engineering background, with experience using common SDLC tools and practices for agile, ideally including Continuous Delivery
- Experience in BRMS driven solutions
- Experience with master data management (MDM) including ontology curation and data cleansing
- Experience with end-user notebook tools; eg Jupyter, Zeppelin
- Ability to operate in a global team and coordinate activities with remote resources
- Capable of documenting solutions so they can be re-used
- Skills in exploratory statistics to relate to data analysts and statisticians
DESIRABLE
- Macroeconomic domain knowledge
- Experience in econometrics
- Experience with data visualisation tools, eg Tableau
START DATE: APRIL 2018
DURATION: 3 MONTHS, INITIALLY (OPTION FOR EXTENSION)
LOCATION: BASEL, SWITZERLAND
FOR FURTHER INFORMATION, PLEASE SEND US YOUR UPDATED CV WITH CONTACT DETAILS.
Project details
Required qualifications
Category:
IT Development, Engineering/Technology