This job offer is archived and no longer available.
Open job offers can be found under Projects.

Big Data Engineer/Architect

Posted by Lawrence Harvey Enterprise

Required skills: Engineer, Client, Apache, Engineering

Project description

My well-recognised financial client is seeking an experienced Big Data Engineer/Architect to lead the implementation of solutions on a greenfield POC project!

You will be an experienced implementer of production-grade, business-rule- and metadata-driven Big Data solutions using Apache Hadoop and Spark. The successful candidate should have demonstrable experience of implementing similar solutions, be capable of working independently, and be comfortable working within an agile project team. The ability to relate complex data problems to business users, and vice versa, is crucial.

The Job:

Interact with the architecture team to refine requirements
Work within the project team to implement solutions on the existing Hadoop platform
Work with Platform Engineers to ensure dependent components are provisioned
Provide input in defining the design/architecture of the envisaged solution
Implement business rules for streamlining data feed(s)
Implement a rule-based framework to abstract complex technical implementations into reusable, generic components

We are looking for candidates with:

Proven experience implementing solutions to process large amounts of data in a Hadoop ecosystem, utilising Apache Spark
Experience implementing generic components for the ingestion, validation, and structuring of disparate data sources into a distributed data platform
Experience implementing complex event processing patterns
Strong programming and scripting skills; eg Java, Scala, Python or R
Strong understanding of Hadoop technologies; eg MapReduce, HDFS, HBase, Hive, Sqoop, Flume, Kafka
Software Engineering background, with experience using common SDLC tools and practices for agile, ideally including Continuous Delivery
Experience in BRMS-driven solutions
Experience with master data management (MDM) including ontology curation and data cleansing
Experience with end-user notebook tools; eg Jupyter, Zeppelin
Ability to operate in a global team and coordinate activities with remote resources
Capable of documenting solutions so they can be re-used
Skills in exploratory statistics to relate to data analysts and statisticians

Nice to have:

Economics or Macroeconomic domain knowledge
Experience in econometrics
Experience with data visualisation tools, eg Tableau

Lawrence Harvey is a preferred supplier for this client! Apply now with an up-to-date CV; interview slots are being arranged.

Lawrence Harvey is acting as an Employment Business in regard to this position.

Project details

  • Location:

    Basel, Schweiz

  • Project start:

    asap

  • Project duration:

    3 - 6 months

  • Contract type:

    Contract

  • Professional experience:

    Not specified
