This job posting has been archived and is no longer available.
Open job postings can be found under Projects.

Hadoop Senior Data Engineer

Posted by Next Ventures Ltd

Required skills: SQL, Engineer, Oracle, Eclipse

Project description

My client requires a HADOOP SENIOR DATA ENGINEER to work on a product team using Agile Scrum methodology to design, develop, deploy and support solutions that leverage the Cargill big data platform.

The Sr. Data Engineer will work with Enterprise Architecture, D&BI Solution Architects, and Business Analysts to understand business requirements and to build big data and advanced analytics solutions to meet their needs and objectives. This role requires the ability to interpret and apply data ingestion/storage/usage patterns developed by the architecture team in order to build Hadoop solutions.

PRIMARY ACCOUNTABILITIES:

SOLUTION ANALYSIS & DESIGN:

Work with business stakeholders, process owners, and product team members to define product backlog items and design Cargill's big data and advanced analytics solutions.

Perform data modelling and prepare data in databases for reporting through various analytics tools (see the sketch at the end of this list).

Create or modify design documentation as defined by team development standards, processes, and tools.

Ensure the solution designed and built is supportable as part of a DevOps model.
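
To illustrate what preparing data in databases for reporting can look like on a Hadoop platform, here is a minimal Spark/Scala sketch that declares a partitioned Parquet reporting table; the database, table, and column names (curated_db.sales_fact and so on) are hypothetical and are not taken from the posting.

    import org.apache.spark.sql.SparkSession

    object ReportingModel {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("reporting-model")
          .enableHiveSupport() // lets Spark create Hive/Impala-visible tables
          .getOrCreate()

        // Hypothetical reporting table: partitioned Parquet so Impala/Hive
        // and BI tools (Power BI, Tableau, Business Objects) can query it.
        spark.sql("""
          CREATE TABLE IF NOT EXISTS curated_db.sales_fact (
            order_id    BIGINT,
            customer_id BIGINT,
            amount      DECIMAL(18,2),
            order_ts    TIMESTAMP
          )
          PARTITIONED BY (order_date DATE)
          STORED AS PARQUET
        """)

        spark.stop()
      }
    }

Partitioning by date is only one option; the actual model would follow the ingestion/storage/usage patterns defined by the architecture team, as noted above.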

DEVELOPMENT, TESTING, AND QUALITY:

Perform integration development to move data from production systems to databases/data warehouses using ETL tools (Sqoop).

Perform data transformation using Impala, Spark, and SQL (a sketch follows this list).

Perform unit testing and data validation using SQL queries.

Support testing by fixing defects and making necessary back-end design changes.

Ensure adherence to development and architecture standards and best practices.

Provide necessary technical support through all phases of testing and incident handling after deployment.
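
As a purely illustrative sketch of the transformation and validation steps above, the following Spark/Scala job aggregates a raw table previously ingested with a tool such as Sqoop, writes the result as Parquet for reporting, and runs a simple SQL validation query; raw_db.orders and curated_db.orders_daily are hypothetical names, not part of the posting.

    import org.apache.spark.sql.SparkSession
    import org.apache.spark.sql.functions._

    object OrdersDailyBuild {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("orders-daily-build")
          .enableHiveSupport()
          .getOrCreate()

        // Raw table previously landed in Hadoop (e.g. via Sqoop); hypothetical name.
        val raw = spark.table("raw_db.orders")

        // Transformation: a daily aggregate prepared for reporting.
        val daily = raw
          .filter(col("status") === "COMPLETE")
          .groupBy(to_date(col("order_ts")).as("order_date"))
          .agg(sum("amount").as("total_amount"), count(lit(1)).as("order_count"))

        // Persist as Parquet so Impala/Hive and BI tools can query the result.
        daily.write
          .mode("overwrite")
          .format("parquet")
          .saveAsTable("curated_db.orders_daily")

        // Simple SQL validation: the load should not produce an empty table.
        spark.sql("SELECT COUNT(*) AS row_count FROM curated_db.orders_daily").show()

        spark.stop()
      }
    }

In a DevOps setup, a job like this would typically be covered by unit tests and deployed through the team's pipeline, in line with the accountabilities above.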

REQUIRED TECHNICAL SKILLS:

Previous experience delivering technical solutions as part of an agile scrum DevOps team.

Experience building Big Data Solutions in a secure Hadoop environment using NoSQL technology.

4+ years of experience in database coding and developing tables/views or data warehouses in SAP HANA, Oracle, or SQL Server.

3+ years of data modelling for reporting using ETL/ingestion tools: Sqoop, Flume, SLT, StreamSets, Business Objects Data Services (BODS), Kudu, Kafka.

- Experience with scripting languages (SQL, Spark/Scala) to manipulate data.
- Experience with Spark, Hive, Parquet, Impala, Kafka.
- Experience with NoSQL data stores (especially Cassandra).
- Comfortable scripting in a *NIX environment (SSH and standard commands).

Experience working with front-end visualization tools like Power BI, Tableau, and Business Objects.

Version control, particularly GitHub.

Development tools: Eclipse, IntelliJ.

Agile, quick learner of new technologies.

BUSINESS FLUENCY IN ENGLISH.

Please contact Stuart Holman or email (see below).

Project details

  • Location:

    Amsterdam, Netherlands

  • Project start:

    ASAP

  • Project duration:

    6 months+

  • Contract type:

    Contract

  • Professional experience:

    Not specified

Required qualifications

Next Ventures Ltd