This job posting is archived and no longer available.
Current openings can be found under Projects.

Hadoop Architect Developer

Posted by CompuCom

Required skills: Client, Engineering, Python, Linux

Project description

Our client, a leading global financial services company, is seeking a SENIOR HADOOP ARCHITECT/DEVELOPER to work on a security analytics project. A background in financial services, banking, or card processing is ideal but not required.

PRIMARY RESPONSIBILITIES:

- Understand the Security Analytics business requirements and work with technology teams to recommend the best approach to delivering solutions on the Hadoop Platform.
- Work with security, data platform team, engineering and technology to implement the Security Analytics platform.
- Ensure the Security Analytics platform conforms to corporate standards.
- Participate in code reviews.
- Define and support process for access to the Security Analytics Platform including balancing the needs of various business groups with security requirements, and Platform capacity.
- Work with Architecture and Development teams to understand usage patterns and work load requirements of new projects in order to ensure the Security Analytics Platform can meet demand.
- Develop and document administrator processes, then train other client users, as warranted, to take over those processes.
- Maintain all system configuration documentation by collecting, storing, and updating the documentation.

QUALIFICATIONS:

- MUST have 2 to 3 years of experience programming with Hadoop; candidates without Hadoop experience will not be considered for the role
- 7-10 years' experience working with Data Warehouses
- Experience programming large databases
- Must have good communication skills
- Minimum of 10 years of technical experience with a focus on open source and large data implementations in the Petabyte range
- Business acumen/solution expertise and technical expertise are both required for the role
- Experience developing solutions for Banking, Fraud, Risk or Marketing groups desired
- Solid understanding of all phases of development using multiple methodologies, e.g. Waterfall, Agile
- Proven Linux experience including:
- Basic Administration
- Files and Permissions
- Directory Navigation
- Job Scheduling
- Shell Scripts
- Hadoop Distributed File System (HDFS) experience including:
- Use and setup of blocks, NameNodes, DataNodes
- File Systems Interfaces, parallel copies, cluster balance and archiving
- Scaling Out including data flow, combiner functions, running distributed jobs
- Hadoop Streaming with Python
- Hadoop Pipes
- Sorting, joins and side data distributions
- MapReduce
- Strong knowledge of Hive including:
- Familiarity with Hive Query Language
- Plug-ins, Interfaces, User Defined Functions, SerDes
- Basic Operators and Functions, Web Interface and Hive Client
- General relational DB knowledge
- Experience with: Oozie, Sqoop, Datameer, Flume
- Desired but not required experience: Python, HBase, Pig, Cascading, Tableau, SAS, EMC Greenplum
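To illustrate the "Hadoop Streaming with Python" and MapReduce skills listed above, here is a minimal word-count sketch of the kind a candidate would be expected to write. Hadoop Streaming pipes each input split through a mapper script's stdin/stdout and feeds the sorted mapper output to a reducer script; the function and script names below are illustrative, not part of the posting.

```python
import sys


def map_line(line):
    """Mapper step: emit a (word, 1) pair for each word on one input line."""
    return [(word, 1) for word in line.strip().split()]


def reduce_pairs(pairs):
    """Reducer step: sum the counts for each word and return sorted totals."""
    totals = {}
    for word, count in pairs:
        totals[word] = totals.get(word, 0) + count
    return sorted(totals.items())


def run_mapper(stdin=sys.stdin, stdout=sys.stdout):
    """Entry point for the mapper script: Hadoop Streaming supplies the
    input split on stdin and collects tab-separated key/value lines."""
    for line in stdin:
        for word, count in map_line(line):
            stdout.write(f"{word}\t{count}\n")
```

In a real deployment these functions would live in separate mapper and reducer scripts passed to the Hadoop Streaming jar via its -mapper and -reducer options; the exact invocation depends on the cluster's Hadoop version.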

This contract position is expected to be 6 months in duration with the possibility of extension.

Project details

  • Contract type:

    Contract

  • Professional experience:

    Not specified

Required qualifications

CompuCom