This job posting has been archived and is no longer available.
Open positions can be found under Projects.

Hadoop Engineer

Posted by CompuCom

Desired skills: Engineer, SQL, Python, Java

Project description

HADOOP ENGINEER - FOSTER CITY, CA - 6-MONTH CONTRACT

Did you know that CompuCom's employee benefits start on the first day of employment? Join CompuCom and enjoy our generous DayOne Benefits(SM)!

MUST-HAVE SKILLS: Hands-on experience using MapReduce. Hands-on experience with projects in the Apache Hadoop ecosystem such as Flume, Pig, Hive, and HBase. Fluency in at least one scripting language (Shell/Perl/Python/Java/etc.). *Must sit in Foster City.

We are looking for a talented Hadoop Engineer with experience working with very large data sets and knowledge of building programs that leverage Hadoop and MPP database platforms. The Engineer will have significant knowledge of Big Data technologies and tools, along with the ability to share ideas in a collaborative team. Responsibilities include loading data from several disparate structured and unstructured data sets, documentation, performance testing, and debugging applications.

The Hadoop Developer will also be an expert in traditional database development (SQL, stored procedures, user-defined functions, and ETL development) and will understand fact/dimensional modeling and the ETL steps necessary to load data into data warehouse systems.

REQUIRED:

- BS in Computer Science or a related field
- Java experience
- Hands-on experience using MapReduce
- Hands-on experience with projects in the Apache Hadoop ecosystem such as Flume, Pig, Hive, HBase
- Fluency in at least one scripting language (Shell/Perl/Python/etc.)
- Expert SQL development skills preferred
- Deep understanding and experience with Hadoop internals: MapReduce (YARN), HDFS, Streaming, HCatalog, Oozie
- Hands-on experience in the following areas: Elasticsearch, Lucene search, jQuery
- Strong desire to work in the fast-paced, flexible environment of a startup
- Ability to create and manage big data pipelines, including Pig/MapReduce jobs
- Deep understanding and experience with Linux internals, virtual machines, and open source tools/platforms
- Experience building large-scale distributed applications and services
- Experience with agile development methodologies
- Knowledge of industry standards and trends
- Significant experience with data warehousing (fact and dimensional modeling) and RDBMS ETL development
- Local candidates required

TECHNOLOGY EXPERIENCE:

- Hadoop, 2+ years
- Linux, 2+ years
- Scripting Language, 2+ years
- SQL queries, 2+ years

Dallas-based CompuCom Systems, Inc. is a leading provider of end user enablement, service experience management, and cloud technology services to Fortune 1000 companies. CompuCom partners with enterprises to develop smarter ways they can work, grow, and produce value for their business. Founded in 1987, privately held CompuCom has approximately 11,500 associates and supports more than 4 million end users in North America.

Project details

  • Location:

    California, United States

  • Project start:

    ASAP

  • Project duration:

    Not specified

  • Contract type:

    Contract

  • Professional experience:

    Not specified

Required qualifications

CompuCom