Senior Systems Admin
Posted by CompuCom
Skills sought: Support, Network, Java, Client
Project description
The successful candidate will possess a work style that promotes progress through collaborative teamwork in a dynamic environment. They must be enthusiastically committed to delivering world-class technology solutions and the highest levels of customer service; committed to continual process, product, and performance improvement; able to communicate with confidence, clarity, and honesty; and willing to take initiative and assume ownership of a wide variety of systems, processes, and deliverables.
This position requires a strong technology background, Hadoop and Vertica expertise, excellent communication skills, and experience managing enterprise-scale infrastructure and analytics applications. Successful candidates must be action-oriented, capable of working on concurrent complex projects, and able to communicate clearly and effectively with cross-functional teams and business audiences.
RESPONSIBILITIES
- Hadoop Application Administration, Job Management and Operational Support
- Hadoop Application Architecture expertise (structure, data, ecosystem)
- Work with the core architecture team to design, build and maintain systems and storage infrastructure that houses large-scale BI/Analytics/Reporting Applications
- Recognize and manage technical dependencies and/or limitations that impact analytic work
- Actively question and challenge customers to understand their requirements and reach the best solutions, both near term and long term
- Understand and adhere to design and documentation standards for infrastructure builds (Servers, network and storage/database)
- Perform technical proof of concepts
- Maintain uptime requirements with highly available clusters
- Improve automated monitoring and failure recovery
- Benchmark and tune Analytics clusters
- Manage backups for key data stores
- Support configuring, sizing, tuning and monitoring analytic clusters
- Implement security and regulatory compliance measures
- Streamline cluster scaling and configuration
- On-call support required
QUALIFICATIONS
- Bachelor's degree in Computer Science, Math, Engineering or a related field
- 5+ years of experience with Hadoop, Vertica and deployment of Java-based applications and services
- 5+ years of experience building and implementing large-scale distributed data processing solutions with solid understanding of server/storage technologies and ability to implement complex solutions
- Experience working with large-scale analytical environments, such as Hadoop and MapReduce, and with analytic databases such as Vertica, Teradata, Netezza, Greenplum, Aster Data, and ParAccel
- Integration/automation experience with Chef
- Experience executing software, platform, content, and configuration upgrades as needed
- Familiarity with virtualized server environments
- Solid understanding of Linux and Windows operating systems and the ability to install and configure operating system packages
- Understanding of network technology and communication principles
- Strong analytical aptitude and problem-solving skills to identify root causes and implement fixes
PREFERRED QUALIFICATIONS
- Knowledge of and experience with analytics solutions and processes, particularly web analytics, a big plus
- Experience with and knowledge of multiple data warehouse/data mart architectures
- Familiarity with large-scale multi-terabyte environments or experience with distributed systems a plus
- Vertica Application Administration and Operational Support a plus
- Vertica Application Architecture expertise a plus (structure, data, ecosystem)
THIS IS A CONTRACT-TO-HIRE OPPORTUNITY.
Project details
- Location: Glendale, United States
- Project start: asap
- Project duration: Not specified
- Contract type:
- Professional experience: Not specified
Required qualifications
- Category: IT Development, Other