This job posting is archived and no longer available.
Open positions can be found under Projekte.
Senior Java and Hadoop Developer
Posted by Darwin Recruitment
Required skills: Java, Linux, Client, Spring
Project description
Our client, a service provider to the Swiss IT market, is currently seeking a Senior Software Developer with experience in Java, streaming, RESTful APIs, and Hadoop.
YOUR ROLE:
As a member of the development team, you will join a project leveraging industry-leading technologies on Linux, including Java, cluster computing, RESTful and streaming APIs, message brokers, and time-series and graph databases, to build a high-performance, scalable distributed computing platform for large data sets.
YOUR PROFILE:
- BS/MS in Computer Science or equivalent
- At least 5 years' experience of software development in a Linux environment
- At least 5 years' coding experience in Java or other object-oriented programming languages
- Proven experience developing, deploying, and operating cluster-based applications, typically for Hadoop 1.x or 2.x (YARN)
- Sound experience working with and/or designing streaming and RESTful APIs (Storm, Jersey, or others)
- Experience with large data sets and/or streams, including data management (e.g., validation, transformation, ingestion)
- Knowledge of NoSQL data stores (Elasticsearch, Cassandra, MongoDB, or others)
- Proven experience with Spring, Hibernate, or JPA
- Experience with cloud computing platforms and infrastructure (AWS, OpenShift, Heroku, OpenStack), as both user and publisher
- Proficient knowledge of time-series and graph databases (Neo4j, Titan, or others) and message brokers (Kafka, ActiveMQ, or others)
- Proficient knowledge of software development methodologies, especially Agile and Scrum
- Fluent in English
- EU citizenship or valid Swiss work permit
Project details
Required qualifications
Category:
IT Development