Responsibilities:
- As an Engineer, you will be responsible for developing, deploying, maintaining/operating, testing, and evaluating big data solutions within the organization.
- Integrate solutions with the architecture used across the company.
Experience required:
- Mid/senior level, with at least 3 years of experience working with the Cloudera data platform
Technical skills:
- Knowledge of configuring and troubleshooting all components of the Hadoop ecosystem, such as Cloudera, Cloudera Manager, HDFS, Hive, Impala, Oozie, YARN, Sqoop, Zookeeper, Flume, Spark, Spark Standalone, Kafka (incl. Kafka Connect), Apache Kudu, Cassandra, and HBase
- Develop and maintain documentation relating to Hadoop Administration tasks (upgrades, patching, service installation and maintenance).
- Understand Hadoop's security mechanisms and implement Hadoop security (Apache Sentry, Kerberos, Active Directory, TLS/SSL).
- Understand the role of Certificate Authorities, and the setup and configuration of certificates in relation to Linux and TLS/SSL.
- Intermediate programming/scripting skills, ideally in Java or Python and ksh/bash
- Understanding of networking principles and ability to troubleshoot (DNS, TCP/IP, HTTP).
Nice to have / a plus:
- Knowledge of one or more of the following is highly appreciated: ElasticSearch, Kibana, Grafana, Git/SCM, Atlassian Suite (Confluence, Jira, Bitbucket), Jenkins/TeamCity, Docker, and Kubernetes
- Experience in scripting for automation requirements (e.g. scripting via Shell, Python, Groovy)
- Work on and continuously improve the DevOps pipeline and tooling to provide active …
Benefits:
- Attractive salary conditions
- Indefinite-term (permanent) employment contract
- Career plan (professional, academic and financial)
- Medical insurance
- Lunch tickets
- Professional and friendly working environment.