
Big data administrator on TDP - senior developer
- Categories
- Big Data
- Tags
- DevOps
- Hadoop
- Databricks
- Kubernetes
- TDP
Job Description
Adaltas is a team passionate about open source and big data. We actively contribute to the development of TDP, the 100% open source data platform. As part of our effort to provide support and professional services on TDP, we are looking for a big data administrator to take on the following responsibilities:
- Deployment and operation of TDP clusters
- Participation in the community development of TDP:
- Building components from source
- Integrating with the CI/CD infrastructure to automate unit and integration tests
- Applying patches to the components
- Writing Ansible deployment roles
- Software vendor support for the TDP stack
- Professional services including audits, architecture, security, version upgrade management, …
Company presentation
Adaltas specializes in data processing and storage. We operate on-premises and in the cloud to empower our clients’ teams in architecture, operations, data engineering, data science and DevOps. We are active contributors to the TDP project, working closely with our customers. We invite you to consult Alliage, our support and consulting offer dedicated to TDP.
Expected Skills
- Apache ecosystem (Hadoop, HBase, Hive, Ranger, etc.)
- Kubernetes ecosystem and object storage
- Ansible, Jenkins, Maven, Terraform, Git, …
Experience (more than 2-3 years) in a similar role or mission is essential to respond to this opportunity. You also have significant experience in the software field, for example within a team operating a critical infrastructure.
Profile
A master's degree in IT and data is expected, along with some experience in the field. Each application will be carefully reviewed by our recruitment team.
Above all, we are looking for a profile that fits the spirit of Adaltas, whether through an appetite for open source software or a desire to bring their expertise and help grow the company.
Teamwork, sense of priorities, autonomy and knowledge sharing are qualities we look for in a candidate.
A satisfactory level of English, both spoken and written, is essential to carry out this mission.
Additional information
This is an opportunity to work on a key project in the big data ecosystem and to become an open source contributor. We offer permanent contracts. Compensation is based on your experience and skills.
Contact
For additional information and to submit your application, please contact David Worms:
- david@adaltas.com
- +33 6 76 88 72 13
- https://www.linkedin.com/in/david-worms/
If you are looking for new challenges, do not hesitate to apply to join Adaltas. If no job description matches your expectations but becoming a consultant at Adaltas is your career choice, submit an unsolicited application.
Open job opportunities
Data streaming engineer - mid level developer
Prototype, build, deploy and operate data ingestion pipelines on critical infrastructure, generate KPIs with real-time queries.
Data Engineer Databricks and Azure - mid level developer
Collaborate with other data engineers, business analysts and data scientists to solve challenging business problems on the Databricks and Azure platforms.
Big data architect with CDP - senior developer
Design and develop solutions including platform architecture, data ingestion, data lakehouse architecture and data science usages.
Big data administrator Cloudera CDP - mid level developer
Deploy and operate big data clusters based on the Cloudera CDP platform.