Adaltas was commissioned to implement a Hadoop service offering shared across the EDF group. This mission continues to this date around the industrialization of Hadoop, MongoDB and ElasticSearch. We built active supervision based on Shinken to monitor the state of the cluster components and to trigger restart actions in the event of a service interruption. In parallel, we were asked to support the business projects hosted on the platform. We thus shaped the implementation of the “DataLake” led by EDF Commerce, along with several other use cases.
ElasticSearch, Shinken, MongoDB
60 nodes, 30 VMs
Adaltas set up and secured the SoLocal production cluster in accordance with best practices for the Cloudera distribution. To date, the mission focuses on maintaining the cluster in operational condition and on migrating existing projects to this new instance. After installing tools such as Docker, Mattermost, GitLab and Wekan, we also conduct in-service training on Hadoop components and DevOps methodologies.
Docker, Mattermost, GitLab and Wekan
Adaltas set up a big data training tailored to a business use case of the airline company. After a preliminary study, the training reproduced a simplified but realistic version of the client's supply chain using Spark Streaming, Kafka and HBase.
Spark Streaming, Kafka, HBase
Ministry of Finance
Adaltas, in partnership with GFI Informatics, designed a “DataLake” to collect and analyse sensitive information. In addition, our consultants worked with the customer’s data scientists to ensure the smooth running of algorithms executed with the Spark engine. As part of this mission, we operate and secure a Hadoop cluster based on the Cloudera distribution.
Adaltas conducted workshops with the client to discuss the architecture of a multi-tenant Hadoop cluster based on the Hortonworks distribution. The focus was on the different integration scenarios with the customer's infrastructure, as well as the implementation of a security perimeter using Knox to control external access.
Adaltas secured Silca's Hadoop cluster, based on the MapR distribution, with Kerberos. In addition, we implemented scalability tests to validate the performance of the client's OpenStack-based architecture and the Ceph file system.
Banque de France
Adaltas assists the Banque de France in the creation of a Big Data service offering. The mission began with upgrading and securing the existing HDP platforms, as well as developing, supporting and executing strategies to deploy new platforms. Tasks include the development of Puppet deployment modules, integration scripts between Ambari supervision and existing bank tools, operation of all deployed HDP platforms, Big Data project definition and support for users. Today, we are involved in the creation of a cross-entity DataLake project.