Because infrastructure is critical to any Big Data project, we help you define and implement an appropriate infrastructure compatible with your existing and anticipated IT environment.

Our skills cover key design and architecture topics such as monitoring, diagnostics and reporting, deployment, configuration, and security. Our expertise extends to a multitude of technologies and distributions.

We have repeatedly secured Hortonworks, Cloudera and MapR distributions with Kerberos, and we have experience conducting workshops with multiple stakeholders in your organization to integrate Big Data platforms with technologies such as Active Directory, FreeIPA, MIT Kerberos, and OpenLDAP.


Reuse of existing systems

Audit of current systems, definition of a strategic vision for the Big Data service offering, elaboration of strategies for version upgrades, high-availability setup, cluster security hardening, … Preparation for the replacement of deprecated technologies and change management for the cross-functional teams involved.

Capacity planning

A capacity planning audit consists of mapping the system's resources and validating and optimizing its configurations. It also includes a study of the cluster's anticipated usage, monitoring of its performance under various stress tests, and identification of optimization opportunities.
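To give a feel for the kind of estimate such an audit produces, the sketch below derives the raw disk capacity a cluster would need from a few inputs. All figures are illustrative assumptions, not values from any specific engagement: a replication factor of 3 (the HDFS default), a 2:1 compression ratio, and 25% free headroom.

```python
def required_raw_storage_tb(daily_ingest_tb, retention_days,
                            replication_factor=3, compression_ratio=0.5,
                            headroom=0.25):
    """Estimate the raw disk capacity (in TB) a cluster needs.

    Illustrative model: logical data volume after compression,
    multiplied by the block replication factor, plus free headroom
    to absorb growth and temporary job output.
    """
    logical = daily_ingest_tb * retention_days * compression_ratio
    replicated = logical * replication_factor
    return replicated * (1 + headroom)

# Example: 1 TB ingested per day, retained for one year.
print(round(required_raw_storage_tb(1, 365), 1))
```

Such a back-of-the-envelope figure is only a starting point; the audit then refines it against the measured usage of the actual workloads.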


Strong authentication

Securing access to the various service processes with Kerberos, a strong network authentication protocol. Integration with the company's existing OpenLDAP and Active Directory solutions.


Definition of personal and application users, storage in one or more LDAP/AD directories, and management of user groups.

Flow encryption

Securing communications within and between nodes for SSL/TLS-enabled services.
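As a minimal sketch of what client-side verification of such an encrypted flow looks like, the snippet below builds a TLS context that checks the server certificate chain and hostname, and refuses protocol versions older than TLS 1.2. The host, port, and CA file are placeholders, not values from any particular cluster.

```python
import socket
import ssl


def make_tls_context(cafile=None):
    """Build a client-side TLS context.

    create_default_context() already enables certificate and hostname
    verification; we additionally pin the minimum protocol version.
    """
    context = ssl.create_default_context(cafile=cafile)
    context.minimum_version = ssl.TLSVersion.TLSv1_2  # refuse legacy protocols
    return context


def open_verified_tls(host, port, cafile=None):
    """Open a connection that fails unless the server presents a
    certificate chaining to a trusted CA and matching `host`."""
    sock = socket.create_connection((host, port), timeout=10)
    return make_tls_context(cafile).wrap_socket(sock, server_hostname=host)
```

The same context options (trusted CA bundle, minimum protocol version, mandatory hostname checks) are what the service configurations enforce on the server side.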

Security perimeter

Deployment and configuration of firewalls with complete or selective rule sets, HTTP proxies, authentication mechanisms alternative to Kerberos, and network isolation.


Supervision & monitoring

Deployment of Hadoop cluster management and provisioning software (Ambari and Cloudera Manager) as well as custom supervision solutions based on standard infrastructure monitoring software such as Nagios and Shinken. Development of curative actions automatically triggered by the supervision tools.

Backup & replication

Convergence of customer policies and Big Data system requirements: scheduled backup of configurations, application logs and metadata; planning of HDFS and HBase data replication.


Integration of existing tools with the Big Data platform: business applications, ETL, reporting tools, workstations, development environments (IDEs), …


Deployment of fault-tolerant systems, investigation and anticipation of critical components, and design of architecture diagrams showing dependencies and the consequences of failures.