Founded in 2004 on two core convictions, that innovation is a decisive differentiator for companies and that mobilising the best talent is a condition of success, Adaltas defines itself as a big data pure player focused on Hadoop and NoSQL technologies.
Our big data expertise began in 2009 with our support of EDF and the Linky POC (smart electric meter). In 2012, we undertook the industrialisation of a service offering shared across the whole EDF group, providing Hadoop components such as HDFS, YARN, HBase, Hive and Oozie in a multi-tenant architecture. The number of supported components later grew to include the likes of Spark, Solr, Atlas, Storm, MongoDB and Elasticsearch.
Industrialising this offering allowed us to build expertise in data analysis and processing, data governance, and operational deployment and management. Among the many topics we have addressed are the setup of data collection and preparation pipelines, data visualisation, the application of Machine Learning algorithms, securing access and encrypting streams, implementing rights-management policies, component monitoring, integration with a company's IS best practices, training client teams, and qualifying new components.
In recent years, this expertise has served customers such as Voyages-SNCF, the French Ministry of Finance, Air France, BRGM, Silca and SoLocal. Adaltas secures the main Hadoop distributions: Hortonworks, Cloudera and MapR. Our consultants also provide specialist backup for more generalist companies like GFI, SII and Capgemini. We also created the big data course at the ECE engineering school in 2013 and have taught it ever since.
Let's be frank: we are a group of consultants passionate about our profession. We get our hands dirty, engage in honest conversations with our customers and steer clear of internal politics. Moreover, every consultant attends at least two developer conferences per year.