GFT is driving the digital transformation of the world’s leading financial institutions. Other sectors, such as industry and insurance, also leverage GFT’s strong consulting and implementation skills across all aspects of pioneering technologies, such as cloud engineering, artificial intelligence, the Internet of Things for Industry 4.0, and blockchain. With its in-depth technological expertise, strong partnerships and scalable IT solutions, GFT increases productivity in software development. This provides clients with faster access to new IT applications and innovative business models, while also reducing risk.
GFT is looking for a Big Data Engineer with hands-on industry experience of designing and delivering Big Data solutions. This experience will have been gained in complex environments where technology plays a mission-critical role in the value chain. The successful candidate will have a strong understanding of, and passion for, Big Data technologies, in addition to knowing how to shape solutions for clients in the financial services industry.
- Spearhead the design and development of Big Data-based solutions on the NoSQL and Hadoop stack
- Manage stakeholder relationships, with the ability to drive an outcome in difficult scenarios
- Understand complex data and aggregation requirements and translate them into functionality
- Interface with SMEs, business analysts and other IT teams to understand requirements and translate them into technical deliverables
- Be a critical thinker who can connect the dots, work innovatively and independently, and resolve conflicts effectively
- Be comfortable working with complex systems and enjoy working in the analytical space (e.g. risk / PnL and complex data processing)
Essential Skills Required:
- Demonstrable experience implementing NoSQL solutions (MongoDB/Cassandra) and Hadoop (Cloudera or Hortonworks)
- Hands-on experience with the design and build-out of Hadoop solutions; should be able to code Hadoop solutions as needed
- Big Data components and technologies, including Kafka, Spark, HDFS, MapReduce, Hive, HBase, Spark GraphX, ZooKeeper and Elasticsearch; Hadoop multi-tenancy architecture and design
- Demonstrable experience with languages such as Java or Scala
- Demonstrable experience with designing solutions for large data platforms
- Help programme and project managers in the design, planning and governance of solutions
- Development experience, preferably in Java or Scala
- Design and implementation experience with Big Data platforms
- Strong data mapping/modelling skills, with an understanding of data requirements
- A strong communicator, able to interface with application teams and add value; excellent oral and written communication skills to meet the high standards of the consulting business
- Self-motivated and self-driven, able to work autonomously as well as part of a team, and able to multi-task
- TDD/BDD-oriented experience in an agile environment
- Ability to code when needed (proof-of-concept level coding)
What we offer you:
You will be working with some of the brightest people in business and technology on challenging and rewarding projects in a team of like-minded individuals. GFT prides itself on an international environment that promotes professional and cultural exchange and encourages further individual development.
Founded in 1987 and located in 13 countries to ensure close proximity to its clients, GFT employs over 5,000 people, providing them with career opportunities in all areas of software engineering and innovation. The GFT Technologies SE share is listed in the Prime Standard segment of the Frankfurt Stock Exchange.