Data Engineering Intern
Who we're looking for
A suitable candidate for this position should:
- Understand object-oriented or functional programming concepts
- Understand the basics of at least one language: Java, Scala, Python, or C++
- Have an interest in the Hadoop ecosystem: HDFS, Hive, HBase, Spark, Kafka, Oozie, etc.
- Understand the basics of Linux: CLI, shell scripting, etc.
- Have good written and spoken English skills
Your future role
We are looking for Data Engineering Interns.
As a future Data Engineer, you will become a specialist in real-time, cloud-based, large-scale data processing. You will work in a high-demand area of IT spanning Big Data, Artificial Intelligence, and Machine Learning.
You will become part of the first technology consulting team in the Baltics focused on Big Data, Artificial Intelligence, and Machine Learning technologies across the globe!
What we offer
A training program introducing the Big Data technology stack: an excellent opportunity to start your career in the Big Data, Machine Learning, and Artificial Intelligence field. The program consists of real-world daily tasks and will broaden trainees' skill sets. After successfully completing the program, trainees will have the opportunity to join the Deloitte team for a paid internship.
Our consultants work with various clients across multiple geographies, inventing and developing solutions on platforms such as Azure, AWS, and GCP, as well as on-premises Cloudera/Hortonworks Data Platforms.
By joining our team, you will have the opportunity to work with technologies such as Spark 2.x, Kafka, Hive, and HBase.
Monthly scholarship of 430 EUR net for a 40-hour work week, depending on experience, education, and other strengths.