Data Engineering Intern
Analytics and Information Management (AIM) helps clients design, build, and run insight-driven organizations, maximizing the value of analytics and information to deliver operational excellence, new products and services, competitive agility, and growth. AIM services take an integrated approach across a broad range of offerings: shaping strategy, managing data, delivering information, improving performance, optimizing insights, amplifying intelligence, building capabilities, and managing environments.
Who we're looking for
A suitable candidate for this position should:
- Understand object-oriented or functional programming concepts
- Know the basics of at least one language: Java, Scala, Python, or C++
- Have an interest in the Hadoop ecosystem: HDFS, Hive, HBase, Spark, Kafka, Oozie, etc.
- Know the basics of Linux: the command line, shell scripting, etc.
- Have good written and spoken English
Your future role
We are looking for Data Engineering Interns.
As a future Data Engineer, you will become a specialist in real-time, cloud-based, large-scale data processing. You will work in the high-demand IT fields of Big Data, Artificial Intelligence, and Machine Learning.
You will join the first Technology Consulting team in the Baltics focusing on Big Data, Artificial Intelligence, and Machine Learning technologies across the globe!
What we offer
A training program introducing the Big Data technology stack, and an excellent opportunity to start your career in the Big Data, Machine Learning, and Artificial Intelligence field. The program consists of real-world daily tasks and will broaden trainees' skill sets. After successfully completing it, trainees will have the opportunity to join the Deloitte team for a paid internship.
Our consultants work with a variety of clients across multiple geographies, inventing and developing solutions on platforms such as Azure, AWS, and GCP, as well as on-premises with Cloudera/Hortonworks data platforms.
By joining our team, you will have the opportunity to work with technologies such as Spark (2.x), Kafka, Hive, and HBase.
Monthly scholarship of 430 EUR net for a 40-hour work week, depending on experience, education, and other strengths.