We are looking for a passionate and proactive Data Engineer for our client Hiab. In this role you will work closely with Hiab's Data Scientists and Data Analysts, enabling them to develop data models and algorithms. If you are eager to challenge yourself and show your skills as part of a team of experienced professionals, this is an opportunity not to be missed! Keep reading and apply via the link below; we will fill the position as soon as we find the right candidate.
ABOUT THE ROLE
As a Data Engineer, you are responsible for developing and maintaining data sources and models, as well as transforming data into formats that can be easily analyzed, thereby ensuring business value for Hiab and its customers. You construct, test and maintain architectures and develop processes for data set modelling, mining and production. Your key task is to ensure the overall reliability and end-to-end functionality of data pipelines.
You are offered
* Interesting and challenging work tasks
* A unique opportunity to extend your skills
As a consultant at Academic Work you are offered a great opportunity to grow as a professional, extend your network and establish valuable contacts for the future. Read more about our offer.
WORK TASKS
* Collaborating with different stakeholders, both internal and external
* Developing the Hiab architecture and platforms for advanced analytics and AI capabilities
* Continuous integration and deployment (AI DevOps)
* Defining, developing and maintaining data models, data sources and processes for data mining and production from Hiab's IoT Cloud and main business applications and platforms (ERP, CRM, PLM/PDM, EAI and custom applications)
* Implementing data ingestion, transformation and cleansing activities to ensure a stable architecture
* Communicating results and assisting data scientists
WE ARE LOOKING FOR
* Suitable educational background (e.g. Computer Science, Information technology or related field)
* Experience in building end-to-end data pipelines, streaming data, data lakes and data warehouses
* Knowledge of programming languages such as Python, Scala or Java, and of database query languages (SQL/NoSQL)
* Experience with cloud architecture and skills with the most common cloud-based data processing, management and big data components on AWS (such as S3, Kinesis, Redshift, IoT, Athena, Lambda, Glue etc.)
* Fluent in both written and spoken English
AWS certifications and knowledge of MongoDB, InfluxDB, Spark and Kafka are considered advantages.
As a person you are:
* Curious, passionate about creating new business models and delivering business outcomes
* Collaborative and challenging in a positive way
* Resilient and able to deliver under deadlines
* Start: As soon as possible
* Work extent: 12 months; continuation can be discussed during or after this period
* Location: Helsinki/Tampere
* Contact information: This recruitment process is conducted by Academic Work. It is a request from Hiab that all questions regarding the position are handled by Academic Work.
Our selection process is continuous and the advert may close before the recruitment process is completed if we have moved forward to the screening or interview phase.
OTHER INFORMATION
Hiab is part of Cargotec Oyj and the world's leading provider of on-road load handling equipment, intelligent services, and smart, connected solutions. As the industry pioneer with a proud 75-year history, Hiab is committed to inspiring and shaping the future of intelligent load handling.