Advertisement content
Category: Other
Offer description:
Inter Cars Capital Group is the leading distributor of automotive spare parts in Central and Eastern Europe. Our logistics network spans 18 countries and over 500 distribution points across Europe, supplying hundreds of thousands of garages and shops with spare parts, oils, tires and garage equipment from our portfolio of 2.7 million products. We also make our offer available to retail customers through our e-commerce solutions and major digital marketplaces.

The Inter Cars IT department, headquartered in Warsaw, Poland, takes care of the daily operation and development of over a hundred different applications, ranging from ERP, logistics and warehouse management systems to innovative e-commerce solutions and web applications. Hundreds of Inter Cars employees and hundreds of thousands of customers rely on our systems in their everyday work.

Observing the increasing importance of IT in every part of our business, Inter Cars has started a Digital Transformation program aimed at increasing our development pace and modernizing the technological stack of our systems. We plan to improve our capabilities using Agile methodologies and leading development practices such as DevOps and microservice architecture. We are looking for the best people to help us take the next step.
Responsibilities:
Become a member of a team of developers and help us shape the future of our data engineering.
Build an event sourcing platform which will become the main information hub in the company.
Design and develop data structures, pipelines and ETL processes to help move and transform terabytes of data between the applications.
Develop a Big Data analytics environment, including data lake and computation infrastructure, storing and analyzing terabytes of key company data.
Execute Data Governance practices, keeping our data secure, transparent and of high quality.
Cooperate with vendors and external consultants.
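As a rough illustration of the pipeline and ETL work described above, here is a minimal Python sketch of a single extract-transform-load step. The table names, columns and transformation are hypothetical, and production pipelines at this scale would typically run on tools such as Spark or Airflow rather than an in-process SQLite database.

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> int:
    """Copy rows from a hypothetical 'orders_raw' table into a cleaned
    'orders' table, converting euro amounts to integer cents.
    Returns the number of rows loaded."""
    cur = conn.cursor()
    # Extract: read the raw rows from the source table.
    rows = cur.execute("SELECT order_id, amount_eur FROM orders_raw").fetchall()
    # Transform: convert euros (float) to cents (int), dropping negative amounts.
    cleaned = [(oid, round(amount * 100)) for oid, amount in rows if amount >= 0]
    # Load: write the cleaned rows into the target table.
    cur.executemany(
        "INSERT INTO orders (order_id, amount_cents) VALUES (?, ?)", cleaned
    )
    conn.commit()
    return len(cleaned)

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE orders_raw (order_id INTEGER, amount_eur REAL)")
    conn.execute("CREATE TABLE orders (order_id INTEGER, amount_cents INTEGER)")
    conn.executemany(
        "INSERT INTO orders_raw VALUES (?, ?)",
        [(1, 19.99), (2, -5.0), (3, 42.5)],
    )
    print(run_etl(conn))  # two valid rows are loaded
```

The same extract/transform/load split carries over directly to distributed settings, where the extract and load stages read from and write to systems such as Kafka or HDFS instead of local tables.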
Minimum Qualifications:
At least 2 years of experience in development, deployment and operation of IT systems.
Excellent understanding of relational databases and practical SQL.
1+ years of professional programming in Python and/or Scala.
Practical experience with some Big Data technologies from the Apache stack, such as Spark, Hadoop or Kafka.
Basic experience with Linux systems, especially the command line, and a basic understanding of Bash.
Knowledge of system integration techniques, such as web services, API and messaging.
Good understanding of general computer science issues such as computational complexity, algorithms, performance analysis and optimization.
Strong communication skills, sense of ownership and drive.
Independence, a pragmatic approach and attention to detail.
Fluency both in spoken and written English.
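To give a flavor of the practical SQL mentioned above, here is a small self-contained example using Python's built-in sqlite3 module; the schema (garages placing parts orders) is hypothetical and chosen only for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Hypothetical schema: garages and the parts orders they place.
conn.executescript("""
    CREATE TABLE garages (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, garage_id INTEGER, parts INTEGER);
    INSERT INTO garages VALUES (1, 'North'), (2, 'South');
    INSERT INTO orders VALUES (10, 1, 3), (11, 1, 5), (12, 2, 2);
""")

# Join the two tables and aggregate the parts ordered per garage.
query = """
    SELECT g.name, SUM(o.parts) AS total_parts
    FROM garages g
    JOIN orders o ON o.garage_id = g.id
    GROUP BY g.name
    ORDER BY total_parts DESC
"""
for name, total in conn.execute(query):
    print(name, total)  # prints North 8, then South 2
```

Joins, grouping and aggregation of this kind are the day-to-day building blocks of the data structures and ETL processes the role involves.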
Preferred Qualifications:
Good understanding of modern Data Science techniques and tools.
2+ years of professional programming in Python and/or Scala.
Experience with the tools we use: Spark, Airflow, Kafka, the Hadoop ecosystem (HDFS, YARN), NiFi, Sqoop, Zeppelin, Hive, HBase, GitLab, Docker, Oracle, Azure.
Familiarity with Git, Bash, Java, XQuery.
Good general knowledge of software and IT infrastructure technologies, including issues such as web applications, databases, system integration, security, server and network infrastructure and cloud solutions.
Understanding of Agile project methodologies, such as Scrum or Kanban.
What we offer:
Work on interesting projects in the field of digital transformation at a large European organization.
Participation in international projects.
High independence and the opportunity to implement your own ideas.
Highly motivated and engaged team with strong technical competences.
Flexible employment arrangements, such as flexible hours and partial remote work.
Training and personal development opportunities.