Data Engineer
Job ID: R0062079 | Posted: 05/09/2022 | Location: Bratislava, Slovakia
The Future Begins Here
At Takeda, we are creating a future-ready organization, one that evolves at the speed of science and technology using data and digital to meet the needs of patients, our people, and the planet. In order to make that happen, we need your help.
Join our exciting and new Innovation Capability Center (ICC) in Bratislava, Slovakia. This specialized, innovative and state of the art center of technological resources will drive data and digital capabilities and solutions across Takeda. The Center will be home to approximately 300 change agents, organized into chapters and cross-functional squads based on agile ways of working.
Grow your data and digital skills, create and enhance solutions for patients around the globe, and help us create a data and digital-forward Takeda. This is a unique opportunity to become the heart of our internal innovation engine.
As a Data Engineer you will interpret and develop advanced techniques for partly structured and unstructured big data across different partner organizations. You will lead data analysis; use and explore data, languages, tools and software to best construct data for predictive modelling; test models; and train on data to deploy the models within complex business environments spanning a large variety of IT systems and data.
- Leads data analysis, uses and explores data, languages, tools and software to best construct data flows for predictive modelling, core data modeling, data-cleansing, transformation and ingestion.
- Independently supports and provides technical guidance to teams working on existing or new computer science platforms.
- Leads the development of existing and new applications to design and implement data-driven solutions that impact the daily operation of our manufacturing processes and facilities.
- Works in an Agile/Scrum environment to support and deliver solutions with a customer-driven mindset.
- Provides guidance to team members
- Supports the long-term data strategy in Global Manufacturing and Supply (GMS) and Global Quality (GQ)
- Determines data structure, uses different technologies, big data preparations, programming and loading as well as initial exploration in the process of searching and finding data patterns.
- Oversees data science input and requests, translates these from data exploration - large-record and unstructured data sets - to mathematical algorithms, and uses various tooling, from programming languages to new tools (artificial intelligence and machine learning), to translate, build and optimize data models.
- Drives ongoing tests in the search for data driven solutions, collects and prepares the data, optimizes algorithm implementations to test, scale, and deploy future data models and ingestions.
- Develops roadmaps; builds, tests and deploys complex data solutions related to GMS/GQ use cases and business problems.
- Leads and guides data exploration from analysis to scalable models; works independently and decides quickly on transfers in complex data analysis and modelling.
- Is fully aware of the impact on IT Structure and Architecture and influences this discipline.
- Independently uses own judgement to identify data requirements and leads the design and implementation of the data strategy.
- Ensures execution and documentation according to the Takeda Quality Management System (QMS); suggests adjustments to the Software Development Life Cycle (SDLC) and other standards, policies and procedures.
- Maintains up-to-date knowledge on modern data technologies, explores new platforms and beta tooling and guides colleagues as a mentor.
- Represents the team while working on projects across GMS/GQ. Is the internal and external go-to expert on driving advanced Computer Science and Engineering skills and techniques.
- Builds and maintains relationships with partner organizations in support of data analysis, modelling and deployment.
COMPETENCIES, EDUCATION AND SKILLS
- Bachelor’s Degree in Computer Science or equivalent
- 2+ years’ experience or relevant project / coursework
- Excellent oral and written communication skills; ability to write and speak in English
- Excellent problem-solving and analytical skills
- Up-to-date specialized knowledge of data wrangling, manipulation and management technologies to effect change across business units.
- Comfortable working in an agile, rapidly changing environment while producing high-quality deliverables.
- In-depth experience in Data Lake, Data Warehouse or Data Mart environments
- In-depth knowledge of Cloud environments, especially AWS (S3, EC2, EMR, Athena)
- Expertise in ELT/ETL (Glue, Informatica, or Databricks)
- Expertise in programming in Python, R and SQL
- Experience in Spark or Hadoop
- Experience with relational databases
- Understanding or Application of Machine Learning and / or Deep Learning
- Experience with data formats including Parquet, ORC or AVRO
- Experience with NoSQL data stores (Cassandra, MongoDB, AWS Neptune, …)
- Experience with reporting and analytics tools (Qlik, Power BI, Athena)
- Experience with at least 2 of the following frameworks: Spring, Django, R Shiny, TensorFlow, MXNet
- Experience with CI/CD, DevOps (GitHub)
- Understanding of Web Services, JSON formats
- Experience in a pharmaceutical/life-sciences environment.
- Access to transportation to attend meetings
- Ability to fly to meetings regionally and globally, up to 10% of working time during project phases