Data Modelers are responsible for designing and creating the data warehouse. They orchestrate and validate the related extraction, transformation and loading (ETL) of data.
After the groundwork has been laid, Data Modelers also must test their designs to ensure the system runs smoothly.
Furthermore, Data Modelers must be experts at taking a big-picture view of the EIM's data situation. This means that Data Modelers need to be able to read, analyze and digest what a business wants to accomplish with its data, and design the best possible DW / ETL process around those goals.
Database designs take on many forms, including star and snowflake schemas.
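As a minimal sketch of what a star schema looks like in practice, the following Python snippet builds one in an in-memory SQLite database: a central fact table joined to denormalized dimension tables. All table and column names here are illustrative, not part of any specific Sanofi system.

```python
import sqlite3

# Star schema: one fact table surrounded by denormalized dimension tables.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, month TEXT, year INTEGER);
CREATE TABLE fact_sales  (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    units_sold  INTEGER,
    revenue     REAL
);
""")

cur.execute("INSERT INTO dim_product VALUES (1, 'Aspirin', 'Pharma')")
cur.execute("INSERT INTO dim_date VALUES (20240101, 'Jan', 2024)")
cur.execute("INSERT INTO fact_sales VALUES (1, 20240101, 100, 999.0)")

# A typical star-schema query: join the fact table to its dimensions
# and aggregate a measure by dimension attributes.
row = cur.execute("""
    SELECT p.category, d.year, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date d    ON d.date_key    = f.date_key
    GROUP BY p.category, d.year
""").fetchone()
print(row)  # ('Pharma', 2024, 999.0)
```

A snowflake schema would further normalize the dimensions themselves, for example moving `category` out of `dim_product` into its own referenced table.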
The successful candidate will also evangelize and pursue continuous improvement in data quality as a culture within EIM, as well as champion the standardization of data management processes to improve efficiency and quality of service.
As such, the candidate will be tasked with assisting the EIM tower to define GBU-wide data governance and management capabilities, taking into account the dimensions of maintainability, reusability, scalability, and integration with existing and new solutions, with a keen focus on balancing business needs against total cost of ownership.
At the same time, the candidate will also provide insights into architectural solutions and offer technical assistance for the enterprise reporting platform.
He or she will further strengthen the GBU’s Analytics capabilities by defining and promoting the vision and guidelines for adopting industrialized best practices for future-state systems that align with our Enterprise Architecture roadmaps.
She / he will also assess customer needs and always refer to the Global CoreBI Roadmap in order to provide effective solutions through regular analysis of data and information reported from a broad variety of sources.
The Data Modeler will also remain sensitive to the broader (most often global) work environment and foster relationships with Business Operation & Support teams in order to achieve BI Council objectives.
KEY ACCOUNTABILITIES :
Assist in defining optimal architectures for ETL tools, data warehousing and data quality management solutions by creating models, incorporating industry and security best practices where possible to manage data quality and integrity.
Establish and formalize guiding principles, procedures and standards that build business metrics and data models to integrate with current ETL tools and application interfaces.
Ensure compliance with company policies, audit requirements and data privacy regulations within the area of expertise. As such, a suitably high level of attention to detail and organization is necessary.
Participate in solution implementations, upgrades and enhancements. Provide technical leadership for implementation teams, and review and validate both high-level and detailed design documentation.
Should be able to provide insights into data analysis and key performance metrics across functions so as to aid strategic decisions.
Act as a key player in building, developing and growing the business relationships vital to the success of our EIM Strategy.
The Data Modeler will work with cross-functional teams to evaluate future programs, assess improvement opportunities and develop new opportunities to deliver more customer value.
Drawing on advanced functional knowledge and communication skills, the candidate will support and influence the business to ensure the project team makes the right decisions for the company.
Education Background :
Bachelor's degree in computer science, information technology or another computer-based discipline.
Experience & knowledge :
Strong technical background in the areas of Data integration and DW, Data Quality, Data governance and Data Security
Strong grasp of data modeling concepts (star schemas, snowflake schemas, highly normalized data models), with experience in a data modeling tool, data warehouse platform or data mart.
Experience handling large data sets efficiently
Strong expertise writing complex SQL
Experience with SAP PowerDesigner (data modeling), Informatica Cloud (ETL), AWS services (e.g. AWS Aurora PostgreSQL) and Python is a big plus.
Should possess excellent analytical, conceptual and root cause analysis skills.
Experience with Data in the pharmaceutical industry as well as knowledge of key business processes & compliance requirements is a plus.
Able to juggle multiple projects and work both independently and as part of a team.
Keeps abreast of industry technologies and tools in the data domain, and determines the feasibility of emerging technologies / trends and their application to the Sanofi environment.
Core competencies :
Excellent communication skills and the ability to collaborate with clients, the development team and other partners to understand requirements.
Be able to listen to and interpret what the business side wants to see accomplished, discuss details in an easy-to-understand way, and then translate it all into nuts-and-bolts technical output
Has excellent interpersonal and presentation skills with a proven ability to communicate effectively in both written and verbal formats across various levels of management and peers.
Be able to work with diverse and virtual teams, with a proven ability to influence.
Ability to work with resources spread across the region
Strong verbal & written communication skills in English
Action-oriented, with the ability to work quickly and efficiently in a fast-moving environment.
At Sanofi, diversity and inclusion are foundational to how we operate and embedded in our Core Values. We recognize that to truly tap into the richness diversity brings, we must lead with inclusion and have a workplace where those differences can thrive and be leveraged to empower the lives of our colleagues, patients and customers.
We respect and celebrate the diversity of our people, their backgrounds and experiences and provide equal opportunity for all.