Job Description
Job Id:
*Copy the Job Id; it is required when applying for this specific role.
Data Model and Structure Definition:
• Design and develop conceptual, logical, and physical data models for a variety of financial data sets.
• Collaborate with product management and the economic research team to understand data requirements and translate them into effective data models.
• Ensure data models are aligned with business goals and effectively support transactional, analytical, and operational needs.
Data Integration and ETL:
• Design and implement ETL processes to integrate data from various sources into the data lake.
• Ensure data quality, consistency, and integrity throughout the data lifecycle.
• Develop and maintain documentation for data models and ETL processes.
Minimum Requirements:
• 5+ years’ experience working with AWS technologies (S3, Redshift, RDS, etc.).
• 3+ years’ experience working with dbt.
• 3+ years of experience with orchestration tooling (Airflow, Prefect, Dagster).
• Strong programming skills in languages such as Python or Java.
• Familiarity with big data technologies (e.g., Hadoop, Spark, Kafka, Delta Lake, Iceberg, Arrow, DataFusion).
• Familiarity with data governance tooling such as Monte Carlo or Atlan.
• Excellent problem-solving and analytical skills.
• Strong communication and interpersonal skills.
• Ability to work collaboratively in a team environment.