My client, one of the big names in the iGaming industry, is looking for a Data Operations Analyst who will help ensure the reliability and accuracy of client-facing data platforms. This position focuses on daily monitoring of data pipelines, validation of cross-system consistency, and initial troubleshooting.
Main Responsibilities:
Monitor data pipelines integrating sources like MySQL, MongoDB, Kafka, and APIs into Redshift.
Review automated QA outputs and resolve data quality issues.
Detect and address anomalies such as missing data or delayed partitions.
Execute standard fixes including DAG re-runs and metadata syncs.
Escalate complex issues with clear documentation.
Collaborate with QA and data engineers to improve test coverage and issue tracking.
Maintain runbooks and report on data quality trends.
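For a sense of the day-to-day work, the anomaly checks above often reduce to simple freshness and completeness queries against the warehouse. A minimal sketch of one such check, with hypothetical table and column names, and with SQLite standing in for Redshift purely so the example is self-contained:

```python
import sqlite3
from datetime import date, timedelta

def find_missing_partitions(conn, table, date_col, days_back=7):
    """Return dates in the last `days_back` days that have no loaded rows.

    `table` and `date_col` are trusted identifiers here (illustrative only);
    a production check would validate them before interpolation.
    """
    expected = {date.today() - timedelta(days=i) for i in range(1, days_back + 1)}
    rows = conn.execute(
        f"SELECT DISTINCT {date_col} FROM {table} WHERE {date_col} >= ?",
        (min(expected).isoformat(),),
    ).fetchall()
    loaded = {date.fromisoformat(r[0]) for r in rows}
    return sorted(expected - loaded)

# Tiny in-memory demo: yesterday's partition is loaded, the two days
# before it are not, so the check should flag exactly those two dates.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (event_date TEXT, value INT)")
yesterday = (date.today() - timedelta(days=1)).isoformat()
conn.execute("INSERT INTO events VALUES (?, 1)", (yesterday,))
missing = find_missing_partitions(conn, "events", "event_date", days_back=3)
print(missing)
```

A gap reported by a check like this is what would typically trigger the standard fixes mentioned above (a DAG re-run or metadata sync), or an escalation if the cause is not obvious.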
Desired experience:
1–3 years in data QA, operations, or analytics support.
Proficient in SQL (ideally Redshift/PostgreSQL).
Familiarity with ETL processes and data flow concepts.
Strong attention to detail and problem-solving mindset.
Clear written communication and a collaborative approach.
Nice to have: experience with Airflow, Flink, dbt, AWS services, Python scripting, and Iceberg/Parquet formats.