Description
About our client
Our Client operates in the Information Technology Services and Information Technology Consulting industry, with its headquarters in Singapore. It has branches in more than 50 countries and employs more than 240,000 people worldwide. Its core business is helping clients manage their Information Technology across technology operations, infrastructure and applications, and it is committed to contributing to the digital transformation of the world.
Job description
Responsibilities:
- Support daily business-as-usual (BAU) operations, audit and compliance
- Communicate with vendor and BAU teams on data ingestion queries and issues
- Manage data ingestion and DAG maintenance in Airflow
- Develop and modify Python scripts for data ingestion
- Write optimised SQL queries in Snowflake
- Ensure applications are properly documented
Requirements:
- Bachelor's degree in Information Technology, Computer Science or Software Engineering
- Knowledge and hands-on experience writing effective SQL queries and statements
- Understanding of AWS services
- At least 5 years of experience in a similar capacity
- At least 3 years of experience using Python to develop and modify scripts
- At least 3 years of experience managing data ingestion and maintaining DAGs in Airflow
Preferred Requirements:
- Knowledge of the Hadoop ecosystem, such as Spark or PySpark
- Knowledge of AWS services such as S3, Data Lake, Redshift, EMR, EC2, Lambda, Glue, Aurora, RDS and Airflow
- 1-2 years of experience using Snowflake, including query optimisation to control Snowflake's warehouse costs
- Previous work experience in the investment banking or finance industry, with demonstrated readiness to take on an individual contributor role