JOB DESCRIPTION
- Build data ingestion pipelines for APIs, web data, and files.
- Process structured and semi-structured data (JSON, XML, HTML, CSV).
- Apply sustainable web data collection methods, including session handling, proxies, and request automation.
- Ensure compliance with security regulations and access restrictions when integrating external data sources.
- Optimize SQL queries and transformations for analytical purposes.
- Design, build, and maintain robust Python-based data pipelines.
- Implement testing frameworks and best engineering practices to ensure reliability.
- Ensure data quality, consistency, and scalability across the entire workflow.
- Further details will be shared during the interview.
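As a hedged illustration of the ingestion and parsing duties listed above (the field names `id`, `name`, and `address.city` are hypothetical examples, not from this posting), a minimal Python sketch that flattens semi-structured JSON records into uniform rows and serializes them as CSV might look like:

```python
import csv
import io
import json

def ingest_records(raw_json: str) -> list[dict]:
    """Flatten semi-structured JSON records into uniform rows,
    dropping entries that fail basic quality checks."""
    rows = []
    for rec in json.loads(raw_json):
        # Hypothetical schema: id and name are treated as required fields.
        if "id" not in rec or "name" not in rec:
            continue
        rows.append({
            "id": rec["id"],
            "name": rec["name"].strip(),
            # Nested attributes are flattened with a default value.
            "city": rec.get("address", {}).get("city", "unknown"),
        })
    return rows

def to_csv(rows: list[dict]) -> str:
    """Serialize the normalized rows as CSV for downstream loading."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["id", "name", "city"])
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()

raw = '[{"id": 1, "name": " Ada ", "address": {"city": "Hanoi"}}, {"name": "no id"}]'
rows = ingest_records(raw)
```

A production pipeline would add schema validation, logging, and retries, but the same normalize-then-serialize shape applies to XML and HTML sources as well.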
BENEFITS
- An attractive income of up to 40,000,000 VND.
- Referral bonus: 1 to 10 million VND for anyone who introduces friends or acquaintances to the company.
- Work with large, advanced systems and develop comprehensive technical skills on complex problems that demand high accuracy.
- Become one of the influential key persons on the project, with a strong chance of advancing to Leader or Project Manager.
- Participate in employee engagement activities: weekend online gaming tournaments with prizes (Half-Life, AOE, Dota 2, LOL, PUBG...), plus team building weekly, monthly, and per project.
- Advancement based on ability, with corresponding increases in rank and salary.
- Opportunity to deliver ambitious projects in many countries, gain exposure to the latest technologies, and learn from skilled colleagues.
- Participate in skills training courses: AWS, Microservices, foreign languages (English, Japanese)...
REQUIREMENTS
• 7+ years of experience in data engineering, with 3+ years in a Lead or Tech Lead role.
• Strong expertise in Azure Data Platform, including:
○ Azure Data Lake Storage (ADLS Gen2)
○ Azure Data Factory
○ Azure Synapse Analytics (working knowledge)
• Deep hands-on experience with Databricks (Apache Spark, Delta Lake, notebooks, workflows).
• Strong experience designing and operating Snowflake data warehouses.
• Advanced skills in SQL and Python for large-scale data processing.
• Proven experience leading complex data migrations and modernization initiatives.
• Strong understanding of data modeling, performance tuning, and cost optimization.
• Ability to manage multiple workstreams and shifting priorities.
• Excellent English communication skills for client leadership, workshops, and documentation.
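The SQL and performance-tuning skills above can be sketched in a small, self-contained example (the `events` table and index name are hypothetical, and SQLite stands in for a warehouse engine such as Snowflake): adding a covering index so a filtered aggregation seeks to the matching rows instead of scanning the whole table.

```python
import sqlite3

# Minimal performance-tuning sketch: compare the query plan for a
# filtered aggregation before and after adding a covering index.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (user_id INTEGER, amount REAL)")
conn.executemany(
    "INSERT INTO events VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(10_000)],
)

query = "SELECT SUM(amount) FROM events WHERE user_id = 42"

# Before indexing, SQLite reports a full table scan.
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1]

# A covering index on (user_id, amount) lets the planner seek directly
# to the matching rows and answer the query from the index alone.
conn.execute("CREATE INDEX idx_events_user ON events (user_id, amount)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchone()[-1]

total = conn.execute(query).fetchone()[0]
```

The same idea carries over to warehouse engines, where clustering keys and partition pruning play the role the index plays here.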