This is a full-time, work-from-home opportunity for a star ETL Data Engineer from LATAM.
IDT (www.idt.net) is an American telecommunications company founded in 1990 and headquartered in New Jersey. Today it is an industry leader in prepaid communication and payment services and one of the world's largest international voice carriers. We are listed on the NYSE, employ over 1,300 people across 20+ countries, and have revenues in excess of $1.5 billion.
We are looking for a BI Engineer to join our USA-based client's BI engineering team. If you are highly motivated, ambitious, and ready to learn and make a direct impact, this is your opportunity! The individual in this role will perform data analysis, ELT/ETL design, and support functions to deliver on strategic initiatives and meet organizational goals.
If you're looking for a company that gives you maximum flexibility in choosing where to work, this opportunity is for you!
Responsibilities:
- Develop, document, and test ELT/ETL solutions using industry-standard tools (Snowflake, Denodo Data Virtualization, Looker).
- Recommend process improvements to increase efficiency and reliability in ELT/ETL development.
- Extract data from multiple sources, integrate disparate data into a common data model, and load it into a target database, application, or file using efficient ELT/ETL processes.
- Collaborate with Quality Assurance resources to debug ELT/ETL development and ensure the timely delivery of products.
- Willingness to explore and learn new technologies and concepts in order to deliver the right solutions.
- Target- and results-oriented, with a strong end-user focus.
- Effective oral and written communication with the BI team and the user community.
Requirements:
- 5+ years of experience in ETL/ELT design and development, integrating data from heterogeneous OLTP systems and API solutions, and building scalable data warehouse solutions to support business intelligence and analytics.
- Demonstrated experience using Python for data engineering tasks, including transformation, advanced data manipulation, and large-scale data processing.
- Experience in data analysis and root-cause analysis, with proven problem-solving and analytical-thinking capabilities.
- Experience designing complex data pipelines that extract data from RDBMS, JSON, API, and flat-file sources.
- Demonstrated expertise in SQL and PL/SQL programming, advanced mastery of business intelligence and data warehouse methodologies, and hands-on experience with one or more relational database systems and cloud-based database services such as Oracle, MySQL, Amazon RDS, Snowflake, Amazon Redshift, etc.
- Proven ability to analyze and optimize poorly performing queries and ETL/ELT mappings, providing actionable recommendations for performance tuning.
- Understanding of software engineering principles, skills working on Unix/Linux/Windows operating systems, and experience with Agile methodologies.
- Proficiency in version control systems, with experience in managing code repositories, branching, merging, and collaborating within a distributed development environment.
- Excellent English communication skills.
- Interest in business operations and a comprehensive understanding of how robust BI systems drive corporate profitability by enabling data-driven decision-making and strategic insights.
Pluses:
- Experience developing ETL/ELT processes within Snowflake and implementing complex data transformations using built-in functions and SQL capabilities.
- Experience using Pentaho Data Integration (Kettle) / Ab Initio ETL tools for designing, developing, and optimizing data integration workflows.
- Experience designing and implementing cloud-based ETL solutions using Azure Data Factory, dbt, AWS Glue, Lambda, and open-source tools.
- Experience with reporting/visualization tools (e.g., Looker) and job scheduler software.
- Experience in Telecom, eCommerce, International Mobile Top-up.
- Education: BS/MS in Computer Science, Information Systems, or a related technical field, or equivalent industry expertise.
- Preferred Certifications: AWS Solution Architect, AWS Cloud Data Engineer, Snowflake SnowPro Core.
Please attach your CV in English.
The interview process will be conducted in English.
Only accepting applicants from LATAM.