Founded in 2017, RPG Commerce is a leading direct-to-consumer (DTC) social e-commerce company dedicated to building cutting-edge everyday essentials. With a portfolio ranging from innovative activewear to home and living products, RPG currently carries 10+ in-house brands, including Thousand Miles, Bottoms Lab, Eubi, Montigo and Cosmic Cookware.
Having recently secured a Series B funding round of RM127 million, RPG Commerce has over 100 employees in KL, Singapore and the Philippines who work every day to develop, design and produce high-quality products all over the world. To find out more, please visit https://www.rpg.ventures/.
RPG is a rapidly growing DTC e-commerce company, creating innovative brands from the ground up. We don't believe in 9-to-5, meetings for the sake of meetings, or casual Fridays (it's pretty casual every day). We want you to push boundaries, change the game and create impact every day by doing things that matter.
You will join forces with us to develop a data warehouse that consolidates hundreds of data sources from the third-party apps we use. If your passion is enabling a startup to grow and charting clear directions through data, using the latest cloud technologies, this role will let you achieve all of that.
If you're smart, hardworking, and anyone has ever called you crazy, then we want you to become part of the family!
Key Responsibilities:
- Develop and deploy data ingestion models, and support existing ELT (extract/load/transform) processes and functions (in Python/SQL) in a cloud data warehouse environment using Airbyte and GCP services.
- Integrate Airflow and dbt into the ELT process.
- Develop audits for data integrity & quality at scale, implementing alerting and anomaly detection as necessary.
- Help us achieve operational excellence by focusing on optimization and automation.
- Partner with internal RPG business stakeholders and the Analytics and Data teams to support their data infrastructure needs, and troubleshoot data-related technical issues alongside the team lead.
- Collaborate with our Analytics and Data team to build scalable data models & warehouses.
- Create and support ELT data pipelines built on BigQuery and (soon) dbt, while ensuring high-quality data.
- Develop tools and solutions to enable our CI/CD workflow.
- Mentor junior engineers and establish best engineering practices across the team by doing code reviews and ensuring accurate code documentation.
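To give a flavour of the data-quality work described above, here is a minimal sketch of a row-count anomaly check. The function name, threshold and sample numbers are illustrative assumptions, not part of RPG's actual codebase; a real audit would run against BigQuery table metadata.

```python
from statistics import mean, stdev

def row_count_anomaly(history: list[int], today: int, threshold: float = 3.0) -> bool:
    """Flag today's row count as anomalous if it deviates from the
    historical mean by more than `threshold` standard deviations."""
    if len(history) < 2:
        return False  # not enough history to judge
    mu = mean(history)
    sigma = stdev(history)
    if sigma == 0:
        return today != mu  # flat history: any change is anomalous
    return abs(today - mu) / sigma > threshold

# Example: a stable daily load of ~10k rows, then a sudden drop.
history = [10_120, 9_980, 10_050, 10_210, 9_890]
print(row_count_anomaly(history, 10_100))  # normal volume
print(row_count_anomaly(history, 1_200))   # likely a broken ingestion
```

In practice a check like this would feed an alerting channel rather than print, but the core idea (compare today's load against recent history) is the same.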
Requirements:
- 3+ years of software development, data engineering, or related experience manipulating, processing and extracting value from large datasets.
- Demonstrated strength in data modeling, ETL/ELT development, and data warehousing.
- Experience writing complex SQL statements and developing in Python.
- Experience working with data modeling and orchestration tools such as Airflow and dbt.
- Experience with software configuration management tools (Git) and CI/CD tools such as Jenkins and Nexus.
- Takes pride in efficient designs and accurate results.
- Ability to objectively analyze the pros/cons and tradeoffs of a design path, and partner with team members to arrive at the best solution.
Bonus (nice to have):
- Experience in using data aggregators (Airbyte, Fivetran, Stitch etc.)
- Experience in creating API endpoints for data products.
- Experience in creating data platforms from ground zero.
- Can showcase your work in interview sessions, be it complex SQL statements you have written or an ingestion process built in Python or another language.
- We will love it if you are good at documentation.
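As an illustration of the kind of ingestion work candidates might showcase, here is a small, self-contained sketch that normalizes raw order records and serializes them as newline-delimited JSON, the file format BigQuery load jobs accept. The field names and sample payload are hypothetical, chosen only for the example.

```python
import json
from datetime import datetime, timezone

def transform_order(raw: dict) -> dict:
    """Normalize one raw order record into a warehouse-friendly schema."""
    return {
        "order_id": str(raw["id"]),
        # store money as integer cents to avoid float rounding issues
        "amount_cents": round(float(raw["total"]) * 100),
        "currency": raw.get("currency", "MYR").upper(),
        "ingested_at": datetime.now(timezone.utc).isoformat(),
    }

def to_ndjson(raw_orders: list[dict]) -> str:
    """Serialize records as newline-delimited JSON (one object per line)."""
    return "\n".join(json.dumps(transform_order(o)) for o in raw_orders)

payload = [
    {"id": 101, "total": "49.90"},
    {"id": 102, "total": "12.50", "currency": "sgd"},
]
print(to_ndjson(payload))
```

A production pipeline would pull the payload from an API or an Airbyte sync and write the NDJSON to Cloud Storage before loading, but the transform-then-serialize shape is the same.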