BlueWave Warehouse Management System is the best in its class: a user-friendly, highly effective, and affordable solution for warehouse operations, inventory, and order fulfilment across B2B and B2C commerce environments.


Data Services Engineer

Overview

Be part of our data services family and get involved in unlocking complex data in the ever-growing ecommerce space. Join us and help revolutionize the data consumption landscape by building the future of data.

As a Data Services Engineer, you will work collaboratively with industry-leading brand partners in the commerce space to build the data that matters most to them. As part of the Data Services and Management team, you will be responsible for planning the BI strategy roadmap; architecting, deploying, and managing the data warehouse; and providing data insights that help brand owners lift both their topline and bottom line.

Data costs nothing and is worthless unless you make sense of it. You will spend time reading, analyzing, and digesting what our clients want to accomplish with their data, while refining the best possible strategies to achieve those goals.

The environment

The candidate will work with data sources from a range of applications, including multivendor marketplaces, warehouse management systems, multichannel ecommerce platforms, and ERP systems / third-party middleware, with databases built around MySQL / AWS RDS, MS SQL, MongoDB, and others.

This includes taking part in data modeling, data management, dimensional modeling, data mart creation, ETL design and development, data profiling and validation, and quality management to cope with the ever-growing ecommerce data needs of new and emerging markets.
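To make the dimensional-modeling work above concrete, here is a minimal sketch of a data mart built around a date dimension and an order fact table. SQLite (Python's stdlib) stands in for the MySQL / MS SQL sources mentioned above, and the table and channel names are illustrative assumptions, not our production schema.

```python
import sqlite3

# Minimal star schema: one date dimension plus one order fact table.
# SQLite is used here as a stand-in for MySQL / MS SQL.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (
        date_key  INTEGER PRIMARY KEY,  -- surrogate key, e.g. 20240301
        full_date TEXT,
        month     INTEGER,
        year      INTEGER
    );
    CREATE TABLE fact_orders (
        order_id  INTEGER PRIMARY KEY,
        date_key  INTEGER REFERENCES dim_date(date_key),
        channel   TEXT,                 -- hypothetical marketplace name
        amount    REAL
    );
""")
conn.executemany("INSERT INTO dim_date VALUES (?, ?, ?, ?)", [
    (20240301, "2024-03-01", 3, 2024),
    (20240302, "2024-03-02", 3, 2024),
])
conn.executemany("INSERT INTO fact_orders VALUES (?, ?, ?, ?)", [
    (1, 20240301, "marketplace_a", 120.0),
    (2, 20240301, "marketplace_b", 80.0),
    (3, 20240302, "marketplace_a", 50.0),
])

# A typical data-mart rollup: revenue per day across all channels.
rows = conn.execute("""
    SELECT d.full_date, SUM(f.amount) AS revenue
    FROM fact_orders f JOIN dim_date d USING (date_key)
    GROUP BY d.full_date ORDER BY d.full_date
""").fetchall()
print(rows)  # [('2024-03-01', 200.0), ('2024-03-02', 50.0)]
```

The same fact/dimension split scales up to the Snowflake and Redshift warehouses listed below; only the DDL dialect changes.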

Price intelligence marks another important chapter of our data services agenda and involves scraping competitive product data at scale. Maintaining both throughput performance and data integrity means relying on a highly scalable architecture while keeping up with ever-changing website formats.

Scraping more than six online market channels in six countries across the South East Asia (SEA) region, the operation requires a reliable infrastructure built for scale. The candidate will take advantage of automated web scrapers and proxy management to ensure a round-the-clock, bottleneck-free scraping operation.
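One building block of the proxy management mentioned above can be sketched as a simple round-robin rotation over a proxy pool, so that successive requests leave through different exits. The proxy URLs below are placeholders, not real endpoints; a production pool would come from a proxy provider and feed a Scrapy or Puppeteer fleet.

```python
import itertools

# Hypothetical proxy pool; real entries would come from a proxy provider.
PROXY_POOL = [
    "http://proxy-sg.example.com:8080",
    "http://proxy-my.example.com:8080",
    "http://proxy-th.example.com:8080",
]

# itertools.cycle yields the pool entries forever, wrapping around.
_rotation = itertools.cycle(PROXY_POOL)

def next_proxy() -> str:
    """Return the next proxy in the pool, round-robin."""
    return next(_rotation)

# Each outgoing request takes the next proxy, spreading load across exits:
assigned = [next_proxy() for _ in range(5)]
print(assigned)  # the 4th request wraps back to the first proxy
```

Real deployments layer health checks and per-site bans on top of this, but the rotation core stays the same.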

Our candidate

The candidate must have a strong command of English and great interpersonal skills to communicate, interpret, and gather clients' business requirements, while proactively exploring each client's data situation from a holistic view to determine the best possible database needs.

Extensive know-how and hands-on experience with at least one of the following relational databases is mandatory: MySQL, MS SQL, or SQL (ANSI format). Good knowledge of and hands-on experience with big data technologies and databases such as Snowflake and AWS Redshift/Athena/EMR, and with ETL tools such as StitchData, Fivetran, Segment, or custom-built solutions, is expected. Good scripting experience in Python, R, PHP, JavaScript, etc. for designing and building BI dashboards, storyboarding, and automated data scraping is definitely a big plus.

A contender for this role has a passion for data and new technology, enjoys solving problems and sharing knowledge with others, excels under pressure, and continuously looks for opportunities for personal and team improvement.

The tools

  • Warehouse: Snowflake, Redshift/Athena/EMR/RDS
  • ETL: StitchData, Python ETL
  • Database: MySQL, MS SQL, PostgreSQL
  • Scraping: Puppeteer, Scrapy
  • BI: PowerBI, Tableau, custom-built systems

Job Description

  • Plan and design the architecture: data modeling, data management, and third-party application integration and development
  • Streamline ETL processes, performing data preparation using a wide selection of data connectors
  • Tune query performance while ensuring the underlying data infrastructure is consumed in the most efficient way
  • Take part in dimensional modeling, data mart creation, mapping and partitioning, data management, data profiling, and data quality management
  • Plan and develop database schemas, analyze datasets from source systems, and develop logical models for various BI use cases
  • Work with clients to understand their standard reporting and dashboard needs: KPI metrics, data discovery and analytics, storyboarding, etc.
  • Use automated web scrapers, proxy management, the Scrapy framework, and Puppeteer to run a highly scalable scraping operation
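The data profiling and quality management duties above can be illustrated with a small batch check over scraped product rows: count nulls per field and flag duplicate SKUs before the data enters the warehouse. The field names (`sku`, `price`) and the rules are illustrative assumptions, not a prescribed quality framework.

```python
def profile(rows):
    """Count nulls per field and duplicate SKUs across a batch of rows."""
    null_counts = {}
    seen, duplicates = set(), 0
    for row in rows:
        for field, value in row.items():
            if value is None:
                null_counts[field] = null_counts.get(field, 0) + 1
        sku = row.get("sku")
        if sku in seen:
            duplicates += 1  # same SKU scraped twice in one batch
        seen.add(sku)
    return {"rows": len(rows),
            "null_counts": null_counts,
            "duplicate_skus": duplicates}

# Hypothetical scraped batch with two common quality problems:
batch = [
    {"sku": "A-1", "price": 9.9},
    {"sku": "A-1", "price": 9.9},   # duplicate listing
    {"sku": "B-2", "price": None},  # missing price
]
report = profile(batch)
print(report)  # {'rows': 3, 'null_counts': {'price': 1}, 'duplicate_skus': 1}
```

In practice a report like this would gate the load step of the ETL pipeline, rejecting or quarantining batches that fail the quality thresholds.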

Join our team!