PT. Synapsis Sinergi Digital
Connect Everything
Job Description
  • Create and maintain optimal data pipeline architecture.
  • Assemble large, complex data sets that meet functional / non-functional business requirements.
  • Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
  • Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS ‘big data’ technologies.
  • Build analytics tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency and other key business performance metrics.
  • Work with stakeholders including the Executive, Product, Data and Design teams to assist with data-related technical issues and support their data infrastructure needs.
  • Keep our data separated and secure across national boundaries through multiple data centers and AWS regions.
  • Create data tools for analytics and data scientist team members that assist them in building and optimizing our product into an innovative industry leader.
  • Work with data and analytics experts to strive for greater functionality in our data systems.
Requirements
  • Bachelor’s Degree in Computer Science, Mathematics, or a related field (IT).
  • Proficient command of English and Bahasa Indonesia, both written and verbal.
  • Energy, enthusiasm, ambition to grow, and a drive to succeed.
  • Experience with at least one of the following: Data Engineering, Big Data Technologies, Data Transformation, Data Platform, or Data Modelling.
  • Experience in architecting and building scalable data platforms.
  • Experience with Informatica or other related data integration tools.
  • Experience with Big Data cloud technologies (data lakes, Azure, Google Cloud, AWS, etc.) or with open-source technologies (Spark, Kafka, Presto, Hive, Cassandra, etc.).
  • Experience with SQL and/or NoSQL databases.
  • Experience in both batch and stream processing technologies.
  • Experience with Java, Go, R, MATLAB, C#, C++, and Python programming languages.
  • Machine learning experience with Spark or similar.
  • Certified Data Engineer or Solution Architect in one or more cloud technologies (AWS, GCP, Azure).
  • Ability to manage numerous concurrent requests, prioritize, and deliver.
  • Good communication and presentation skills.
  • Excellent problem-solving skills.
  • Strong analytical and planning skills.
  • Production experience in building real-time analytics applications.
Contacts

Further information: [email protected]