Job Opening Details
At SensorFlow we are planning for considerable growth over the next 12 months and need a data engineer to design and develop SensorFlow's data infrastructure. As we are building the data pipeline from scratch, you will have full autonomy, with technical backing from our engineering team, in designing, developing and maintaining this infrastructure.
Job Roles & Responsibilities
Design, develop and maintain SensorFlow's infrastructure for streaming, processing and storage of data.
Build tools for effective maintenance and monitoring of the data infrastructure.
Contribute to key data pipeline architecture decisions and lead the implementation of major initiatives.
Work closely with stakeholders to develop scalable and performant solutions for their data requirements, including extraction, transformation and loading of data from a range of data sources.
Develop the team’s data capabilities – share knowledge, enforce best practices and encourage data-driven decisions.
Job Requirements
Solid Computer Science fundamentals, excellent problem-solving skills and a strong understanding of distributed computing principles.
At least 2 years of experience in a similar role, with a proven track record of building scalable and performant data infrastructure.
Expert SQL knowledge and deep experience with both relational and NoSQL databases (e.g. HBase, Cassandra).
Advanced knowledge of Apache Kafka and demonstrated proficiency in Hadoop v2, HDFS, MapReduce.
Experience with stream-processing systems (e.g. Storm, Spark Streaming), big data querying tools (e.g. Pig, Hive) and data serialization frameworks (e.g. Protobuf, Thrift, Avro).
Bachelor’s or Master’s degree in Computer Science or related field.
Technology Stack
- Data storage: Amazon DynamoDB
- Service layer: AWS Lambda, Amazon API Gateway
- Web frontend: Angular
- Mobile: Ionic
Annual Salary: SGD 72,000-96,000
Working Hours: Monday to Friday, 10am to 6pm.
Office Address: 61 Ubi Road 1, #01-36, Oxley BizHub 1, Singapore 408727