Job Summary
3 - 7 years
Job Qualification
Bachelor's Degree
• Software Engineer
• Computer Programmers, Other
• Data Entry Machine Operator
• Computer Programmer
Last Date to Apply
25 Apr 2021
Data Engineer
21st Century Talent Services Private Limited
Posted on: 24 Feb 2021
Job Description
Department: Data Engineering
Section: AI Toolkit
Role: Data Engineer

Primary qualifications:
• You are passionate about building and developing a world-class engineering culture and products
• You are humble and drive positivity
• You value and respect diversity and inclusion
• You can communicate effectively in a data-driven manner

What you get to do:
• Create innovative data products using information collected by the organization
• Model data and process flows for both live and offline data in a way that maps storage systems to business requirements
• Collaborate with data scientists to productize algorithmic prototypes for statistical analysis and machine learning for prediction and clustering
• Implement data pipelines for data transformation and integration (streaming and batch)
• Develop and improve the current data architecture with an emphasis on data quality, improved monitoring, and high availability
• Analyze trade-offs involving latency, throughput, and transactions for distributed systems
• Champion data governance, security, privacy, quality, and retention policies

Experience:
• 2+ years of overall programming/data engineering experience
• 2+ years of writing software, preferably in Scala and Python

Your background includes:
• Expertise in designing and maintaining databases (object, columnar, in-memory, relational)
• Proven track record of successfully communicating data infrastructure, data models, and data engineering solutions
• Experience with relational data stores as well as one or more NoSQL data stores (e.g., MongoDB, Cassandra)
• Prior experience in data warehouse modernization: building complete data warehouse solutions, star/snowflake schema designs, infrastructure components, ETL/ELT pipelines, and reporting/analytic tools
• Experience building production-grade data backup/restore and disaster recovery solutions
• Hands-on experience with batch and streaming data (e.g., Cloud Dataflow, Beam, Spark, Cloud Pub/Sub, Apache Kafka)
• Advanced SQL skills, and proficiency in one or more programming languages such as Python
• Familiarity with Python data science tooling (pandas, SciPy, scikit-learn)
• Demonstrated proficiency with data structures, algorithms, distributed computing, and storage systems
• A Bachelor's or Master's degree in Computer Science or a related field

Required:
• Working with TensorFlow, TensorFlow Extended, scikit-learn, and PyTorch
• Working with protocol buffers
• Experience designing and maintaining data warehouses
• Developing for Google Cloud Platform / Google Professional Data Engineer certification
• Experience with Python, Java, Scala
• Experience using Docker and Kubernetes
• Language proficiency in English and Japanese, or English

Required Skills
Programming, Data Engineering, Writing, Scala, Python, Databases, Toolkit, Passionate, Developing, Diversity and Inclusion, Communicating, Data-Driven, Data Products, Process Flows, Storage Systems, Prototypes, Statistical Analysis, Prediction, Clustering, Pipelines, Data Transformation, Integration, Data Architecture, Data Quality, High Availability, Distributed Systems, Data Governance, Proven Track Record, Communication, Data Infrastructure, Data Models, NoSQL, Mongo, Cassandra, Data Warehouse Modernization, Snowflake Schema, Infrastructure, ETL, Reporting, Production, Disaster Recovery, Cloud Dataflow, Spark, Cloud, Apache Kafka, Advanced SQL, Programming Languages, Python Data Science, Pandas, Scipy, Data Structures, Algorithms, Distributed Computing, Computer Science, TensorFlow, Scikit-Learn, Protocol Buffers, Data Warehouses, Google Cloud Platform, Java, Docker, Language Proficiency