Apply for Data Engineer

Experience: 2 years
Qualification: BS or MS in Computer Engineering or Computer Science

Key Responsibilities:
• Work with data science teams to implement required data flows from scratch or with minimal modifications to existing data flows.
• Create and implement checklists for new-product deployments (monitoring mechanisms, standardized integration endpoints, and support-handover instructions).
• Collaborate with external teams to validate and present data flow designs.
• Work with data scientists to build data flows that feed training, scoring, and live data into data visualizations, insights, or machine learning models.
• Work with client-facing teams to build data dictionaries and answer client inquiries.
• Develop SQL as part of an agile team workflow.
• Build real-time data pipelines and applications using serverless and managed AWS services such as Lambda, Kinesis, and API Gateway.
• Contribute to a strong team culture and the ambition to stay on the cutting edge of big data.
• Evaluate and improve data quality by implementing test cases, alerts, and data quality safeguards.
• Maintain strong knowledge of cloud batch and stream processing architectures.
• Demonstrate a strong desire to learn.

Experience and Skill Set:
• Familiarity with GCP big data tooling: BigQuery, BigTable, etc.
• Familiarity with data visualization tools such as Power BI, Tableau, Qlik Sense, Data Studio, etc.
• Familiarity with data lake concepts.
• Strong analytic skills for working with unstructured datasets and combining data from a variety of heterogeneous sources to provide actionable insights.
• Ability to manage file operations and library configurations, and to run scripts from a shell (bash, zsh, etc.).
• Knowledge of the MS Office suite, particularly Excel.
• Strong communication skills, primarily English conversation.
• Ability to manage clients and internal stakeholders to ensure that data is received and managed in a timely and efficient manner.
• Familiarity with relational data theory, ER models, and DB technologies (e.g., NoSQL vs. RDBMS).
• Familiarity with ad-hoc data analysis: SQL is a must; knowledge of R/Python/Scala data frames is preferable.
• Familiarity with task management tools such as JIRA, Trello, etc.
• Strong problem-solving and critical thinking skills, and the ability to adapt to new problems.
