Teradata ETL Data Engineer

Tata Consultancy Services • Seattle, WA 98127 • Posted 1 day ago via LinkedIn

In-person • Full-time • Senior Level

Job Highlights

The ETL Data Engineer at Tata Consultancy Services is responsible for developing data pipelines that move data from transactional systems to data warehouses, troubleshooting production issues, and tuning pipelines for performance. The role requires a strong grasp of ETL and database concepts, SQL proficiency, experience with ETL tools such as the Teradata utilities, and solid programming skills in languages such as Python and Java.

Responsibilities

  • Develop data pipelines to transfer data from transactional systems to the data warehouse (a minimal sketch follows this list).
  • Troubleshoot production issues through job log reviews, implementing design changes via change management and adhering to the code promotion process.
  • Contribute to the development and maintenance of documentation for troubleshooting procedures, best practices, and system configurations.
  • Optimize data pipelines for performance by collaborating with DBAs and implementing SQL coding and design changes within the job structure.
  • Resolve data issues arising from load delays or database server downtimes.
  • Actively participate in troubleshooting and resolving issues related to data loading delays and database server outages.
  • Collaborate with stakeholders to understand business requirements, providing insights and recommendations for optimizing data processing workflows.
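
For orientation only, here is a minimal sketch of the kind of incremental load this pipeline work typically involves. It is not taken from the job post: it assumes two generic Python DB-API connections (src to the transactional system, dwh to the warehouse), qmark-style query parameters, and hypothetical orders / dw_orders tables with an updated_at watermark column.

```python
# Illustrative incremental-load sketch; connection, table, and column details
# are hypothetical and would differ in a real Teradata environment.

def incremental_load(src, dwh, batch_size=10_000):
    """Copy rows changed since the last successful load into the warehouse."""
    s_cur = src.cursor()
    d_cur = dwh.cursor()

    # The high-water mark already in the warehouse drives the extract window.
    d_cur.execute("SELECT COALESCE(MAX(updated_at), DATE '1900-01-01') FROM dw_orders")
    watermark = d_cur.fetchone()[0]

    # Pull only rows that changed after the last load from the transactional system.
    s_cur.execute(
        "SELECT order_id, customer_id, amount, updated_at "
        "FROM orders WHERE updated_at > ?",
        (watermark,),
    )

    # Stream the result set in batches so large deltas do not exhaust memory.
    while True:
        rows = s_cur.fetchmany(batch_size)
        if not rows:
            break
        d_cur.executemany(
            "INSERT INTO dw_orders (order_id, customer_id, amount, updated_at) "
            "VALUES (?, ?, ?, ?)",
            rows,
        )
    dwh.commit()
```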

Qualifications

Required

  • 5+ years of ETL development, including script-based ETL development
  • Strong ETL and database (Oracle/Teradata/SQL Server) concepts
  • Ability to develop complete ETL jobs
  • SQL proficiency (intermediate skill level minimum)
  • Experience with ETL tools such as the Teradata utilities
  • Data validation experience: aggregations, data types, casting errors, rounding errors, etc. (see the validation sketch after this list)
  • Solving basic data types problems (casting, NULL handling)
  • Solid skills writing software in one or more languages: Python, Java
  • Solid skills with writing/translating SQL queries and SQL data validation
  • Comfort with basic Linux/Unix commands (file manipulation, file inspection, and ssh) and ability to understand and modify shell scripts (bash, zsh, etc.) for loading data
  • Comfort with basic Git commands (GitLab)
  • Excellent oral and written communication skills
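
To make the data validation and SQL expectations above concrete, the following is a small, hypothetical set of post-load checks. It is a sketch, not the team's actual process: it assumes a DB-API connection dwh to the warehouse and made-up stg_orders (staging) and dw_orders (target) tables.

```python
# Illustrative post-load validation checks covering row counts, aggregations,
# NULL handling, and casting/rounding drift; all names are hypothetical.

def scalar(cur, sql):
    """Run a single-value query and return the result."""
    cur.execute(sql)
    return cur.fetchone()[0]

def validate_load(dwh):
    cur = dwh.cursor()
    problems = []

    # 1. Row counts should reconcile between staging and target.
    if scalar(cur, "SELECT COUNT(*) FROM stg_orders") != \
       scalar(cur, "SELECT COUNT(*) FROM dw_orders"):
        problems.append("row counts differ between stg_orders and dw_orders")

    # 2. Aggregates catch truncated or double-loaded batches.
    if scalar(cur, "SELECT COALESCE(SUM(amount), 0) FROM stg_orders") != \
       scalar(cur, "SELECT COALESCE(SUM(amount), 0) FROM dw_orders"):
        problems.append("SUM(amount) does not reconcile")

    # 3. NULLs in a business key usually mean a failed cast or a bad join.
    if scalar(cur, "SELECT COUNT(*) FROM dw_orders WHERE order_id IS NULL"):
        problems.append("NULL order_id values in dw_orders")

    # 4. Re-casting and comparing catches silent rounding/truncation errors.
    if scalar(cur, "SELECT COUNT(*) FROM stg_orders s JOIN dw_orders d "
                   "ON s.order_id = d.order_id "
                   "WHERE CAST(s.amount AS DECIMAL(18,2)) <> d.amount"):
        problems.append("amount values drift after CAST to DECIMAL(18,2)")

    return problems  # an empty list means the load passed all checks
```

Each check is a single scalar query, so a failure can be logged individually and rerun by hand while troubleshooting a load.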

Full Job Description

Job Title: ETL / Data Engineer

Experience required (in years): 5+ years of ETL development, including script-based ETL development


Technical/Functional Skills:

• Strong ETL and database (Oracle/Teradata/SQL Server) concepts

• Should be able to develop complete ETL jobs

• SQL proficiency (intermediate skill level minimum)

• Experience with ETL tools such as the Teradata utilities.

• Data Validation Experience: aggregations, data types, casting errors, rounding errors, etc.

• Solving basic data types problems (casting, NULL handling)

• Solid skills writing software in one or more languages: Python, Java

• Solid skills with writing/translating SQL queries and SQL data validation

• Comfort with basic Linux/Unix commands (file manipulation, file inspection, and ssh) and ability to understand and modify shell scripts (bash, zsh, etc.) for loading data

• Comfort with basic Git commands (GitLab)

• Excellent oral and written communication skills


Roles & Responsibilities


• Develop data pipelines to transfer data from transactional systems to the data warehouse.

• Troubleshoot production issues through meticulous job log reviews, implementing necessary design changes via change management and adhering to the code promotion process.

• Contribute to the development and maintenance of documentation for troubleshooting procedures, best practices, and system configurations, facilitating knowledge sharing within the team and managing Change Requests.

• Optimize data pipelines for performance by collaborating with DBAs and implementing SQL coding and design changes within the job structure (a sketch follows this list).

• Resolve data issues arising from load delays or database server downtimes, with a focus on ensuring customer satisfaction.

• Actively participate in troubleshooting and resolving issues related to data loading delays and database server outages, ensuring minimal disruption to data flow and system operations.

• Collaborate with stakeholders to understand business requirements, providing valuable insights and recommendations for optimizing data processing workflows.
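
As one made-up illustration of the performance work described above (not part of the original posting), the sketch below replaces a row-at-a-time client-side insert loop with a single set-based INSERT ... SELECT and refreshes optimizer statistics afterwards. The table and column names are hypothetical, and the COLLECT STATISTICS statement is Teradata-style syntax that should be confirmed against the target database version.

```python
# Hypothetical performance change worked out with a DBA: one set-based statement
# plus a statistics refresh, instead of many single-row INSERTs from the client.

def load_daily_partition(dwh, load_date):
    cur = dwh.cursor()

    # A single INSERT ... SELECT lets the database parallelize the load and
    # apply the DECIMAL cast once, inside the engine.
    cur.execute(
        "INSERT INTO dw_orders (order_id, customer_id, amount, updated_at) "
        "SELECT order_id, customer_id, CAST(amount AS DECIMAL(18,2)), updated_at "
        "FROM stg_orders WHERE CAST(updated_at AS DATE) = ?",
        (load_date,),
    )

    # Fresh statistics on the main join/filter column help the optimizer choose
    # sensible plans for downstream queries (Teradata-style syntax).
    cur.execute("COLLECT STATISTICS ON dw_orders COLUMN (customer_id)")
    dwh.commit()
```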