Earnbetter


Staff Software Engineer

Kohl's • Remote • Posted 1 day ago


Remote • Full-time • Senior Level

Job Highlights


The Staff Software Engineer at Kohl's will work on various projects including credit data migration, compliance reporting, Airflow 2 migration, and more. This role involves data migration, performance tuning, and building and monitoring distributed caching technologies. The position is fully remote, with the option to work remotely or telecommute to the company headquarters in Menomonee Falls, WI.

Responsibilities

  • Assist with the data migration of all credit processes into modern secure architecture.
  • Serve as the primary point of contact with the compliance team for required data reporting efforts.
  • Migrate Airflow pipelines from a previous version to Airflow 2.
  • Work on production versions of data ingestion for the legacy media.
  • Support the software migration process and move it into production.
  • Implement ETL process with Big Data Technologies and Database Design.
  • Design and create automation workflows and execution.
  • Engage in performance tuning and task implementation of Directed Acyclic Graph (DAG).
  • Build and monitor distributed caching technologies.
  • Drive development, testing, deployments, and iterative improvement of product capabilities and features.
  • Leverage critical thinking, experimentation, data, and industry best practices to implement desired business outcomes.
  • Develop high quality applications that are secure, easy to operate, difficult to break, and extremely observable with measurable results.
  • Responsible for all technical aspects of the product application lifecycle.
  • Establish product engineering and software standards.
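The DAG duties above center on ordering interdependent tasks. As an illustrative sketch only (task names are hypothetical, and Python's standard-library `graphlib` stands in for Airflow's scheduler), dependency-respecting ordering in a DAG looks like this:

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on.
etl_dag = {
    "extract_credit_data": set(),
    "transform_records": {"extract_credit_data"},
    "load_to_warehouse": {"transform_records"},
    "compliance_report": {"load_to_warehouse"},
}

# static_order() yields tasks so that every task appears after its
# dependencies -- the core guarantee a DAG scheduler such as Airflow provides.
order = list(TopologicalSorter(etl_dag).static_order())
print(order)
```

In Airflow itself the same structure would be expressed with operators and `>>` dependencies inside a `DAG` object, but the ordering guarantee is the same.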

Qualifications

Required

  • Bachelor's degree in Management Information Systems, Computer Science, Computer Engineering, or related field of study.
  • 5 years of experience in the job offered or any related occupation.
  • Demonstrated experience in Jenkins or Maven, GIT, SQL or PL/SQL.
  • Experience in implementing ETL process with Big Data Technologies.
  • Knowledge of Spark, Python, Scala, and Airflow.
  • Experience with MapReduce, Pig, Hive, Kafka, Sqoop, and Flume.
  • Experience in designing and creating automation workflows and execution.
  • Knowledge of Apache Airflow DAG development, DAG performance tuning, and task implementation.
  • Experience with Redis.
  • Experience with Teradata and Netezza.
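The Redis requirement typically implies the cache-aside pattern. The following is a minimal sketch under stated assumptions: a plain dict stands in for a Redis client (production code would use `redis.Redis` with `get`/`set` calls), and `fetch_from_db` and the key scheme are hypothetical.

```python
# In-memory stand-in for a Redis cache.
cache: dict[str, str] = {}

def fetch_from_db(key: str) -> str:
    # Placeholder for a real backing-store lookup (e.g. a Teradata query).
    return f"value-for-{key}"

def get_with_cache(key: str) -> str:
    # Cache-aside: check the cache first, fall back to the source,
    # then populate the cache for subsequent reads.
    if key in cache:
        return cache[key]
    value = fetch_from_db(key)
    cache[key] = value
    return value

first = get_with_cache("customer:42")   # miss: falls through to the source
second = get_with_cache("customer:42")  # hit: served from the cache
```

A real Redis deployment would also set expirations on cached keys so stale entries age out rather than being served indefinitely.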

About Kohl's

Kohl's is a popular omnichannel retailer offering a wide range of products including clothing, shoes, toys, home décor, appliances, and electronics. They provide convenient shopping with free shipping, easy returns, and exclusive discounts for Kohl's Cardholders and Rewards members. With over 1,100 stores nationwide, Kohl's is a one-stop shop for the whole family.

Full Job Description

100% remote position. May work remotely or telecommute to company HQ in Menomonee Falls, WI.

Job Duties: The employee will work on the following products:

  • Credit Data Migration: The employee will be responsible for assisting with the data migration of all credit processes into modern secure architecture. The credit data migration project will move away from substantially outdated versions of the software to a more current version that poses less of a security risk.
  • Cobrand Compliance Reporting: The employee will serve as the primary point of contact with the compliance team to assist with required data reporting efforts.
  • Airflow 2 Migration: The Airflow 2 Migration project will require data migration from a previous version of Airflow to a newer version of Airflow.
  • Market Mix Models Migration: The Marketing Team is working on bringing marketing efforts in-house for cost savings and data quality. The employee will work on production versions of data ingestion for the legacy media.
  • JIRA Data into KDP: All KT balanced teams use JIRA for project management. As the reporting capabilities in JIRA are quite limited, the team brought JIRA data into KDP in partnership with the Data Science team. The employee will support the software migration process and move it into production.

The employee will also:

  • Implement ETL process with Big Data Technologies and Database Design, including Spark, Python, Scala, and Airflow.
  • Design and create automation workflows and execution to support migration projects, including the Airflow 2 Migration and Market Mix Models Migration efforts.
  • Engage in performance tuning and task implementation of Directed Acyclic Graphs (DAGs), including use of Apache Airflow DAG development.
  • Build and monitor distributed caching technologies, including Redis.
  • Drive development, testing, deployments, and iterative improvement of product capabilities and features in collaboration with designers, product managers, and other engineers on the product team.
  • Leverage critical thinking, experimentation, data, and industry best practices to implement desired business outcomes.
  • Develop high quality applications that are secure, easy to operate, difficult to break, and extremely observable with measurable results.
  • Take responsibility for all technical aspects of the product application lifecycle, including code, infrastructure, data, security, and CI/CD.
  • Establish product engineering and software standards.

  • Demonstrate strong knowledge of new technologies, modern application architecture, and industry best practices.

Required Minimum Position Qualifications: Bachelor's degree in Management Information Systems, Computer Science, Computer Engineering, or related field of study; and 5 years of experience in the job offered or any related occupation in which the required experience was gained. Position also requires demonstrated experience in the following:

  • Jenkins or Maven
  • GIT
  • SQL or PL/SQL
  • Implementing ETL process with Big Data Technologies
  • Spark, Python, Scala, and Airflow
  • MapReduce, Pig, Hive, Kafka, Sqoop, and Flume
  • Designing and creating automation workflows and execution
  • Apache Airflow DAG development, performance tuning of DAGs, and task implementation
  • Redis
  • Teradata and Netezza

To Apply: Mail resume to N56 W17000 Ridgewood Dr., Menomonee Falls, WI 53051, ATTN: Jenna Schlintz, Ref. Job Title, or apply online at https://careers.kohls.com/.
