Honeywell Sr Data Engineer in Phoenix, Arizona

The future is what you make it.

When you join Honeywell, you become a member of our global team of thinkers, innovators, dreamers and doers who make the things that make the future. That means changing the way we fly, fueling jets in an eco-friendly way, keeping buildings smart and safe and even making it possible to breathe on Mars.

Working at Honeywell isn’t just about developing cool things. All our employees also enjoy access to dynamic career opportunities across different fields and industries.

Are you ready to help us make the future?

Join a company that is transforming from a traditional industrial company to a contemporary digital industrial business, harnessing the power of cloud, big data, analytics, Internet of Things, and design thinking.

You will lead change that brings value to our customers, partners, and shareholders through the creation of innovative software and data-driven products and services. You will work with customers to identify their high-value business questions and work through their data to search for answers. You will be responsible for working within Honeywell to identify opportunities for new growth and efficiency based on data analysis.

Responsibilities

As a Lead Data Analyst/Data Engineer, you will provide technical leadership to a team that delivers contemporary analytics solutions for the Information Management & Analytics function at Honeywell. You will build strong relationships with leadership to effectively deliver contemporary data analytics solutions and contribute directly to business success. You will deliver solutions on the data warehouse ecosystem, including Snowflake, IICS, Airflow, and Control-M.

You will identify and implement process improvements – and if you don’t like to do the same thing twice, you will automate where possible. You always keep an eye on scalability, optimization, and process. You have worked with Snowflake and IICS before.

You will work on a team including scrum masters, product owners, data architects, data engineers/analysts, data scientists, and DevOps. You and your team collaborate to build products from the idea phase through launch and beyond. The software you write makes it to production in sprints. Your team will create a new platform using your experience with APIs, microservices, and platform development. You will take responsibility for the quality of the work and ensure that all your team’s work is reviewed before deployment.

YOU MUST HAVE

  • Bachelor's degree in Computer Science or Engineering

  • 10+ years of experience

  • 6-8 years of data engineering experience on any data warehousing platform, preferably the Snowflake data platform and IICS (Informatica Intelligent Cloud Services).

  • Expert in data warehousing concepts.

  • Expert in SQL.

  • Expert in ETL.

  • Python programming experience.

  • Experience developing end-to-end data pipelines, i.e., ingestion, transformation, and storage.

  • Experience leading data engineers and establishing best practices for implementing end-to-end pipelines.

WE VALUE

  • Has developed and deployed complex big data ingestion jobs in IICS (or another ELT tool), bringing prototypes to production on Snowflake (or another data warehouse platform).

  • Experience building advanced analytics solutions with data from enterprise systems such as ERPs, CRMs, and marketing tools.

  • Has worked on the Snowflake data platform, designing and implementing data models and SQL-based transformations.

  • Experience leading a team of 4 to 6 data engineers through end-to-end data pipeline implementation.

  • Experience developing and building applications that process very large amounts of data (structured and unstructured), including streaming real-time data.

  • Experience writing complex SQL statements and debugging and improving the performance of SQL statements using query profilers.

  • Experience working with cloud-based deployments. Understanding of containers (Docker) and container orchestration (Swarm or Kubernetes).

  • Good understanding of branching, build, deployment, and CI/CD methodologies and tools such as GitHub, Octopus, and Bamboo.

  • Experience working with Agile methodologies and Scrum.

  • Knowledge of software best practices, such as Test-Driven Development (TDD).

  • Effective communication skills and succinct articulation.

  • Experience with dimensional modeling, data warehousing and data mining.

  • Experience with machine learning solutions and promoting data science methods.

  • Database performance management and API development.

  • Technology upgrade oversight.

  • Experience with visualization software; Tableau preferred but not required.

  • Understanding of best-in-class model and data configuration and development processes.

  • Experience working with remote and global teams and cross team collaboration.

  • Consistently makes timely decisions even in the face of complexity, balancing systematic analysis with decisiveness.

Honeywell is an equal opportunity employer. Qualified applicants will be considered without regard to age, race, creed, color, national origin, ancestry, marital status, affectional or sexual orientation, gender identity or expression, disability, nationality, sex, religion, or veteran status.
