This job posting has expired
Senior Data Engineer
Location : Hadapsar, Pune, Maharashtra
Headquarters : Ireland
Hiring Mode : Full Time
Experience : Mid Level
- If you desire to be part of something special, to be part of a winning team, to be part of a fun team – winning is fun. We are looking for a Senior Data Engineer based in Pune, India. At Eaton, making our work exciting, engaging, and meaningful; ensuring safety, health, and wellness; and being a model of inclusion & diversity are already embedded in who we are – it’s in our values, part of our vision, and our clearly defined aspirational goals. This exciting role offers the opportunity to:
- The Senior Data Engineer will be involved in the architecture, design, and management of large-scale, highly distributed, multi-tenant data stores. In addition to building and maintaining these data stores, he/she will also work to ensure that the data is easily accessible and available to data scientists and business users across the enterprise when and where it is needed.
- He/She can program in several languages and understands the end-to-end software development cycle, including CI/CD and software release.
- The candidate will demonstrate exceptional impact in delivering projects in terms of architecture, technical deliverables, and project delivery throughout the project lifecycle. The candidate is expected to be conversant with Agile methodologies and tools and to have a track record of delivering products in a production environment.
- Work with a team of experts in deep learning, machine learning, distributed systems, program management, and product teams, and work on all aspects of design, development and delivery of deep learning enabled end-to-end pipelines and solutions.
- Lead the development of technical solutions and implement architectures for project and products across data engineering and data science teams
- Lead the architecture, design, and development of new intelligent power technology products and production quality end to end systems
- Experience with ML/AI, MLOps, DevOps, and software development environments
- Work with your team and others, defining the architecture, design, and management of secure, large-scale, highly distributed, geo-redundant, multi-tenant data stores
- Knowledge of microservices and cloud APIs (e.g. AWS and MS Azure)
- Is accountable for end-to-end delivery of solutions, from requirements gathering to production
- Author high-quality, high-performance, unit-tested code to extract and transform data based on business and data science needs
- Bachelor’s degree in computer science or software engineering
- 5+ years of progressive experience in delivering technology solutions in a production environment
- 5+ years of experience in the software industry as a developer, with a proven track record of shipping high quality products
- 3+ years working with customers (internal and external) on developing requirements and working as a solutions architect to deliver
- Bachelor’s degree in Computer Science, Software Engineering, or Information Technology
- Experience with cloud development platforms – Azure & AWS – and their associated data storage options
- Cloud-based analytics (AWI, REST API, Microservices)
- Knowledge of IoT technologies, including cloud processing, like Azure IoT Hub.
- Experience with CI/CD (Continuous Integration/Continuous Delivery), e.g. Jenkins, Git, Travis CI
- Virtual build environments (containers, VMs, and microservices) and container orchestration – Docker Swarm, Kubernetes/Red Hat OpenShift
- Relational & non-relational database systems – SQL, PostgreSQL, NoSQL, MongoDB, Cosmos DB, DocumentDB
- Data warehousing & ETL – writing complex queries that are accessible, secure, and optimized, and that output to different consumers and systems
- ETL on Big Data Technologies - Hive, Impala
- Programming knowledge – Java and/or Python and associated IDEs (Eclipse, IntelliJ, PyCharm, etc.)
- API development to support the data consumption needs of internal and external stakeholders
- Data visualization tools such as Power BI, QlikView and/or Tableau
- Data pipelining, scripting, reporting
- In-depth knowledge of SOA (Service Oriented Architecture)
- Experience with Azure tools – Blob Storage, SQL, Data Lake, Hive, Hadoop, Data Factory, Databricks, Azure Functions
- Software development life-cycle (SDLC) processes & related tools
- Agile development methodologies and concepts, including hands-on experience with Jira, Bitbucket, and Confluence
- Knowledge of streaming technologies such as Apache Kafka, AWS Kinesis, Azure Event Hubs
- Knowledge of Cloudera Hadoop, MLOps, DevOps
- Knowledge of data analysis tools, like Apache Presto, Hive, Azure Data Lake Analytics, AWS Athena, Zeppelin
- Ability to specify and write code that is accessible, secure, and performs in an optimized manner with an ability to output to different types of consumers and systems
- Experience in Design Thinking or human-centered methods to identify and creatively solve for customer needs, through a holistic understanding of the customer’s problem area
- Knowledge of leveraging multiple data transit protocols and technologies (MQTT, REST API, JDBC, etc.)
- Knowledge of Hadoop and MapReduce/Spark or related frameworks
- Knowledge of Scala
- Excellent verbal and written communication skills including the ability to effectively communicate technical concepts as a part of virtual, global teams
- Good interpersonal, negotiation and conflict resolution skills
- Ability to understand academic research and apply new data science techniques
- Experience being part of larger teams with established big data platform practices, as well as smaller teams where individual contributions carry a broader scope
- Experience and awareness of working with global teams, with strong communication skills for interacting across them
- Innate curiosity
- Self-directed and hungry to learn – a person who, given time, will independently find interesting ways to push the envelope, learning new skills and growing themselves and the team.
- Team player – we work in small, fast-moving teams.
- Yes! If you are the one we are looking for, we hope to hear from you now!