
Pipeline Engineer Salary in Hyderabad - PayScale


Recommended vacancies

Business Intel Engineer I, LMAQ
Amazon, Hyderabad, TS, IN
DESCRIPTION
The last mile of operations in Amazon is enabled through proprietary tech and products. As the Last Mile BI team, we partner with product, tech and program teams worldwide to drive greater cost efficiencies in a fast-changing landscape. We work with driver support analytics, geospatial models (addresses, geocodes, maps), operator tools, routing/sort tech, and the topology of delivery nodes. We are looking for a seasoned Business Intelligence Engineer who meets our bar on Customer Obsession, Ownership, Bias for Action and Dive Deep.

As a Business Intelligence Engineer on the Amazon last mile team, you will be responsible for shaping the strategy and direction of customer-facing products that are core to the customer experience. As a key member of the last mile leadership team, you will continually raise the bar on both quality and performance. You will bring innovation, a strategic perspective, a passionate voice, and an ability to prioritize and execute on a fast-moving set of priorities, competitive pressures, and operational initiatives. You will partner closely with product and technology teams to define and build innovative and delightful experiences for customers. You must be highly analytical, able to work effectively in a matrix organization, and able to break complex problems down into steps that drive product development at Amazon speed. You will set the tempo for defect reduction through continuous improvement and drive accountability across multiple business units in order to deliver large-scale, high-visibility, high-impact projects. You will lead by example, being just as passionate about operational performance and predictability as about all other aspects of the customer experience.

The successful candidate will be able to:
· Effectively manage customer expectations and resolve conflicts that balance client and company needs.
· Develop processes to effectively maintain and disseminate project information to stakeholders.
· Succeed in a delivery-focused environment and determine the right processes to make the team successful.
· Bring excellent technical, problem-solving, and communication skills; the candidate is not just a policy maker/spokesperson but drives to get things done.
· Possess superior analytical abilities and judgment. Use quantitative and qualitative data to prioritize and influence, show creativity, experimentation and innovation, and drive projects with urgency in this fast-paced environment.
· Partner with key stakeholders to develop the vision and strategy for customer experience on our platforms. Influence product roadmaps based on this strategy along with your teams.
· Support the scalable growth of the company by developing and enabling the success of the Operations leadership team.
· Serve as a role model for Amazon Leadership Principles inside and outside the organization.
· Actively seek to implement and distribute best practices across the operation.

We are open to hiring candidates to work out of one of the following locations: Hyderabad, TS, IND

BASIC QUALIFICATIONS
- 2+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with data visualization using Tableau, QuickSight, or similar tools
- Experience with one or more industry analytics/visualization tools (e.g. Excel, Tableau, QuickSight, MicroStrategy, Power BI) and statistical methods (e.g. t-test, chi-squared)
- Experience with a scripting language (e.g. Python, Java, or R)

PREFERRED QUALIFICATIONS
- Knowledge of data modeling and data pipeline design
- Experience with statistical analysis and correlation analysis
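The qualifications above name statistical methods such as the t-test. As a rough illustration only (the samples and scenario are invented, not from the posting), a two-sample Welch's t statistic can be computed with nothing beyond the Python standard library:

```python
import math
import statistics

def welch_t(sample_a, sample_b):
    """Welch's two-sample t statistic: difference of sample means divided by
    the combined standard error, allowing unequal variances and sizes."""
    mean_a, mean_b = statistics.fmean(sample_a), statistics.fmean(sample_b)
    var_a, var_b = statistics.variance(sample_a), statistics.variance(sample_b)
    se = math.sqrt(var_a / len(sample_a) + var_b / len(sample_b))
    return (mean_a - mean_b) / se

# Hypothetical delivery-time samples (minutes) under two routing strategies.
route_a = [31, 29, 33, 30, 32]
route_b = [35, 36, 34, 37, 33]
t = welch_t(route_a, route_b)  # negative: route_a is faster on average
```

In practice the statistic would be compared against a t distribution (via scipy or a lookup table) to get a p-value; the sketch stops at the statistic itself.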
Sr Data Engineer, Ops Finance Transportation
Amazon, Hyderabad, TS, IN
DESCRIPTION
The Amazon transportation controllership team is looking for a Senior Data Engineer who is data-driven and detail-oriented to partner in expanding our financial data systems to support our global transportation network. You should have excellent business and communication skills, so you can work with business owners and tech leaders to gather infrastructure requirements, design data infrastructure, and build data pipelines and datasets to meet business needs. You stay abreast of emerging technologies, investigating and implementing them where appropriate. This person must be strategic and have strong analytical skills, robust interpersonal and influencing skills, strong organizational and project management skills, and be a self-starter. They will need to roll up their sleeves, work both independently and collaboratively across teams, and have a high level of ownership and accountability.

Key job responsibilities
- Architect and develop end-to-end scalable data infrastructure and data pipelines
- Develop data products, deploy highly performant solutions, and build data infrastructure leveraging AWS services (such as EC2, Redshift, EMR, Lambda, QuickSight, etc.)
- Improve existing solutions and design next-generation data architecture to improve scale, quality, timeliness, coverage, monitoring and security
- Develop new data models and end-to-end data pipelines
- Lead design reviews for other Data Engineers on the team and actively participate in design reviews of related tech products
- Provide technical and thought leadership for Data Engineering and Business Intelligence
- Create and implement a data governance strategy for mitigating privacy and security risks
- Design, implement, and support a platform providing secured access to large datasets
- Interface with multiple finance and accounting customers, gathering requirements and delivering complete data solutions
- Develop ETL pipelines that link various datasets to serve the analysis objectives in the most efficient way
- Analyze and solve problems at their root, stepping back to understand the broader context
- Keep up to date with advances in big data technologies and run pilots to design a data architecture that scales with increased data volume using AWS

A day in the life
Work with the largest dataset in logistics management anywhere in the world to drive business insights, support operations and build analytical solutions for partners that can help drive cost control and decision making. As a Data Engineer, you will be working in one of the world's largest and most complex data warehouse environments. You will design, implement and support scalable data infrastructure solutions that integrate with heterogeneous data sources, aggregate and retrieve data quickly and safely, and curate data that can be used in reporting, analysis, machine learning models, generative AI and ad-hoc data requests. You will be exposed to cutting-edge AWS big data technologies.

About the team
Transportation Controllership in Finance is one of the most challenging and exciting spaces to work in, with new geographies, new modes and new carriers being launched every day that must seamlessly integrate with the transportation technical systems for automatic ingestion of freight invoices, thorough auditing, timely payment and proper accounting to keep our books clean and accurate.

We are open to hiring candidates to work out of one of the following locations: Hyderabad, TS, IND

BASIC QUALIFICATIONS
- 5+ years of data engineering experience
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with SQL
- Experience in at least one modern scripting or programming language, such as Python, Java, Scala, or NodeJS
- Experience mentoring team members on best practices
- Design and implementation of automated data contract and data quality enforcement mechanisms
- Expertise in at least one data orchestration tool
- Working experience with CDK

PREFERRED QUALIFICATIONS
- Experience with big data technologies such as Hadoop, Hive, Spark, EMR
- Experience operating large data warehouses
- Master's degree
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets
- Knowledge of distributed systems as they pertain to data storage and computing
- Knowledge of software engineering best practices across the development lifecycle, including agile methodologies, coding standards, code reviews, source management, build processes, testing, and operations
- Proven success in communicating with users, other technical teams, and senior management to collect requirements and describe data modeling decisions and data engineering strategy
- Experience providing technical leadership and mentoring other engineers on data engineering best practices
- Strong understanding of and experience with DBMS internals
- Strong understanding of system performance and scaling
- Knowledge of LLMs and generative AI
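The basic qualifications above mention "automated data contract and data quality enforcement mechanisms." At its smallest, that pattern is a validation step between extract and load. The schema, invoice rows, and contract rules below are invented for illustration (SQLite stands in for a warehouse; nothing here is Amazon's actual system):

```python
import sqlite3

# Illustrative data contract: every freight-invoice row must carry a
# non-empty carrier name and a strictly positive amount to be loaded.
def enforce_contract(rows):
    good, bad = [], []
    for invoice_id, carrier, amount in rows:
        if carrier and amount is not None and amount > 0:
            good.append((invoice_id, carrier, amount))
        else:
            bad.append((invoice_id, carrier, amount))
    return good, bad

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE invoices (invoice_id TEXT, carrier TEXT, amount REAL)")

extracted = [
    ("INV-1", "CarrierA", 120.0),
    ("INV-2", "", 45.0),          # violates contract: empty carrier
    ("INV-3", "CarrierB", -5.0),  # violates contract: negative amount
]
good, bad = enforce_contract(extracted)
conn.executemany("INSERT INTO invoices VALUES (?, ?, ?)", good)
loaded = conn.execute("SELECT COUNT(*) FROM invoices").fetchone()[0]
```

Rejected rows would typically be routed to a quarantine table and alerted on rather than silently dropped; that follow-up is omitted here for brevity.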
Snowflake Engineer
NTT DATA Services, Hyderabad, Hyderabad District, IN
Req ID:
NTT DATA Services strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Snowflake Engineer to join our team in Hyderabad, Telangana (IN-TG), India (IN).
- Architecting and implementing very large scale data intelligence solutions around Snowflake Data Warehouse.
- Implementing ETL pipelines within and outside of a data warehouse using Python and Snowflake's SnowSQL.
- Querying Snowflake using SQL.
- Developing scripts using Unix, Python, etc. for loading, extracting, and transforming data.
- Assisting with production issues in data warehouses, such as reloading data, transformations, and translations.

About NTT DATA Services
NTT DATA Services is a recognized leader in IT and business services, including cloud, data and applications, headquartered in Texas. As part of NTT DATA, a $30 billion trusted global innovator with a combined global reach of over 80 countries, we help clients transform through business and technology consulting, industry and digital solutions, applications development and management, managed edge-to-cloud infrastructure services, BPO, systems integration and global data centers. We are committed to our clients' long-term success. Visit nttdata or LinkedIn to learn more.

NTT DATA Services is an equal opportunity employer and considers all applicants without regard to race, color, religion, citizenship, national origin, ancestry, age, sex, sexual orientation, gender identity, genetic information, physical or mental disability, veteran or marital status, or any other characteristic protected by law. We are committed to creating a diverse and inclusive environment for all employees. If you need assistance or an accommodation due to a disability, please inform your recruiter so that we may connect you with the appropriate team.
Lead DevOps Engineer
TTEC, Andhra Pradesh, Hyderabad
When everything's connected, how we connect is everything… and we'd like to connect with you too! We are looking for you to help us deliver exceptional customer experiences as a Lead DevOps Engineer. At TTEC, we help global brands provide a great experience to their customers, build customer loyalty, and grow their business. We were founded on one guiding principle: customer experiences that are simple, inspired, and more human deliver lasting value for everyone. Your role brings that principle to life. TTEC, a 50,000-employee global customer experience pioneer, is opening a new information technology and data science center of excellence in Hyderabad, where you'll have the opportunity to get in on the ground floor of this expansion. As a technologist, we know you're in high demand. And we know it's important you find the right fit for your future. Have ideas you want to contribute? We're listening. Looking for exposure to different clients, different technologies? It's what we do. Want to make an impact on the future? We're innovating every day. Teamwork key? You'll have the opportunity to work on global projects with a knowledge-thirsty, international team. Join our inclusive IT team and you'll help create meaningful employee experiences that drive memorable customer experiences. As part of our Product Engineering organization, you will be a valuable member of our Humanify Product Engineering group as a Lead DevOps Engineer. You will work directly with the development team to support several TTEC Digital product projects.
What you'll be doing:
* Influence selection and adoption of the latest DevOps technologies (CI/CD pipelines)
* Work as a dedicated DevOps Engineer for various product projects
* Streamline automation into all aspects of the software lifecycle
* Conduct and automate systems tests for security, performance, and availability
* Implement automation tools and frameworks
* Troubleshoot production issues and coordinate with the development team to streamline code deployment

What skills you will bring:
* 7 years of experience developing or supporting SaaS or PaaS products
* 3 years of hands-on DevOps experience or equivalent software engineering
* Strong background in Linux/Unix administration
* Ability to use a wide variety of cloud services (experience with AWS is required)
* Strong experience with Kubernetes and Docker
* CI/CD pipeline automation (experience with Jenkins required)
* A working understanding of programming and scripting (bash/perl/python/js)
* Knowledge of best practices and IT operations in an always-up, always-available service
* Good understanding of the software development lifecycle (Agile/Scrum experience a plus)
* Experience with automation/configuration management using Puppet, Chef or an equivalent
* Experience with IaaS using tools like Terraform or similar
* Excellent logic, problem-solving, and troubleshooting skills
* Experience deploying and managing Wildfly, Tomcat, Jetty, Apache or Nginx deployments a plus
* Excellent verbal and written communication skills
* Flexibility to handle the demands of a rapidly growing and changing workplace
* Systems engineering and performance experience a plus
* BS or MS in CS/IS or related degree, or equivalent experience

Employment Requirements: TTEC requires all employees hired in India to successfully pass a background check including employment credentials, education, permanent and current address verification and, if applicable, immigration and work permit documentation as a condition of employment. Depending on location and client program, a drug test may also be required as a condition of employment. TTEC is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status. You will be asked to share with TTEC, as well as its representatives, all such personal information that it may require to conduct such background verification, and you agree to TTEC and its representatives collecting and storing such personal information and transferring the same, whether in India or abroad.
Senior DevOps Engineer - Other New Projects Team
TTEC, Andhra Pradesh, Hyderabad
When everything's connected, how we connect is everything… and we'd like to connect with you too! We are looking for you to help us deliver exceptional customer experiences as a Senior DevOps Engineer. At TTEC, we help global brands provide a great experience to their customers, build customer loyalty, and grow their business. We were founded on one guiding principle: customer experiences that are simple, inspired, and more human deliver lasting value for everyone. Your role brings that principle to life. TTEC, a 50,000-employee global customer experience pioneer, is opening a new information technology and data science center of excellence in Hyderabad, where you'll have the opportunity to get in on the ground floor of this expansion. As a technologist, we know you're in high demand. And we know it's important you find the right fit for your future. Have ideas you want to contribute? We're listening. Looking for exposure to different clients, different technologies? It's what we do. Want to make an impact on the future? We're innovating every day. Teamwork key? You'll have the opportunity to work on global projects with a knowledge-thirsty, international team. Join our inclusive IT team and you'll help create meaningful employee experiences that drive memorable customer experiences. As part of our Product Engineering organization, you will be a valuable member of our Humanify Product Engineering group as a Senior DevOps Engineer. You will work directly with the development team to support several TTEC Digital product projects.
What you will be doing:
* Work as a dedicated DevOps Engineer for various product projects
* Streamline automation into all aspects of the software lifecycle
* Conduct and automate systems tests for security, performance, and availability
* Implement automation tools and frameworks
* Troubleshoot production issues and coordinate with the development team to streamline code deployment

What skills you will bring:
* 3 years of experience developing or supporting SaaS or PaaS products
* 2 years of hands-on DevOps experience or equivalent software engineering
* Strong background in Linux/Unix administration
* Ability to use a wide variety of cloud services (experience with AWS is required)
* Strong experience with Kubernetes and Docker
* CI/CD pipeline automation (experience with Jenkins required)
* A working understanding of programming and scripting (bash/perl/python/js)
* Knowledge of best practices and IT operations in an always-up, always-available service
* Good understanding of the software development lifecycle (Agile/Scrum experience a plus)
* Experience with automation/configuration management using Puppet, Chef or an equivalent
* Excellent logic, problem-solving, and troubleshooting skills
* Experience deploying and managing Wildfly, Tomcat, Jetty, Apache or Nginx deployments a plus
* Excellent verbal and written communication skills
* Flexibility to handle the demands of a rapidly growing and changing workplace
* Systems engineering and performance experience a plus
* BS or MS in CS/IS or related degree, or equivalent experience

Employment Requirements: TTEC requires all employees hired in India to successfully pass a background check including employment credentials, education, permanent and current address verification and, if applicable, immigration and work permit documentation as a condition of employment. Depending on location and client program, a drug test may also be required as a condition of employment. TTEC is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, or protected veteran status. You will be asked to share with TTEC, as well as its representatives, all such personal information that it may require to conduct such background verification, and you agree to TTEC and its representatives collecting and storing such personal information and transferring the same, whether in India or abroad.
Principal Software Engineer - IT
Dell, Hyderabad, Any
IT Principal Software Engineer
The Software Engineering team delivers next-generation software application enhancements and new products for a changing world. Working at the cutting edge, we design and develop software for platforms, peripherals, applications and diagnostics, all with the most advanced technologies, tools, software engineering methodologies and the collaboration of internal and external partners. Join us to do the best work of your career and make a profound social impact as an IT Principal Software Engineer on our Software Engineering IT team in Hyderabad.

What you'll achieve
As an IT Principal Software Engineer, you will deliver products and improvements for a changing world. Working at the cutting edge, you will craft and develop software for platforms, peripherals, applications and diagnostics, all with the most sophisticated technologies, tools, software engineering methodologies and partnerships.

You will:
• Develop technical solutions to meet business requirements
• Assist in training, work assignment and mentoring of less experienced developers
• Provide training and mentor other engineers when required
• Be a domain specialist in one or more technical programming specialties

Take the first step towards your dream career
Every Dell Technologies team member brings something unique to the table. Here's what we are looking for with this role:

Essential Requirements
• 8-9 years of hands-on development experience with Java 11, Spring 5, PostgreSQL, ReactJS and PCF cloud microservices
• REST and SOAP API development, with a strong understanding of APIs and microservices
• Experience with DevOps engineering practices such as CI/CD, build automation, pipeline integration and GitLab
• OAuth 2.0 and SAML authentication; HTML, CSS, JavaScript, XML and JSON
• Experience with test-driven development frameworks; ability to write unit test plans and develop unit test cases

Desirable Requirements
• Integration and middleware knowledge of messaging queues and RabbitMQ

Who we are:
We believe that each of us has the power to make an impact. That's why we put our team members at the center of everything we do. If you're looking for an opportunity to grow your career with some of the best minds and most advanced tech in the industry, we're looking for you. Dell Technologies is a unique family of businesses that helps individuals and organizations transform how they work, live and play. Join us to build a future that works for everyone because Progress Takes All of Us.

Application closing date: 30 April 2024

Dell Technologies is committed to the principle of equal employment opportunity for all employees and to providing employees with a work environment free of discrimination and harassment. Read the full Equal Employment Opportunity Policy here.

Job ID: R243190

Dell's Flexible & Hybrid Work Culture
At Dell Technologies, we believe our best work is done when flexibility is offered. We know that freedom and flexibility are crucial to all our employees no matter where you are located, and our flexible and hybrid work style allows team members the freedom to ideate, be innovative, and drive results their way. To learn more about our work culture, please visit our locations page.

Date posted: 04/06/2024 02:14 AM
Business Intel Engineer I, AOP
Amazon, Hyderabad, TS, IN
DESCRIPTION
The Amazon Transportation team is looking for an innovative, hands-on and customer-obsessed Business Analyst for the Analytics team. The candidate must be detail-oriented, have superior verbal and written communication skills, strong organizational skills and excellent technical skills, and should be able to juggle multiple tasks at once. The ideal candidate must be able to identify problems before they happen and implement solutions that detect and prevent outages. The candidate must be able to accurately prioritize projects, make sound judgments, work to improve the customer experience and get the right things done. This job requires you to constantly hit the ground running and have the ability to learn quickly. Primary responsibilities include defining the problem and building analytical frameworks to help operations streamline the process, identifying gaps in the existing process by analyzing data and liaising with the relevant team(s) to plug them, and analyzing data and metrics and sharing updates with the internal teams.

Key job responsibilities
1) Apply multi-domain/process expertise in day-to-day activities and own the end-to-end roadmap.
2) Translate complex or ambiguous business problem statements into analysis requirements and maintain a high bar throughout the execution.
3) Define the analytical approach; review and vet it with stakeholders.
4) Proactively and independently work with stakeholders to construct use cases and associated standardized outputs.
5) Scale data processes and reports; write queries that clients can update themselves; lead work with data engineering for full-scale automation.
6) Have a working knowledge of the data available or needed by the wider business for more complex or comparative analysis.
7) Work with a variety of data sources and pull data using efficient query development that requires less post-processing (e.g., window functions, virt usage).
8) When needed, pull data from multiple similar sources to triangulate on data fidelity.
9) Actively manage the timeline and deliverables of projects, focusing on interactions in the team.
10) Provide program communications to stakeholders.
11) Communicate roadblocks to stakeholders and propose solutions.
12) Represent the team on medium-size analytical projects in your own organization and communicate effectively across teams.

A day in the life
1) Solve ambiguous analyses with less well-defined inputs and outputs; drive to the heart of the problem and identify root causes.
2) Handle large data sets in analysis through the use of additional tools.
3) Derive recommendations from analysis that significantly impact a department, create new processes, or change existing processes.
4) Understand the basics of test-and-control comparison; may provide insights through basic statistical measures such as hypothesis testing.
5) Identify and implement optimal communication mechanisms based on the data set and the stakeholders involved.
6) Communicate complex analytical insights and business implications effectively.

About the team
The AOP (Analytics Operations and Programs) team's mission is to standardize BI and analytics capabilities and reduce repeat analytics/reporting/BI workload for operations across the IN, AU, BR, MX, SG, AE, EG and SA marketplaces. AOP is responsible for providing visibility into operations performance and implementing programs to improve network efficiency and defect reduction. The team has a diverse mix of strong engineers, analysts and scientists who champion customer obsession. We enable operations to make data-driven decisions by developing near real-time dashboards, self-serve dive-deep capabilities and advanced analytics capabilities. We identify and implement data-driven metric improvement programs in collaboration (co-owning) with operations teams.

We are open to hiring candidates to work out of one of the following locations: Hyderabad, TS, IND

BASIC QUALIFICATIONS
- 2+ years of experience analyzing and interpreting data with Redshift, Oracle, NoSQL, etc.
- Experience with data visualization using Tableau, QuickSight, or similar tools
- Experience building and maintaining basic data artifacts (e.g. ETL, data models, queries)
- Experience with a scripting language (e.g. Python, Java, or R)
- 2+ years of experience in ecommerce, transportation, finance or a related analytical field

PREFERRED QUALIFICATIONS
- Master's degree or advanced technical degree
- Knowledge of data modeling and data pipeline design
- Experience with statistical analysis and correlation analysis
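The responsibilities above call out efficient query development with window functions, which compute per-partition results in a single pass instead of a self-join plus client-side post-processing. A small, runnable illustration follows; SQLite stands in for Redshift, and the station names and package counts are invented:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE deliveries (station TEXT, day TEXT, packages INTEGER)")
conn.executemany(
    "INSERT INTO deliveries VALUES (?, ?, ?)",
    [("HYD1", "2024-01-01", 120), ("HYD1", "2024-01-02", 150),
     ("HYD2", "2024-01-01", 90),  ("HYD2", "2024-01-02", 80)],
)

# RANK() OVER a per-station partition finds each station's busiest day
# in one query, with no self-join and no post-processing in the client.
rows = conn.execute("""
    SELECT station, day, packages,
           RANK() OVER (PARTITION BY station ORDER BY packages DESC) AS rnk
    FROM deliveries
""").fetchall()
best = [(station, day) for (station, day, packages, rnk) in rows if rnk == 1]
```

The same `RANK() OVER (PARTITION BY … ORDER BY …)` form works unchanged on Redshift, which is presumably where queries like this would actually run.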
Software Development Engineer, Global Logistics
Amazon, Hyderabad, Any
BASIC QUALIFICATIONS
- 5+ years of non-internship professional software development experience
- 5+ years of programming experience with at least one software programming language
- 5+ years of experience leading design or architecture (design patterns, reliability and scaling) of new and existing systems
- Experience as a mentor, tech lead or leader of an engineering team

DESCRIPTION
At Amazon we are working to be the most customer-centric company on earth. Towards this, Amazon Global Logistics (AGL) is working to enable the largest selection of inventory on earth for our customers. AGL's vision is to build Amazon's global logistics pipeline for any supplier to supply to any location in the world. AGL offers intercontinental logistics services to selling partners, enabling them to move their product from an origin country to Amazon fulfillment centers. AGL is also responsible for exports from fulfillment centers to other countries outside of the fulfillment network. AGL enables global expansion for selling partners with the highest standard of quality, accountability, and transparency.

The AGL team is building cross-border logistics solutions that eliminate borders between buyers and sellers. These solutions enable third-party sellers to import or export their products across borders, thereby expanding their business into new markets and getting access to customers around the world. Amazon is driving this multi-year vision to create new logistical benchmarks and "firsts" through disruptive innovation.

We are looking for highly technical, passionate, self-motivated and talented Software Development Engineers who love solving tough and challenging problems. As a Software Development Engineer, you would build a complex architecture that offers thousands of sellers the ability to ship millions of packages over domestic freight and international freight (air, ocean) to bring selection closer to customers across the world. You would solve complex problems for sellers, shippers and trade agents around international shipment management, customs compliance, shipment tracking, seller and shipper identity, financial automation, and small parcel consolidation/de-consolidation, with a strong focus on offering a cost-effective, reliable and high-quality solution for international shipping. You will work on wide-ranging business functions and technology stacks that need in-depth knowledge. You will be part of a dynamic group of engineers where flexibility, teamwork, and creative thinking are highly valued. The ability to build productive relationships with colleagues at remote sites around the world will make you successful in this role. If you are excited by the opportunity to work with a high-performing engineering team on the design and development of this AGL solution, this is the right career move for you! We operate as a startup within Amazon, providing great flexibility in product and technology direction. Do you want to be part of our innovative team?

We are open to hiring candidates to work out of one of the following locations: Hyderabad, TS, IND

PREFERRED QUALIFICATIONS
- 5+ years of experience with the full software development life cycle, including coding standards, code reviews, source control management, build processes, testing, and operations
- Bachelor's degree in computer science or equivalent

Date posted: 04/16/2024 10:25 PM
Staff Analytics Engineer ( Data Platform)
Discovery, Inc. (Formerly Scripps Networks Interactive), Hyderabad, Any
Every great story has a new beginning, and yours starts here. Welcome to Warner Bros. Discovery... the stuff dreams are made of. Who We Are... When we say, "the stuff dreams are made of," we're not just referring to the world of wizards, dragons and superheroes, or even to the wonders of Planet Earth. Behind WBD's vast portfolio of iconic content and beloved brands, are the storytellers bringing our characters to life, the creators bringing them to your living rooms and the dreamers creating what's next...From brilliant creatives, to technology trailblazers, across the globe, WBD offers career defining opportunities, thoughtfully curated benefits, and the tools to explore and grow into your best selves. Here you are supported, here you are celebrated, here you can thrive. Staff Analytics Engineer - Hyderabad, India . About Warner Bros. Discovery: Warner Bros. Discovery, a premier global media and entertainment company, offers audiences the world's most differentiated and complete portfolio of content, brands and franchises across television, film, streaming and gaming. The new company combines Warner Media's premium entertainment, sports and news assets with Discovery's leading non-fiction and international entertainment and sports businesses.For more information, please visit www.wbd.com . Meet our team: The Data & Analytics organization is at the forefront of developing and maintaining frameworks, tools, and data products vital to WBD, including flagship streaming product Max and non-streaming products such as Films Group, Sports, News and overall WBD eco-system. Our mission is to foster unified analytics and drive data-driven use cases by leveraging a robust multi-tenant platform and semantic layer. 
We are committed to delivering innovative solutions that empower teams across the company to catalyze subscriber growth, amplify engagement, and execute timely, informed decisions, ensuring our continued success in an ever-evolving digital landscape.

Roles & Responsibilities:
As a Staff Analytics Engineer, you will lead data pipeline, data strategy, and data visualization efforts for the Data & Analytics organization at Max. You're an engineer who not only understands how to use big data to answer complex business questions but also how to design semantic layers to best support self-service vehicles. You will manage projects from requirements gathering through planning to implementation of full-stack data solutions (pipelines to data tables to visualizations). You will work closely with cross-functional partners to ensure that business logic is properly represented in the semantic layer and production environments, where it can be used by the wider Product Analytics team to drive business insights and strategy.
- Design and implement data models that support flexible querying and data visualization.
- Partner with Product stakeholders to understand business questions and build out advanced analytical solutions.
- Advance automation efforts that help the team spend less time manipulating and validating data and more time analyzing it.
- Build frameworks that multiply the productivity of the team and are intuitive for other data teams to leverage.
- Participate in the creation and support of analytics development standards and best practices.
- Create systematic solutions for solving data anomalies: identification, alerting, and root-cause analysis.
- Work proactively with stakeholders to ready data solutions for new product and/or feature releases, with a keen eye for uncovering and troubleshooting any data quality issues or nuances.
- Identify and explore new opportunities through creative analytical and engineering methods.
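The responsibilities above mention building systematic solutions for identifying and alerting on data anomalies. As a rough, hypothetical sketch of the idea (the function, threshold, and sample data are invented for illustration, not taken from any WBD system), a simple z-score check over a daily record count might look like:

```python
from statistics import mean, stdev

def detect_anomalies(daily_counts, z_threshold=2.0):
    """Flag indices of days whose record count deviates from the mean
    by more than z_threshold standard deviations (illustrative only)."""
    mu = mean(daily_counts)
    sigma = stdev(daily_counts)
    if sigma == 0:
        return []  # all days identical: nothing to flag
    return [i for i, c in enumerate(daily_counts)
            if abs(c - mu) / sigma > z_threshold]

# Day 5 delivered five times the usual volume -- likely a duplicate load.
counts = [100, 102, 98, 101, 99, 500, 100]
anomalies = detect_anomalies(counts)  # [5]
```

In practice a check like this would run inside the pipeline after each load and page the on-call engineer when it fires; more robust variants use median-based statistics so a single outlier does not inflate the threshold.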
What to Bring:
- Bachelor's degree, MS or greater in a quantitative field of study (Computer/Data Science, Engineering, Mathematics, Statistics, etc.)
- 8-12 years of relevant experience in business intelligence/data engineering
- Expertise in writing SQL (clean, fast code is a must) and in data-warehousing concepts such as star schemas, slowly changing dimensions, ELT/ETL, and MPP databases
- Experience in transforming flawed/changing data into consistent, trustworthy datasets, and in developing DAGs to batch-process millions of records
- Experience with general-purpose programming (e.g. Python, Java, Go), dealing with a variety of data structures, algorithms, and serialization formats
- Experience with big-data technologies (e.g. Spark, Kafka, Hive)
- Advanced ability to build reports and dashboards with BI tools (such as Looker and Tableau)
- Experience with analytics tools such as Athena, Redshift/BigQuery, Splunk, etc.
- Proficiency with Git (or similar version control) and CI/CD best practices
- Experience in managing workflows using Agile practices
- Ability to write clear, concise documentation and to communicate generally with a high degree of precision
- Ability to solve ambiguous problems independently
- Ability to manage multiple projects and time constraints simultaneously
- Care for the quality of the input data and how the processed data is ultimately interpreted and used
- Experience with digital products, streaming services, or subscription products is preferred
- Strong written and verbal communication skills

Characteristics & Traits:
- Naturally inquisitive, critical thinker, proactive problem-solver, and detail-oriented
- Positive attitude and an open mind
- Strong organizational skills with the ability to act independently and responsibly
- Self-starter, comfortable initiating projects from design to execution with minimal supervision
- Ability to manage and balance multiple (and sometimes competing) priorities in a fast-paced, complex business environment, managing time effectively to consistently meet deadlines
- Team player and relationship builder

What We Offer:
- A great place to work
- Equal opportunity employer
- Fast-track growth opportunities

How We Get Things Done...
This last bit is probably the most important! Here at WBD, our guiding principles are the core values by which we operate and are central to how we get things done. You can find them at www.wbd.com/guiding-principles/ along with some insights from the team on what they mean and how they show up in their day-to-day. We hope they resonate with you, and we look forward to discussing them during your interview.

The Legal Bits...
Warner Bros. Discovery embraces the opportunity to build a workforce that reflects the diversity of our society and the world around us. Being an equal opportunity employer means that we take seriously our responsibility to consider qualified candidates on the basis of merit, regardless of sex, gender identity, ethnicity, age, sexual orientation, religion or belief, marital status, pregnancy, parenthood, disability or any other category protected by law. If you're a qualified candidate and you require adjustments or accommodations to search for a job opening or apply for a position, please contact us at [email protected].

Date posted: 04/18/2024 06:53 AM
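The qualifications in the posting above lean heavily on SQL and data-warehousing concepts such as star schemas. As a minimal, hypothetical illustration (the table and column names are invented, not from any WBD system), a star schema keeps measures in a fact table keyed to small descriptive dimension tables:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension table: one row per title, carrying descriptive attributes.
cur.execute("CREATE TABLE dim_title (title_id INTEGER PRIMARY KEY, name TEXT, genre TEXT)")
# Fact table: one row per viewing event, keyed to the dimension.
cur.execute("CREATE TABLE fact_views (title_id INTEGER, minutes_watched INTEGER)")

cur.executemany("INSERT INTO dim_title VALUES (?, ?, ?)",
                [(1, "Show A", "Drama"), (2, "Show B", "Comedy")])
cur.executemany("INSERT INTO fact_views VALUES (?, ?)",
                [(1, 30), (1, 45), (2, 10)])

# Typical star-schema query: join the fact to the dimension and aggregate.
cur.execute("""
    SELECT d.genre, SUM(f.minutes_watched)
    FROM fact_views f
    JOIN dim_title d ON d.title_id = f.title_id
    GROUP BY d.genre
    ORDER BY d.genre
""")
rows = cur.fetchall()  # [('Comedy', 10), ('Drama', 75)]
```

The same shape scales to MPP warehouses like Redshift or BigQuery: facts grow to billions of rows while dimensions stay small enough to join cheaply.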
Data Engineer, Foundational People Data Services, Amazon
Amazon, Hyderabad, Any
BASIC QUALIFICATIONS
- 1+ years of data engineering experience
- Experience with SQL
- Experience with data modeling, warehousing and building ETL pipelines
- Experience with one or more scripting languages (e.g., Python, KornShell)
- Experience with Java basics

DESCRIPTION
Interested in working with the latest technology in the cloud computing space? Amazon is looking for talented and enthusiastic application engineers to join the Foundational People Data Services organization. Our team is responsible for integrating the HR system with 100+ downstream systems. This job requires you to hit the ground running, and your ability to learn quickly will define your success. Expertise in Core Java, SQL (Oracle/Presto), and AWS services such as Athena, S3, Batch, Lambda, DynamoDB and CloudWatch is a must. If you're thrilled by the idea of ownership, passionate about HR data and comfortable with the pace of a company shipping software in cycles measured in weeks, not years, we would love to connect with you.

Key job responsibilities
- Manage AWS resources including EC2, RDS, Redshift, Kinesis, EMR, Lambda, Glue, Apache Airflow, etc.
- Build and deliver high-quality data architecture and pipelines to support business analysts, data scientists, and customer reporting needs.
- Interface with other technology teams to extract, transform, and load data from a wide variety of data sources.
- Continually improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Provide support for incoming tickets, including extensive troubleshooting tasks, with responsibilities covering multiple products, features and services.

A day in the life
No two days are the same in FPDS. In the Payroll Data Providers team, the nature of the work we do and the constantly shifting threat landscape mean that some days you'll be developing a data pipeline that ensures efficient delivery of data to partner teams, and other days you'll be developing a data platform that is capable of
processing large amounts of data in a short time.

Work/Life Harmony: Our team places high value on work-life harmony and operates in a hybrid work environment in which engineers spend 3 days working in the office.

On-Call Responsibility: This position involves on-call responsibility. We work as a team to root-cause issues and ensure our systems are resilient and fault-tolerant.

About the team
The PDP team is accountable for all Payroll batch outbound and real-time integrations that enable New Country eXpansions (NCX), In-Country eXpansions (ICX), vendor transitions, payroll compliance and legal needs, and Payroll Operations and vendor-specific requirement changes. The PDP team builds continuous-improvement initiatives around real-time integrations and the detection and reduction of defects and anomalies directly within the Batch Integration Framework, before the data reaches our customers. PDP also drives new operational-excellence features and mechanisms to improve the existing framework and processes, which enhances the customer experience and reduces manual errors and corrections of errors (COEs). PDP's work is a business-critical function that ensures payroll is processed for nearly 1.5MM employees in 63 countries across the world and pays Amazonians accurately and on time. Currently, payroll integrations produce nearly 300 million records per day. The transfer of data is required to perform payroll, benefits, compensation, security, hiring, and separation activities for Amazonians.

We are open to hiring candidates to work out of one of the following locations: Hyderabad, TS, IND

PREFERRED QUALIFICATIONS
- Experience with AWS technologies like Redshift, S3, AWS Glue, EMR, Kinesis, Firehose, Lambda, and IAM roles and permissions

Date posted: 04/22/2024 10:48 PM
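The basic qualifications for this role call for experience building ETL pipelines. As a minimal, hypothetical sketch of the pattern (the stage names and the sample records are invented for illustration, not drawn from Amazon's systems), an extract-transform-load flow separates each stage into its own function so stages can be tested and rerun independently:

```python
def extract():
    # In practice this stage would read from a source system (e.g. S3
    # or an upstream API); here we return inline sample records.
    return [
        {"employee_id": "1", "country": "in", "salary": "1000"},
        {"employee_id": "2", "country": "us", "salary": "2000"},
        {"employee_id": "3", "country": "in", "salary": "1500"},
    ]

def transform(records):
    # Normalize types and values before loading.
    return [
        {"employee_id": int(r["employee_id"]),
         "country": r["country"].upper(),
         "salary": int(r["salary"])}
        for r in records
    ]

def load(records, warehouse):
    # A real pipeline would write to a warehouse such as Redshift;
    # here the "warehouse" is just a dict keyed by employee_id, so a
    # rerun overwrites rather than duplicates (idempotent load).
    for r in records:
        warehouse[r["employee_id"]] = r
    return warehouse

warehouse = load(transform(extract()), {})
```

In an orchestrator such as Apache Airflow, each of these functions would typically become one task in a DAG, with retries and alerting configured per stage.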