
Senior Software Engineer

Mastercard

Software Engineering
Pune, Maharashtra, India
Posted 6+ months ago

Job Title:

Senior Software Engineer

Overview:

What is Data & Services?

The Data & Services Technology Organization is a key differentiator for Mastercard, providing the cutting-edge services that help our customers grow. Focused on thinking big and scaling fast around the globe, this agile team is responsible for end-to-end solutions for a diverse global customer base. Centered on data-driven technologies and innovation, these services include payments-focused consulting, loyalty and marketing programs, business experimentation, and data-driven information and risk management services.

What role do we play in the modern world? Are we an enabler of purchases or a facilitator of transactions? We play a much larger role by enabling those who have no access to financial systems. We have the technology, people, and brand to serve modern society. Today, we are a global tech company that connects everyone to endless, priceless possibilities.

Job Description Summary

The Technology Foundations, Data and Transformation Solutions (DTS) team provides ETL solutions and data to BI applications, products, and services. We are looking for an experienced Big Data Developer who loves solving complex problems across a full spectrum of technologies. The person in this role will develop and implement data pipelines connecting source systems and downstream consumers.

Role

• Quick to learn new technologies and frameworks and to implement them per project requirements while adhering to quality standards
• Experience in all phases of the data warehouse development lifecycle, from requirements gathering through testing, implementation, and support
• Adept at analyzing information system needs, evaluating end-user requirements, custom-designing solutions, and troubleshooting information systems
• Develop and implement data pipelines that extract, transform, and load data into information products that help the organization reach its strategic goals
• Strong technical, analytical, and problem-solving skills
• Work on ingesting, storing, processing, and analyzing large data sets
• Investigate and analyze alternative approaches to data storage and processing to ensure the most streamlined solutions are implemented
• Monitor daily job-failure alerts and resolve the issues identified
• Ability to write algorithms that apply varying business rules
• Knowledge of data warehousing principles and concepts, including modifying existing data warehouse structures
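
The extract-transform-load duty above follows a common pattern. A minimal sketch in plain Python is below; the function names, record fields, and sample data are hypothetical, and a production pipeline on this team would run on Spark, NiFi, or similar rather than in-memory lists.

```python
# Illustrative ETL sketch (hypothetical names and data, not Mastercard's pipeline).

def extract(raw_rows):
    """Parse raw CSV-style lines into records (the 'E' step)."""
    records = []
    for line in raw_rows:
        txn_id, amount, currency = line.strip().split(",")
        records.append({"txn_id": txn_id,
                        "amount": float(amount),
                        "currency": currency})
    return records

def transform(records):
    """Filter and enrich records (the 'T' step)."""
    return [
        {**r, "amount_cents": int(r["amount"] * 100)}
        for r in records
        if r["amount"] > 0  # drop negative rows for this sketch
    ]

def load(records, target):
    """Append records to a downstream store (the 'L' step)."""
    target.extend(records)
    return len(records)

warehouse = []
raw = ["t1,10.50,USD", "t2,-3.00,USD", "t3,7.25,EUR"]
loaded = load(transform(extract(raw)), warehouse)  # 2 rows survive the filter
```

The same three stages map directly onto a Spark job: `extract` becomes a DataFrame read, `transform` a chain of DataFrame operations, and `load` a write to the warehouse.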

All about you

• Must have experience deploying and working with big data technologies like Hadoop, Spark, and Sqoop
• Experience with streaming frameworks like Kafka and Axon
• Experience designing and building ETL pipeline using NiFi
• Highly proficient in OO programming (Python, PySpark, Java, and Scala)
• Experience with the Hadoop ecosystem (HDFS, YARN, MapReduce, Spark, Hive, Impala)
• Proficiency in Linux, the Unix command line, Unix shell scripting, SQL, and at least one scripting language
• Experience designing and implementing large, scalable distributed systems
• Ability to debug production issues using standard command line tools
• Create design documentation and maintain process documents
• Ability to debug Hadoop / Hive job failures
• Ability to use Cloudera in administering Hadoop

Optional:
Experience with cloud technologies such as Databricks, AWS, Azure, and GCP.