Mastercard is looking for a Senior Software Engineer in Vancouver, Canada. If you meet the requirements below, you are eligible to apply via Digital Marketing Community.
Job Responsibilities:
- Collaborate on the design of the next implementation of Mastercard's secure, global data and insights architecture.
- Own medium-to-large size data engineering projects.
- Find, ingest, and incorporate new sources of real-time, streaming, batch, and API-based data into our platform to enhance the insights we derive from running tests and expand the ways and properties on which we can test.
- Aid business in utilizing data-driven insights to drive growth and transformation.
- Create and maintain data processing workflows feeding Mastercard analytics domains.
- Facilitate reliable integrations with internal systems and third-party APIs as needed.
- Support data analysts as needed, advising on data definitions and helping them derive meaning from complex datasets.
- Prototype new algorithms; experiment, evaluate, and deliver actionable insights.
- Experiment with new tools to streamline the development, testing, deployment, and running of our data pipelines.
- Work within cross-functional agile teams to drive projects through the full development cycle.
- Help the team improve through the use of data engineering best practices.
- Collaborate with other data engineering teams to improve the data engineering ecosystem and talent within Mastercard.
- Creatively solve problems when facing constraints, whether the constraint is developer headcount, data quality or quantity, compute power, storage capacity, or simply time.
- Maintain awareness of relevant technical and product trends through self-study, training classes, and job shadowing.
Job Requirements:
- Master’s degree (or foreign equivalent) in computer science, data science, or a related field.
- Minimum 3 years of experience as a software engineer or a similar role.
- Proven experience in building and deploying production-level data-driven applications and data processing workflows/pipelines.
- Proficiency with application development frameworks (Java/Scala, Spring).
- Familiarity with data processing and storage frameworks like Hadoop, Spark, Kafka.
- Expert knowledge of implementing REST services with support for JSON, XML, and other formats.
- Deep knowledge of performance tuning of database schemas, databases, SQL, ETL jobs, and related scripts.
- Solid familiarity with batch processing and workflow tools such as NiFi.
- Demonstrated understanding of modern BI and data exploration tools.
- Proven experience in developing integrated cloud applications with services like Azure or GCP.
- Strong skills in implementing machine learning systems at scale in Java, Scala, or Python and delivering analytics across all phases: data ingestion, feature engineering, modeling, tuning, evaluation, monitoring, and presentation.