Software Engineer (Data Engineering)
Limerick, Ireland

The Opportunity:

Through market-leading credential-driven privileges and innovative payments, Transact enables every aspect of student life across campus. We partner with institutions to deliver a mobile-centric, personalized student and family experience both on and off campus. We are the only true enterprise-class cloud platform, enabling mission-critical capabilities that support student success by powering every aspect of campus life, along with data for meaningful institutional insight.

These mission-critical capabilities include integrated payments and personalized payment plans for tuition and fees, credential-driven transactions for comprehensive dining and retail, uniquely configurable security-driven privilege management, and automated class attendance and campus events. Transact solutions integrate easily with campus systems and external partners, enabling an extensive and open ecosystem that extends existing institutional investment, accelerates innovation, and delivers a frictionless, personalized student experience.

Transact is headquartered in Phoenix, AZ and has served the education community for over thirty-five years. Visit transactcampus.com to learn more.

Position Responsibilities

Transact, the global leader in higher education learning software and campus card systems, is searching for a Software Engineer to join its Data Analytics team. Based in Limerick, Ireland, you will:

  • Work in a dedicated data reporting and analytics team currently building a green-field platform to produce data-driven insights for Transact Campus and our clients.
  • Analyze, interpret and orchestrate complex data across disparate sources comprising unstructured, semi-structured and structured datasets in streaming and batch modes.
  • Support the business by designing and developing real-time data pipelines using the latest Databricks and Delta Lake Azure cloud technologies.
  • Work with data consumers (reporting, analysis, or data science) to provide metrics that meet their needs.
  • Contribute to standards for data producers (application or business teams) streaming data into the data lake.
  • Test commercial software products using both manual and automated testing processes.
  • Support the application life cycle during quality assurance, user acceptance testing, and post-release.
  • Comply with and contribute to consistent development guidelines (coding, change control, build, versioning).
  • Participate in peer code reviews.

Required Skills:

  • A bachelor’s degree in Computer Science, IT, or a related field, or equivalent work experience, preferably with a focus on data analytics
  • 4+ years of enterprise-level data engineering experience
  • Experience with big data workloads
  • Experience with data lakes and scale-out processing on them
  • Knowledge of relational database design and best practices

Required Technical Skills:

  • Hands-on experience designing and developing Spark data pipelines
  • Strong SQL skills
  • Strong Python skills
  • Solid understanding of the evolving data landscape and cloud-based big data workloads
  • Experience with ETL/ELT patterns, preferably using Databricks jobs
  • Excellent technical documentation skills for describing development components
  • Experience with data lakes (HDFS, Azure Data Lake or AWS S3)
  • Experience with source code management systems such as Git/TFS/SVN
  • Experience in working in an Agile team (Scrum, XP, Kanban)
  • Ability to present ideas and insights to business stakeholders.
  • Fluency in written and spoken English.

Preferred Skills:

  • Good understanding of Azure data services (Azure Databricks, Azure Data Factory, Azure Data Lake Storage Gen2).
  • Experience with Databricks Delta Lake.
  • Experience with Spark Structured Streaming.
  • Experience with NoSQL databases.
  • Experience with Infrastructure as Code technologies such as Terraform or Azure Resource Manager.
  • Experience in Data Science and ML methodologies.
  • Understanding of Azure services for streaming data (Event Hubs, Event Grid).
  • Understanding of data strategy, including data governance and data management.

Bonus Skills:

  • Unity Catalog.
  • Delta Sharing.
  • Delta Live Tables.

Why Join Us?

  • Opportunity to work with cutting-edge data technologies and platforms.
  • Collaborative and supportive work environment.
  • Ongoing professional development and training opportunities.
  • Hybrid Working.
  • Regular social, sporting and community events.
  • Benefits include private health insurance, dental insurance, matched pension contributions, and 25 days of annual leave.

This job description is not designed to contain a comprehensive listing of activities, duties, or responsibilities that are required. Nothing in this job description restricts management's right to assign or reassign duties and responsibilities at any time.

Transact is an equal employment opportunity/affirmative action employer and considers qualified applicants for employment without regard to race, gender, age, color, religion, national origin, marital status, disability, sexual orientation, or any other protected factor.