Role Overview
Lead the evolution of enterprise data integration by transitioning legacy ETL processes from Informatica PowerCenter to modern cloud-based platforms. This position plays a central role in designing, building, and maintaining scalable data pipelines across on-premises and public cloud environments, ensuring data accuracy, system reliability, and alignment with regulatory standards.
Key Responsibilities
- Drive the migration and enhancement of data integration workflows from Informatica PowerCenter to cloud ETL platforms
- Take full ownership of data interface applications across production and non-production systems
- Design, develop, and optimize ETL mappings using transformations such as Joiner, Lookup, Router, Aggregator, and Update Strategy
- Conduct unit and integration testing, with complete accountability for validation and documentation
- Diagnose and resolve performance bottlenecks and system failures in production environments
- Monitor scheduled jobs, troubleshoot failures, and implement corrective actions within SLA guidelines
- Manage incident tickets and service requests, ensuring timely resolution and closure
- Support minor enhancements and configuration changes in response to business needs
- Collaborate with application, database, and system administrators to investigate and resolve data issues
- Participate in on-call rotations, including weekend and off-hours support when required
- Automate routine data operations to improve efficiency and reduce manual intervention
- Lead requirement-gathering sessions with business stakeholders and translate needs into technical designs
- Work closely with data architects and analysts to resolve source system discrepancies and define data models
- Develop and validate ETL routines to enforce business rules and maintain data integrity
- Investigate data quality issues, assess impact, and implement corrective solutions
- Interpret SQL and PL/SQL logic and convert business rules into Informatica mappings
- Process data from diverse sources including Oracle, MS SQL, SAP HANA, Salesforce, and flat files
- Apply performance tuning techniques using Informatica partitioning, staging tables, and pushdown optimization
- Lead data profiling efforts using Informatica Data Quality (IDQ), including rule creation and scorecard development
- Extract and process JSON data using IDQ data processor transformations
- Write SQL and shell scripts to implement and automate business logic
- Validate data migration outputs and ensure accuracy across environments
- Support deployment of fixes and new functionality into production systems
- Utilize scheduling tools such as TIDAL and Autosys for job orchestration
- Work with Teradata load utilities including FastLoad (FLOAD), MultiLoad (MLOAD), and FastExport, leveraging pushdown optimization
- Support compliance initiatives related to SOX, PCI, GDPR, and export controls
- Assess risk in technical and operational decisions, ensuring adherence to policies and ethical standards
- Lead knowledge-sharing sessions on emerging technologies and mentor team members on ETL best practices
Required Qualifications
- Minimum of 5 years of hands-on experience with Informatica PowerCenter (versions 9.x/10.x), Informatica Intelligent Cloud Services (IICS)/Intelligent Data Management Cloud (IDMC), and Informatica Data Quality (IDQ)
- Proven experience with Oracle, Microsoft SQL Server, and Salesforce as data sources or targets, along with client tools such as TOAD and SQL Developer
- At least 3 years of experience with job scheduling and orchestration tools such as Control-M or Redwood
- Five or more years in IT with a focus on data warehousing, including analysis, design, development, testing, and implementation
- Strong problem-solving skills with expertise in debugging and performance tuning
- Deep understanding of ETL architecture and scalable data integration patterns
- Must be a U.S. citizen or permanent resident, or hold protected status under U.S. export control regulations
Preferred Qualifications
- Bachelor’s degree in computer science or a related field, or equivalent professional experience
- Familiarity with the full lifecycle of data warehouse projects, from requirements gathering through implementation and support