
IBM Change Data Capture (NCS/Job/2170)
For a French MNC IT Company
7 - 12 Years
Full Time
Up to 30 Days
Up to 22 LPA
1 Position(s)
Bangalore / Bengaluru
Posted By : Nilasu Consulting Services Pvt Ltd
Posted 18 Days Ago
Job Description
IBM CDC
Location: Bangalore
The position is 24x7, and the candidate will need to work in shifts.
Experience: 7-12 years
Notice Period: 0-30 days
Job Overview:
We are seeking an experienced IBM Change Data Capture (CDC) Administrator to manage and maintain a high-availability CDC deployment with Db2 as the source and Google BigQuery as the target. The ideal candidate should have strong expertise in CDC replication, database administration, cloud technologies (GCP), and high-availability setups.
Key Responsibilities:
- Design, implement, and manage IBM CDC (IIDR) replication from Db2 to Google BigQuery.
- Ensure high availability (HA) and disaster recovery (DR) configurations for CDC.
- Monitor and troubleshoot CDC latency, performance, and replication failures.
- Optimize Db2 log-based CDC configurations for efficient data capture.
- Manage and optimize data ingestion into Google BigQuery, ensuring schema compatibility.
- Implement security best practices, including data encryption and access control.
- Automate monitoring, alerting, and failover processes using Shell/Python scripting.
- Work with networking teams to ensure seamless and secure data flow between on-prem Db2 and GCP.
- Perform capacity planning, performance tuning, and cost optimization for BigQuery.
- Document procedures, best practices, and troubleshooting guides.
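The monitoring and alerting automation described above can be sketched roughly as follows. This is a minimal illustration only: the subscription names, the latency threshold, and the idea of feeding the check from an already-collected dict of latency readings (e.g. parsed from IIDR monitoring output) are all assumptions for the example, not part of the actual IIDR tooling.

```python
#!/usr/bin/env python3
"""Illustrative sketch of a CDC replication-latency alerting check."""

LATENCY_THRESHOLD_SECS = 300  # assumed alerting threshold (5 minutes)


def check_latency(latencies: dict[str, float],
                  threshold: float = LATENCY_THRESHOLD_SECS) -> list[str]:
    """Return human-readable alerts for subscriptions over the threshold.

    `latencies` maps subscription name -> current replication latency
    in seconds, assumed to have been collected by a separate step.
    """
    alerts = []
    for subscription, latency in sorted(latencies.items()):
        if latency > threshold:
            alerts.append(
                f"ALERT: subscription '{subscription}' latency "
                f"{latency:.0f}s exceeds {threshold:.0f}s"
            )
    return alerts


if __name__ == "__main__":
    # Hypothetical sample readings for two Db2 -> BigQuery subscriptions.
    sample = {"DB2_TO_BQ_ORDERS": 42.0, "DB2_TO_BQ_INVENTORY": 910.0}
    for line in check_latency(sample):
        print(line)
```

In practice a check like this would be scheduled (cron, Cloud Scheduler) and the alert lines routed to the team's paging or chat tooling rather than printed.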
Required Skills & Qualifications:
- IBM InfoSphere Data Replication (IIDR) / CDC Administration
- Strong Db2 database administration experience, including HADR, log-based replication, and SQL tuning
- Hands-on experience with Google BigQuery, including schema design and partitioning
- Cloud networking and security expertise (VPN, VPC, IAM roles in GCP)
- Proficiency in Shell scripting / Python for automation
- Experience working with high-availability and disaster recovery architectures
- Strong problem-solving and troubleshooting skills
- Knowledge of ETL, data pipelines, and real-time streaming architectures is a plus
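As a small illustration of the BigQuery partitioning skill listed above, the sketch below groups incoming rows by daily partition using BigQuery's YYYYMMDD partition-id convention. The daily granularity, the `ts` field name, and the function names are assumptions for the example; real ingestion would hand partitioned batches to a load or streaming-insert step.

```python
from collections import defaultdict
from datetime import datetime, timezone


def partition_key(event_ts: datetime) -> str:
    """Daily partition id in BigQuery's YYYYMMDD convention (UTC)."""
    return event_ts.astimezone(timezone.utc).strftime("%Y%m%d")


def group_by_partition(rows: list[dict]) -> dict[str, list[dict]]:
    """Group rows (dicts with a 'ts' datetime field) by daily partition."""
    partitions: dict[str, list[dict]] = defaultdict(list)
    for row in rows:
        partitions[partition_key(row["ts"])].append(row)
    return dict(partitions)
```

Batching by partition this way keeps loads aligned with the table's partitioning scheme, which is what enables partition pruning and the cost control mentioned in the responsibilities.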
Preferred Skills:
- Experience with Google Cloud Dataflow, Pub/Sub, or other GCP data services
- Familiarity with Terraform/Ansible for infrastructure automation
- Experience with monitoring tools such as Prometheus, Grafana, or Google Cloud Monitoring (formerly Stackdriver)
Education & Certifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field
- IBM Db2 certification or a Google Cloud certification such as Professional Data Engineer (preferred)