Cloud Data Engineer Job in Chennai at LatentView

Cloud Data Engineer

LatentView

  • Full Time Job
  • Salary: Not Disclosed
  • 2-5 years experience
  • Posted 30+ days ago

Location
  • Chennai
Skills Required
  • Agile Methodology
  • Scrum Methodologies
  • Kafka
  • Airflow
About this Job

LatentView is hiring for the role of Cloud Data Engineer!

Responsibilities of the Candidate:

  • Assist in designing scalable, secure, and cost-efficient data architectures using GCP-native services.
  • Support solution architects in preparing technical proposals and RFP responses.
  • Collaborate with business and technology stakeholders to define end-to-end data solutions.
  • Design and maintain conceptual, logical, and physical data models for analytics and reporting.
  • Build and manage ETL/ELT pipelines using GCP services like Dataflow, Dataproc, and BigQuery.
  • Ensure data governance, lineage, and quality controls are integrated into solutions.
  • Optimize data pipelines, queries, and storage layers for scalability and performance.
  • Fine-tune BigQuery cost performance, partitioning, and clustering strategies.
  • Implement monitoring and alerting for proactive performance management.
  • Mentor junior engineers, conduct code reviews, and foster a collaborative team culture.
  • Support pre-sales teams by preparing technical inputs, solution overviews, cost estimations, and POCs for RFPs.
  • Work with architects and bid teams to differentiate our technical capabilities in client proposals.

Requirements:

  • Minimum 2 years of hands-on experience with GCP data engineering services, including BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Composer, and Looker.
  • Strong expertise in SQL and Python / PySpark for data transformations.
  • Hands-on experience in ETL/ELT pipeline development and data modelling.
  • Familiarity with CI/CD for data pipelines using tools like Cloud Build, Git, Jenkins.
  • Exposure to RFP processes — preparing technical responses, estimations, and solution diagrams.
  • Experience with real-time streaming frameworks (e.g., Pub/Sub, Kafka, Spark Streaming).
  • Knowledge of data lakehouse architectures and open-source tools like Apache Iceberg, Delta Lake, or Hudi.
  • Understanding of data security, IAM roles, DLP policies, and compliance frameworks in GCP.
  • Familiarity with modern orchestration tools like Airflow, dbt, or Dagster.
  • Hands-on experience working in Agile/Scrum environments.
Eligible Degrees
  • MBA / All Courses
  • Bachelor of Technology/Engineering / All Courses
  • Master of Technology / All Courses
  • Bachelor of Arts / All Courses
  • Bachelor of Science / All Courses
  • +96 More

Who can apply
Work Experience: 2-5 years
Eligible Graduation Years: 2023, 2022, 2021, 2020, 2019
Documents Required

1. Resume

2. ID Proof (e.g. Aadhar Card, PAN Card, etc.)

About LatentView