Company: Tatvic
Employment type: Full Time
Salary: Not Disclosed
Experience: 3-6 years
Posted: 30+ days ago
Responsibilities of a Senior Data Scientist
Customer-facing responsibilities:
1. Communicate with customers to discover and understand the problem statement.
2. Design a solution that clearly links the problem statement, solution details, output, and success criteria, and shows how the solution's impact supports a specific business objective.
3. Design the solution:
a. Take a bird's-eye view of the platform to create an effective solution.
b. Feature engineering: identify the features that matter and ensure the logic for feature selection is transparent and explainable.
c. Model selection: choose among pre-trained models, AutoML, APIs, or individual algorithms and libraries for an optimal implementation.
d. Optimize the model to increase its effectiveness through proper data cleansing and feature-engineering refinements.
e. Deploy the model for batch or real-time predictions using methodologies such as MLOps.
f. Display or export the output to a visualization platform.
4. Build POCs that provide data insights to the customer at short notice.
5. Maintain and manage project execution trackers and documentation.
6. Honor commitments to the customer on deliverables, deadlines, and quality.
Innovation and asset-building responsibilities
1. Design and build reusable solutions that serve multiple customers.
2. Create clear documentation of the architecture, design concepts, and technical decisions in each project.
3. Conduct internal sessions to educate cross-team stakeholders and improve domain and solution literacy.
4. Maintain coding standards and build reusable code and libraries for future use, strengthening engineering at Tatvic.
5. Stay up to date with innovations in data science and their applications in domains relevant to Tatvic. Frequently perform POCs to gain hands-on experience with new technologies, including Google Cloud tools designed for data science applications.
6. Explore the use of data science in various business and web analytics applications.
Technical Skills:
1. Data Handling: Manage data from diverse sources, including structured tables, unstructured text, images, videos, and streaming/real-time data. For scalable data processing and analysis, use cloud services (preferably on Google Cloud) such as BigQuery, Vertex AI, and Cloud Storage.
2. Feature Engineering: Identify and select relevant features with transparent and explainable logic. Design new derived features to enhance model performance and enable deeper insights. Use techniques such as the Pearson correlation coefficient and SHAP values for correlation and feature-importance analysis.
3. Model Development:
a. Select and build models based on problem requirements, using pre-trained models, AutoML, or custom algorithms. Experience with linear, non-linear, time-series (RNNs, LSTMs), tree-based (XGBoost, LightGBM), and other foundational approaches.
b. Apply advanced modeling techniques such as CNNs for image processing, R-CNNs and YOLO for object detection, and retrieval-augmented generation (RAG) and LLM fine-tuning for text and search-related tasks.
c. Optimize models with hyperparameter tuning, Bayesian optimization, and appropriate evaluation strategies.
4. Model Evaluation:
Assess model performance using metrics suited for data type and problem type:
a. For categorical data: Precision, Recall, F1 Score, ROC-AUC, and Precision-Recall curves.
b. For numerical data: Metrics like Mean Absolute Error (MAE), Root Mean Square Error (RMSE), Mean Squared Error (MSE), and R-squared (R²).
5. Deployment & MLOps: Deploy models for batch or real-time predictions using MLOps frameworks, leveraging tools such as Vertex AI and Kubeflow for efficient, scalable deployment pipelines. Integrate outputs with visualization platforms to deliver actionable insights and drive decision-making.
6. Innovation: Stay current with trends in AI and data science, including LLMs, grounding techniques, and innovations in temporal and sequential data modeling. Regularly conduct POCs to experiment with emerging tools and technologies.
7. Code Practices & Engineering:
a. Write clean, maintainable, and scalable code following industry best practices. Adhere to version control (e.g., Git) for collaborative development and maintain coding standards.
b. Implement error handling, logging, and monitoring to ensure reliability in production systems.
c. Collaborate with other teams to integrate data science models into broader system architectures.
8. Performance Optimization: Optimize model and data processing pipelines for computational efficiency and scalability. Use parallel processing, distributed computing, and hardware accelerators (e.g., GPUs, TPUs) where applicable.
9. Documentation & Reusability: Maintain comprehensive technical documentation for all solutions. Design and build reusable assets to streamline future implementations.
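To make the feature-importance point in item 2 concrete, here is a minimal sketch of ranking features by absolute Pearson correlation with the target, using only NumPy (the feature names and synthetic data are invented for illustration; real work would add SHAP-based checks on a fitted model):

```python
import numpy as np

def rank_features_by_pearson(X, y, names):
    """Rank features by |Pearson correlation| with the target, descending."""
    scores = {}
    for j, name in enumerate(names):
        # np.corrcoef returns the 2x2 correlation matrix; [0, 1] is r(x_j, y)
        scores[name] = np.corrcoef(X[:, j], y)[0, 1]
    return sorted(scores.items(), key=lambda kv: abs(kv[1]), reverse=True)

# Synthetic example: one informative feature, one pure-noise feature.
rng = np.random.default_rng(0)
n = 200
signal = rng.normal(size=n)
noise = rng.normal(size=n)
X = np.column_stack([signal + 0.1 * rng.normal(size=n), noise])
y = 2.0 * signal + 0.05 * rng.normal(size=n)

ranking = rank_features_by_pearson(X, y, ["signal_feature", "noise_feature"])
```

A transparent ranking like this is easy to explain to stakeholders; Pearson only captures linear association, which is why the posting pairs it with SHAP values for model-based importance.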
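The evaluation metrics listed in item 4 can be sketched from their definitions in plain NumPy (illustrative only; in practice a library such as scikit-learn provides these):

```python
import numpy as np

def precision_recall_f1(y_true, y_pred):
    """Binary classification metrics from 0/1 true and predicted labels."""
    y_true, y_pred = np.asarray(y_true), np.asarray(y_pred)
    tp = np.sum((y_true == 1) & (y_pred == 1))  # true positives
    fp = np.sum((y_true == 0) & (y_pred == 1))  # false positives
    fn = np.sum((y_true == 1) & (y_pred == 0))  # false negatives
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return precision, recall, f1

def regression_metrics(y_true, y_pred):
    """MAE, RMSE, and R-squared for numerical predictions."""
    y_true, y_pred = np.asarray(y_true, float), np.asarray(y_pred, float)
    err = y_true - y_pred
    mae = np.mean(np.abs(err))
    rmse = np.sqrt(np.mean(err ** 2))
    ss_res = np.sum(err ** 2)
    ss_tot = np.sum((y_true - y_true.mean()) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    return mae, rmse, r2

p, r, f1 = precision_recall_f1([1, 0, 1, 1, 0], [1, 0, 0, 1, 1])
mae, rmse, r2 = regression_metrics([3.0, 5.0, 2.5], [2.5, 5.0, 3.0])
```

Matching the metric to the problem type, as item 4 stresses, matters: accuracy alone is misleading on imbalanced categorical data, and RMSE penalizes large errors more heavily than MAE.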
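For the production-reliability practices in item 7b, a minimal sketch of wrapping batch scoring with error handling and logging, using only the standard library (the model function here is a hypothetical stand-in for a real trained model):

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger("batch_predict")

def predict_one(record):
    """Hypothetical stand-in: real code would call a trained model here."""
    return record["x"] * 2.0

def batch_predict(records):
    """Score records one by one; log and skip failures instead of crashing."""
    results, failures = [], 0
    for i, rec in enumerate(records):
        try:
            results.append(predict_one(rec))
        except Exception:
            failures += 1
            # logger.exception records the full traceback for later debugging
            logger.exception("record %d failed, skipping", i)
    logger.info("scored %d records, %d failures", len(results), failures)
    return results, failures

preds, failed = batch_predict([{"x": 1.0}, {"bad": True}, {"x": 3.0}])
```

Isolating per-record failures this way keeps a long batch job alive while leaving a traceback in the logs for each bad record, which is the kind of monitoring hook item 7b asks for.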
Functional Competencies
1. Applying technical knowledge (SQL / NoSQL / ML / DL)
2. Python / libraries / packages / APIs / web services
3. Data analysis
4. Cloud and infra services / monitoring & debugging / orchestration frameworks
5. Solution/architecture design
Interview Process
Round 1: Aptitude Online Test
Round 2: Technical Interview
Round 3: Technical Online Test
Round 4: HR Round