Role: Principal Data Science Engineer
Experience: 10+ Years
Location: Remote
Notice Period: 30 Days Max
Key Responsibilities:
• Model Development and Implementation:
  • Design, develop, and deploy advanced data science models and algorithms.
  • Ensure models are scalable, maintainable, and optimized for performance.
• Leadership and Mentorship:
  • Lead and mentor a team of data scientists and engineers, providing technical guidance and support.
  • Foster a culture of continuous learning and innovation within the team.
• Collaboration:
  • Work closely with product managers, engineers, and business stakeholders to understand data requirements and deliver solutions.
  • Collaborate with data engineering teams to ensure seamless integration of data science models into production systems.
• Research and Innovation:
  • Stay current with the latest advancements in data science and machine learning.
  • Evaluate and implement new tools, techniques, and technologies to enhance our data science capabilities.
• Data Analysis and Insight Generation:
  • Conduct exploratory data analysis to uncover trends, patterns, and insights.
  • Translate complex data into actionable insights and recommendations for business stakeholders.
• Project Management:
  • Lead data science projects from conception to completion, ensuring timely delivery and high-quality results.
  • Develop project plans, timelines, and resource allocations.
Qualifications:
• Education and Experience:
  • Bachelor’s or Master’s degree in Computer Science, Statistics, Mathematics, or a related field.
  • 10+ years of experience in data science, with a proven track record of developing and deploying advanced models.
• Technical Skills:
  • Experience with Databricks.
  • Proficiency in programming languages: Python, PySpark.
  • Extensive experience with machine learning frameworks (e.g., TensorFlow, PyTorch, scikit-learn).
  • Strong knowledge of statistical analysis, data mining, and predictive modelling, including machine learning algorithms (supervised, unsupervised, and deep learning) and model evaluation and validation.
  • Experience with big data technologies (e.g., Hadoop, Spark).
  • Familiarity with cloud platforms (e.g., AWS, GCP, Azure) and their data services; AWS knowledge is a must.
  • Experience with Docker and Kubernetes.
  • Database experience, including SQL query writing and optimization.
• Soft Skills:
  • Excellent problem-solving and analytical skills.
  • Strong communication and interpersonal skills.
  • Ability to work effectively in a fast-paced, dynamic environment.
Job Type: Permanent
Pay: ₹3,000,000.00 – ₹5,000,000.00 per year
Experience:
• Total work: 10 years (Required)
• Machine learning frameworks: 3 years (Required)
• Machine learning algorithms: 3 years (Required)
• Big data technologies: 2 years (Required)
• Cloud platforms (e.g., AWS, GCP, Azure): 2 years (Required)
• AWS: 1 year (Required)
• Docker/Kubernetes: 3 years (Required)
• Database: 3 years (Required)
• Python: 2 years (Required)
• PySpark: 2 years (Required)
Work Location: Remote