Job Title: Senior Data Engineer
Location: Hybrid – Phoenix, AZ / Raleigh, NC (4 days per week)
Experience Level: 10–15 Years
Employment Type: 12+ Month Contract
Pay Rate: $75–$90/hr
Job Overview:
We are seeking a highly experienced Senior Data Engineer to join our growing data team. The ideal candidate will have 10–15 years of experience in data engineering with a strong focus on Python, SQL, Snowflake, and DBT. This role requires a deep understanding of building robust, scalable data pipelines in a cloud-based, modern data stack environment.
Key Responsibilities:
Design, build, and maintain scalable and reliable data pipelines to support analytics, reporting, and data science initiatives.
Develop and optimize DBT (Data Build Tool) models for efficient data transformation and data lineage.
Work with Snowflake to design schemas, optimize queries, and manage cloud-based data warehouses.
Write efficient, reusable, and testable Python and SQL code for ETL/ELT processes.
Collaborate with cross-functional teams including data analysts, scientists, and business stakeholders to understand data needs and deliver effective solutions.
Ensure data quality, consistency, and integrity across all data platforms.
Implement data governance and best practices for security, privacy, and compliance.
Troubleshoot and debug production issues in data workflows and pipelines.
Required Skills & Qualifications:
10–15 years of hands-on experience in data engineering roles.
Strong experience with DBT, including building modular, SQL-based transformation models on cloud data warehouses such as Snowflake.
Proven expertise in Snowflake data warehouse architecture and management.
Advanced proficiency in Python and SQL for data processing.
Experience with modern data pipeline and orchestration tools.
Solid understanding of data modeling, performance tuning, and best practices in cloud data engineering.
Familiarity with CI/CD processes and version control tools like Git.
Strong problem-solving skills and the ability to work in a fast-paced, collaborative environment.
Preferred Qualifications:
Experience with tools such as Airflow, Fivetran, or Cloud Composer.
Familiarity with cloud platforms such as AWS, GCP, or Azure.
Experience in financial, healthcare, or retail data environments.
Excellent communication and documentation skills.