Data Engineer - Senior
As a Data Engineer, you will design, develop, and optimize data pipelines and infrastructure on GCP to enable advanced analytics and reporting solutions. You will work closely with business stakeholders to deliver robust, scalable solutions that support business intelligence and machine learning initiatives.
This hands-on role requires a deep understanding of BigQuery, data engineering best practices, and the ability to translate business requirements into technical solutions. If you are passionate about working with big data and cloud technologies, we would love to hear from you!
Key Responsibilities:
- Data Pipeline Development: Design and build ETL/ELT data pipelines using BigQuery and other GCP services to ingest, process, and transform large datasets from multiple sources.
- Data Modeling & Architecture: Develop and optimize data models and schemas to support analytics, reporting, and machine learning requirements.
- Performance Optimization: Implement best practices for performance tuning, partitioning, and clustering to optimize data queries and reduce costs in BigQuery (a short illustrative sketch follows this list).
- Data Integration & Transformation: Collaborate with data scientists and analysts to design data solutions that integrate seamlessly with BI tools, machine learning models, and third-party applications.
- Data Quality & Governance: Establish and enforce data quality standards, data governance frameworks, and security policies for data storage and access on GCP.
- Automation & Monitoring: Automate workflows using Cloud Composer, Cloud Functions, or other orchestration tools to ensure reliable and scalable data pipelines.
- Documentation & Knowledge Sharing: Create comprehensive documentation for data pipelines, workflows, and processes. Share best practices and mentor junior data engineers.
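To illustrate the kind of partitioning and clustering work referenced in the Performance Optimization responsibility above, here is a minimal sketch using the google-cloud-bigquery Python client. It is not part of the role description, and the dataset, table, and column names are hypothetical placeholders.

    # Minimal illustrative sketch: create a date-partitioned, clustered table and
    # run a query that prunes partitions to reduce bytes scanned (and cost).
    # Dataset/table/column names ("analytics.events" etc.) are placeholders.
    from google.cloud import bigquery

    client = bigquery.Client()  # uses application-default credentials

    ddl = """
    CREATE TABLE IF NOT EXISTS analytics.events (
      event_date DATE,
      user_id    STRING,
      event_name STRING
    )
    PARTITION BY event_date
    CLUSTER BY user_id, event_name
    """
    client.query(ddl).result()

    # Filtering on the partition column lets BigQuery scan a single partition.
    sql = """
    SELECT user_id, COUNT(*) AS events
    FROM analytics.events
    WHERE event_date = @day
    GROUP BY user_id
    """
    job = client.query(
        sql,
        job_config=bigquery.QueryJobConfig(
            query_parameters=[
                bigquery.ScalarQueryParameter("day", "DATE", "2024-01-01")
            ]
        ),
    )
    for row in job:
        print(row.user_id, row.events)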
Required Qualifications:
- 7+ years of experience working as a Data Engineer, with a focus on GCP and BigQuery.
- Strong proficiency in SQL and experience in developing complex queries, stored procedures, and views in BigQuery.
- Hands-on experience with GCP services such as Cloud Storage, Dataflow, Cloud Composer, and Cloud Functions.
- Deep understanding of data warehousing concepts, dimensional modeling, and building data marts.
- Experience with ETL/ELT tools like Apache Beam, Dataflow, or dbt.
- Proven track record managing teams and projects.
- Proven ability to work with large datasets and cost-effectively optimize query performance.
- Excellent communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams.
- GCP Professional Data Engineer Certification is a plus.
Preferred Skills:
- Experience with machine learning on GCP using Vertex AI or AI Platform.
- Knowledge of data governance and security best practices in a cloud environment.
- Experience working with real-time streaming data and tools like Pub/Sub or Kafka (a short illustrative sketch follows this list).
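As a rough illustration of the real-time streaming experience mentioned above, here is a minimal Apache Beam sketch that reads messages from a Pub/Sub subscription and appends them to a BigQuery table, the Dataflow-style path implied by the stack in this posting. The project, subscription, and table names are hypothetical placeholders.

    # Minimal illustrative sketch: Pub/Sub -> Apache Beam -> BigQuery streaming path.
    # "my-project", "events-sub", and "analytics.events_raw" are placeholders.
    import json

    import apache_beam as beam
    from apache_beam.options.pipeline_options import PipelineOptions

    options = PipelineOptions(streaming=True)

    with beam.Pipeline(options=options) as pipeline:
        (
            pipeline
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(
                subscription="projects/my-project/subscriptions/events-sub")
            # Messages are assumed to be JSON objects matching the table schema.
            | "ParseJson" >> beam.Map(lambda msg: json.loads(msg.decode("utf-8")))
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                table="my-project:analytics.events_raw",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
                create_disposition=beam.io.BigQueryDisposition.CREATE_NEVER,
            )
        )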