Job Description
Responsibilities:
- Design, build, and maintain scalable and robust data pipelines to support data integration and data warehousing.
- Develop and optimize ETL processes to ingest, clean, and transform data from sources such as Qualys and the CMDB (a minimal sketch follows this list).
- Ensure the reliability, availability, and performance of data systems.
- Data Management:
- Manage and maintain data architecture, data models, and data schemas.
- Implement and maintain data governance and data quality standards.
- Work with relational and NoSQL databases, ensuring data integrity and security.
- Performance Monitoring and Optimization:
- Monitor database performance and optimize query execution for maximum efficiency (see the monitoring sketch after this list).
- Troubleshoot and resolve database-related issues.
- Data Integrity and Security:
- Ensure data integrity and security, including managing user access and permissions.
- Develop and implement backup and recovery procedures to minimize data loss in the event of hardware or software failure.
- Cloud-Based Database Management:
- Manage cloud-based databases on platforms like AWS, Azure, and Google Cloud Platform.
- Keep up to date with the latest PostgreSQL/MongoDB releases, features, and patches.
- Collaboration and Documentation:
- Collaborate with developers and other IT staff to ensure database systems meet business requirements.
- Document database processes and procedures.
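To illustrate the pipeline work described above, here is a minimal, hypothetical ETL sketch in Python. The endpoint URL, token, field names, and table schema are placeholders (not a real Qualys or CMDB API contract); it assumes psycopg2 is installed and that a UNIQUE constraint exists on assets.hostname.

    import requests
    import psycopg2

    # Placeholder source endpoint and credentials -- not a real Qualys API contract.
    SOURCE_URL = "https://example.invalid/qualys/assets"
    API_TOKEN = "changeme"

    def extract():
        # Pull raw asset records from the assumed REST source.
        resp = requests.get(
            SOURCE_URL,
            headers={"Authorization": f"Bearer {API_TOKEN}"},
            timeout=30,
        )
        resp.raise_for_status()
        return resp.json()

    def transform(records):
        # Drop rows without a hostname and normalise casing.
        for rec in records:
            host = (rec.get("hostname") or "").strip().lower()
            if host:
                yield (host, rec.get("ip"), rec.get("os"))

    def load(rows):
        # Upsert into PostgreSQL; assumes a UNIQUE constraint on assets.hostname.
        conn = psycopg2.connect("dbname=cmdb user=etl")
        with conn, conn.cursor() as cur:
            cur.executemany(
                """INSERT INTO assets (hostname, ip, os)
                   VALUES (%s, %s, %s)
                   ON CONFLICT (hostname) DO UPDATE
                   SET ip = EXCLUDED.ip, os = EXCLUDED.os""",
                list(rows),
            )
        conn.close()

    if __name__ == "__main__":
        load(transform(extract()))

In a production pipeline this would typically be scheduled (cron, Airflow, or similar) and instrumented with logging and retries.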
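For the performance-monitoring duties, a hedged sketch of one common approach: querying PostgreSQL's pg_stat_statements view for the slowest statements. It assumes the extension is enabled and PostgreSQL 13+ (where the column is mean_exec_time); the 500 ms threshold is arbitrary.

    import psycopg2

    # Assumes the pg_stat_statements extension is installed and enabled,
    # and PostgreSQL 13+ (older versions use mean_time instead).
    conn = psycopg2.connect("dbname=cmdb user=dba")
    with conn, conn.cursor() as cur:
        cur.execute(
            """SELECT query, calls, mean_exec_time
               FROM pg_stat_statements
               WHERE mean_exec_time > 500  -- milliseconds; arbitrary threshold
               ORDER BY mean_exec_time DESC
               LIMIT 10"""
        )
        for query, calls, mean_ms in cur.fetchall():
            print(f"{mean_ms:8.1f} ms  x{calls:<6}  {query[:80]}")
    conn.close()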
Requirements
- 5-10 years of experience as a Database Administrator/Architect, with specific experience in PostgreSQL and MongoDB, along with good Python skills.
- Proficiency in PostgreSQL and MongoDB.
- Strong skills in Python and Ansible for development and automation.
- Familiarity with cloud-based database management (AWS, Azure, Google Cloud Platform).
- Experience in creating and managing databases, tables, and indexes.
- Strong understanding of database performance monitoring and optimization.
- Knowledge of data integrity and security best practices.
- Proficient in developing backup and recovery procedures (an illustrative sketch follows this list).
- Ability to analyze and organize raw data and build data systems and pipelines.
- Experience in conducting complex data analysis and building ETL solutions.
- Knowledge of Agile Methodology.
- Nice to have: knowledge of Vulnerability Management and broader cybersecurity awareness (e.g. the ISO 27001 ISMS framework).
- Bachelor’s degree in Computer Science, Information Technology, or a related field.
- Excellent problem-solving and troubleshooting skills.
- Strong collaboration and communication skills.
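As an illustration of the backup-and-recovery requirement above, here is a minimal nightly-dump sketch in Python wrapping the standard pg_dump tool; the database name and backup path are illustrative only.

    import subprocess
    from datetime import date

    # Illustrative database name and backup location.
    DB_NAME = "cmdb"
    BACKUP_PATH = f"/var/backups/{DB_NAME}-{date.today():%Y%m%d}.dump"

    # Custom format (-Fc) so pg_restore can later restore selectively.
    subprocess.run(["pg_dump", "-Fc", "-f", BACKUP_PATH, DB_NAME], check=True)
    print(f"Backup written to {BACKUP_PATH}")

    # Recovery would use pg_restore, e.g.:
    #   pg_restore --clean -d cmdb /var/backups/cmdb-YYYYMMDD.dump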
Benefits:
- Permanent contract.
- Hybrid remote work model to enhance your flexibility.
- Flexible hours to organize your day as you prefer.
- Condensed hours on Fridays so you can enjoy your weekend.
- Relocation opportunities and support in finding housing.
- Continuous training and professional development programs.
- Meal allowance.
- Special discounts on group employee insurance policies.
- Life and accident insurance.
- Annual medical check-ups.
Don't miss this unique opportunity! If you're interested in being part of a challenging and growing project, send us your CV and discover how you can impact the future of financial technology.
Detailed information about the job offer
Company: France Life Imaging
Location: Espinho, Aveiro District, Portugal
Published: 15. 3. 2025