Data Quality Engineer - Elastic Stack
As one of the premier suppliers to our client, one of the most established financial institutions in Switzerland, Swisslinx is looking for a Data Quality Engineer - Elastic Stack.
Main Tasks:
- Identify, assess and support owners with new data source onboarding into the data ecosystem
- Develop and maintain efficient data pipelines for ingesting and processing diverse log source types (Platform, Application, Security, etc.)
- Design and implement data parsing and transformation to ensure data is structured and accessible
- Collaborate with Security Analysts and data owners to understand data requirements
- Ensure proper data quality standards and processes are maintained
- Perform data reviews, monitor data pipelines and resolve any data quality issues
- Assist internal stakeholders in accessing and using data and searches more effectively
- Troubleshoot queries and provide technical support
- Provide guidance; create and maintain content, visualizations, dashboards and reports on the supported platforms
- Create and maintain documentation, including configuration guides and standard operating procedures.
Required skills:
- Hands-on experience with Elastic Stack
- Proficiency in Linux operating systems (e.g., CentOS, Ubuntu, Red Hat).
- Proven experience in data management
- Proficiency in data modeling and processing large data sets, SQL, KQL, SPL
- Relevant certifications or technology work experience (e.g., CompTIA Linux+, Red Hat, Elastic, Splunk) is a plus.
- Strong understanding of network protocols, security principles, and system administration best practices.
- Excellent analytical and problem-solving skills, with the ability to troubleshoot and resolve complex technical issues.
- Strong interpersonal and communication skills, with the ability to work effectively in a team-oriented environment
- Excellent verbal and written English skills are a must