To analyse and execute the data strategy within the Data Architecture and Data Platforms environment
To research, design and propose new features and optimizations
To lead specific streams under the Data Architecture topic, creating synergy between backend transactional systems, backend analytical and reporting systems, and core systems
To work within a particular stream/squad with business representatives and teams, collecting information requirements and translating them into data-centric solutions
To apply principles and standards for modelling, regulations, and security to meet the technical and business goals defined by the company
To participate in the design and development phases of data-related services, using various technologies and architectural patterns to enable and support existing reporting and analytical use cases, both internal and external
To lead a particular squad (stream) on products, working with internal stakeholders to create new data products and new features in existing ones, identify the root causes of problems, and turn analyses into insights and solutions
To bring hands-on experience and expertise to actual project implementation and task execution
To participate in quality management of the data stack by following a test-driven development approach, with unit, integration, and regression tests throughout the development lifecycle
To deliver documentation according to the company-defined documentation standards for the data stack, continuously, within Agile sprints or Waterfall milestones
Specific requirements
Higher education or equivalent relevant experience
Specific areas of expertise: Data Engineering, Data Architecture, Consulting, Product Management, Digital, Technology
Minimum 3 years of relevant experience in a data engineering environment
3+ years of data engineering/software engineering in senior positions within the telco, banking, or retail industries
Azerbaijani, English, Russian language skills
Extensive hands-on experience in data ecosystems (cloud ecosystems would be beneficial); covering data ingestion, data modeling, and data provisioning to consumers and downstream systems
Excellent coding skills in at least two relevant languages (e.g. SQL, Python, Java, Scala), as well as deep knowledge of cost-based optimization techniques
Experience in ETL/ELT design, data and interface specifications, quality assurance, and testing methods; good knowledge of data pipeline orchestration tools (e.g. Apache Airflow, Apache NiFi)
Knowledge in distributed computation frameworks (e.g. Synapse, Spark)
Experience in implementing DataOps and MLOps concepts
Proven track record of delivering value from data; experience in building robust, scalable, high-quality data products iteratively and integrating them into existing and new data pipelines
Experience with agile methodologies in a professional development environment (CI/CD)
Experience with RESTful and SOAP APIs
Practical knowledge of message queue technologies such as Apache Kafka and RabbitMQ
Practical knowledge of NoSQL technologies such as Redis, Elasticsearch, and Neo4j
Practical knowledge of what to test and when (unit tests, E2E tests)
The ability to work autonomously and in a results-oriented way