Solutions Architect | sergiy.sobolev@gmail.com | +380987882471 | Skype: crushersbk |
AWS Solution Architect Associate | Link to Certificate |
Microsoft Azure Data Engineer | Link to Certificate |
Microsoft Azure Data Fundamentals | Link to Certificate |
Microsoft Azure Fundamentals | Link to Certificate |
Google Cloud Platform Professional Architect | Link to Certificate |
Programming languages: | Go, Java, Python, Kotlin, JavaScript, EGL |
Platforms, Technologies, APIs: | Java EE, Spring Framework (Core, Security, Integration, Data, Rest, HATEOAS, OAuth2, ACL), SAP Hybris, SAP Data Hub, SAP ECC, SAP Gigya |
Big Data Platforms: | Hadoop (MapReduce, HDFS, Oozie), Spark, Databricks |
Search Engines: | Apache Solr, Elasticsearch, GCP Search |
Containerization: | Docker (Compose, Swarm), Kubernetes |
Databases: | Aerospike, Oracle, MySQL, MS SQL, PostgreSQL, Redis, Hazelcast, Apache Cassandra, Azure Cosmos |
Cloud Providers: | Azure, AWS, GCP |
Testing Environments: | Apache JMeter, Postman, Gatling, Dynatrace |
Infrastructure tools: | Terraform, Ansible, Azure ARM, GCP Deployment Manager |
Reporting portal & Data Hub | ||
Client: | The United Nations Educational, Scientific and Cultural Organization is a specialized agency of the United Nations aimed at promoting world peace and security through international cooperation in education, arts, sciences and culture | |
Description: | Data portal based on Azure & Power BI, with Salesforce as the data source. Batch and real-time integration; 150 different reports and dashboards | |
Platform: | Microsoft Azure (Data Factory, Data Lake, SQL, Synapse Analytics), Microsoft Power BI, Salesforce, Mulesoft | |
Languages: | Python, .NET | |
Environments: | Azure Data Factory, Azure Functions | |
Position: | Solutions Architect | |
Activities: | System architecture design, documentation and development; Communication with other vendors' teams; Calculating the cost of maintaining the solution; Preparing solution architecture views for different stakeholders (data flow view, deployment view, process view); Investigating legacy functionality; |
Easy2Get project (delivery/deployment of a multi-component solution in one click) | ||
Client: | Global casual game producer | |
Description: | The goal of the project was to enable one-click deployment of multi-component solutions (several services, databases, message brokers, etc.). To accomplish this, the deployment of each solution was described as a workflow executed by Netflix Conductor, an open-source platform created by Netflix to orchestrate workflows that span multiple microservices. | |
Platform: | Netflix Conductor, Kubernetes, Jenkins | |
Languages: | Go, Java | |
Environments: | Netflix Conductor, Kubernetes, Elasticsearch | |
Position: | System Architect | |
Activities: | Everything from the emergence of the business driver (the need to deploy multi-component solutions quickly) to production support |
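The one-click deployment idea above can be illustrated with a minimal Conductor workflow definition. This is only a sketch: the workflow and task names (`deploy_shop`, `deploy_database`, etc.) and the component list are hypothetical, not the project's actual definitions.

```python
import json

# Hypothetical sketch: build the JSON workflow document that Netflix Conductor
# expects, with one SIMPLE task per component to deploy. SIMPLE tasks are
# executed by external workers (e.g. the project's Go services).
def deployment_workflow(solution: str, components: list) -> dict:
    tasks = [
        {
            "name": "deploy_" + c,
            "taskReferenceName": "deploy_" + c + "_ref",
            "type": "SIMPLE",
            "inputParameters": {"component": c},
        }
        for c in components
    ]
    return {
        "name": "deploy_" + solution,
        "version": 1,
        "schemaVersion": 2,
        "tasks": tasks,
    }

wf = deployment_workflow("shop", ["database", "broker", "api"])
print(json.dumps(wf, indent=2))
```

Registering this document with the Conductor server makes the whole deployment runnable as a single workflow execution.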
Migration from GCP to on-premises | ||
Client: | Global casual game producer | |
Description: | Migration of the project from the GCP cloud platform to on-premises data centers. Three goals were pursued: cost savings, performance improvement, and leveraging custom services deployed on-premises. The migration included live logical data synchronization between PostgreSQL clusters, bi-directional real-time Redis cluster synchronization, migration from GCP Pub/Sub to Apache Kafka and from GCP Cloud Run to Kubernetes, and a few other services. | |
Platform: | GCP, two on-premises data centers | |
Languages: | Java, Go, NodeJS | |
Environments: | GCP, Kubernetes, pglogical (PostgreSQL extension for logical replication), Gatling Tool | |
Position: | System Architect | |
Activities: | Detailed technical design and roadmap for the migration; Investigation of new technical approaches for data and application migration; Investigation of the hardware required on-premises |
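The live logical replication mentioned above is typically set up with a pair of pglogical calls on the provider and subscriber clusters. Below is a hedged sketch that only assembles the SQL as strings; the node names, DSNs, and database name are illustrative, not the project's actual configuration, and in practice these statements are executed via psql or a migration tool.

```python
# Hypothetical pglogical setup for GCP -> on-premises replication.
# Run PROVIDER_SQL on the cloud cluster, SUBSCRIBER_SQL on the target cluster.

PROVIDER_SQL = """
SELECT pglogical.create_node(
    node_name := 'gcp_provider',
    dsn := 'host=gcp-pg port=5432 dbname=game'
);
SELECT pglogical.replication_set_add_all_tables(
    set_name := 'default',
    schema_names := ARRAY['public']
);
"""

SUBSCRIBER_SQL = """
SELECT pglogical.create_node(
    node_name := 'onprem_subscriber',
    dsn := 'host=dc1-pg port=5432 dbname=game'
);
SELECT pglogical.create_subscription(
    subscription_name := 'gcp_to_onprem',
    provider_dsn := 'host=gcp-pg port=5432 dbname=game'
);
"""

print(PROVIDER_SQL)
print(SUBSCRIBER_SQL)
```

Once the subscription catches up, writes on the provider stream to the subscriber continuously, which is what allows a near-zero-downtime cutover.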
Cloud Native Data & Analytics Platform | ||
Client: | British multinational consumer goods company | |
Description: | A new company branched off from the client's main company and had to develop all necessary systems. The D&A platform gathers data from different sources: SAP S/4HANA, multiple custom systems, production machinery, etc., about 1 TB of data per month in total. The system is responsible for data preparation, transformation and serving, and maintains several data products. | |
Platform: | Microsoft Azure (Data Factory, Data Lake, Databricks, Synapse Analytics), Microsoft Power BI, SAP S/4 HANA | |
Languages: | Python | |
Environments: | Azure Data Factory, Azure Databricks Workspaces | |
Position: | Solutions Architect | |
Activities: | System architecture design, documentation and development; Communication with other vendors' teams; Calculating the cost of maintaining the solution; Preparing solution architecture views for different stakeholders (data flow view, deployment view, process view); Investigating legacy functionality; |
Digital transformation of e-commerce company | ||
Client: | UK-based e-commerce company | |
Description: | The client's goal is to speed up internal processes: price changes, onboarding of new products, etc. | |
Platform: | Microsoft Azure | |
Languages: | Java (Spring family: Boot, Security) | |
Environments: | Azure Kubernetes Service, Blob Storage, MySql | |
Position: | Solutions Architect | |
Activities: | System architecture design and documentation; Cost calculation; Creating POCs for new ideas. |
Migration from IBM Legacy Stack to AWS | ||
Client: | US-based fintech company | |
Description: | The client's goal is to increase the product's agility by replacing legacy IBM technologies. The second priority is to streamline security testing procedures. The main challenges of this project are the lack of documentation and of experienced engineers. The project is currently in its starting phase. | |
Platform: | IBM EGL, IBM RBD, AWS (S3, EC2, Lambda Functions, Aurora) | |
Languages: | Java, EGL | |
Environments: | AWS Workspaces | |
Position: | Solutions Architect | |
Activities: | System architecture design, documentation and development; Calculating the cost of maintaining the solution; Preparing solution architecture views for different stakeholders (data flow view, deployment view, process view); Investigating legacy functionality. |
Content Management Tool | ||
Client: | US-based pharmacy company | |
Description: | The goal of the project was to replace a legacy content delivery system with a high-performance, cloud-based, distributed one. The main requirement was high-performance content delivery, achieved with regional content delivery servers and a data-driven approach to presenting the products the stored content relates to. An approach similar to YouTube thumbnail storage was used to store a large number of small pictures in GCP Bigtable. System surveillance (monitoring, logging, performance measurement) was also built on a dedicated GCP service: GCP Spanner. | |
Platform: | GCP, SAP Hybris | |
Languages: | Java, Python | |
Environments: | GCP Storage, GCP Big Table, GCP Dataproc, GCP Search, Docker, Kubernetes | |
Position: | Solution Architect, Team Leader | |
Activities: | System architecture design, documentation and development; Calculating the cost of maintaining the solution on GCP, AWS and Microsoft Azure; Preparing solution architecture views for different stakeholders (deployment view, process view); Requirements management; Team management; Transitioning the solution to the client's IT team. |
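The thumbnail-style Bigtable layout mentioned above hinges on row-key design: prefixing keys with a short hash spreads writes across tablets and avoids hotspotting, while keeping all images of one product in a contiguous scan range. A minimal sketch, with hypothetical key components (product id, image id, size) that are not the project's actual schema:

```python
import hashlib

# Hypothetical Bigtable row-key scheme for storing many small images.
# A 4-character hash prefix distributes sequential product ids across
# tablets; the remaining segments keep related images adjacent.
def thumbnail_row_key(product_id: str, image_id: str, size: str) -> bytes:
    prefix = hashlib.md5(product_id.encode()).hexdigest()[:4]
    return "#".join([prefix, product_id, image_id, size]).encode()

key = thumbnail_row_key("sku-1042", "img-7", "128x128")
print(key)
```

Scanning with the prefix `md5("sku-1042")[:4] + "#sku-1042#"` then retrieves every thumbnail for that product in a single range read.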
Pharmacy B2B E-commerce System | ||
Client: | US-based pharmacy company | |
Description: | The project aimed to improve the performance of an existing e-commerce system transitioned from another vendor. The client had tried to replace a legacy (more than 20-year-old) system with a new one based on SAP e-commerce platforms, but the first version of the new system, implemented by a third-party vendor, suffered from poor performance: it could sustain only 0.5% of the predicted production load. After various performance tweaks (multi-level cache optimization, splitting out separate services and making them redundant to guarantee higher performance, reducing the number of calls to third-party systems, reducing the number of calls from front-end to back-end, etc.), the system successfully replaced the legacy one. | |
Platform: | GCP, SAP Hybris | |
Languages: | Java, Python | |
Environments: | GCP Storage, GCP Big Table, GCP Dataproc, GCP Search, Docker, Kubernetes | |
Position: | Solution Architect, Team Leader | |
Activities: | System architecture design, documentation and development; Calculating the cost of maintaining the solution on GCP, AWS and Microsoft Azure; Preparing solution architecture views for different stakeholders (deployment view, process view); Requirements management; Team management; Transitioning the solution to the support IT team. |
Unified Identity Provider Management for Multiple E-commerce Platforms | ||
Client: | Canadian e-commerce company | |
Description: | The goal of the project was to create a unified customer identity provider for 4 independent systems, create a single customer data model, migrate all the data, and guarantee distributed session cache management and token-based access. The main challenge was that a tremendous number of customers (~100 million) had to be transformed into the unified customer data model and migrated to the new system. | |
Platform: | SAP Hybris, SAP Gigya, SAP Data Hub, MySQL, Oracle | |
Languages: | Java, Python, JavaScript | |
Environments: | Java Spring Framework, Node.js, Redis | |
Position: | Solution Architect, Team Leader | |
Activities: | System architecture design; Investigation of different customer identity providers to choose the most suitable for the client's needs; Creating C4 software documentation; Communicating with stakeholders to refine requirements; Maintaining the project's Scrum process; Transitioning the solution to the client's IT team; Development of the initial migration tooling, a framework for checking access-token validity, and a distributed session cache. |
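The access-token validity check mentioned above can be sketched with an HMAC-signed, JWT-like token using only the standard library. This is a hedged illustration: the secret, claim names, and encoding are hypothetical, not the project's actual token format.

```python
import base64
import hashlib
import hmac
import json
import time

SECRET = b"demo-secret"  # illustrative only; real keys come from a secret store

def sign(claims):
    """Encode claims and append an HMAC-SHA256 signature."""
    payload = base64.urlsafe_b64encode(json.dumps(claims).encode())
    sig = hmac.new(SECRET, payload, hashlib.sha256).hexdigest()
    return payload.decode() + "." + sig

def is_valid(token, now=None):
    """Reject tokens with a bad signature, bad shape, or expired 'exp' claim."""
    try:
        payload, sig = token.rsplit(".", 1)
    except ValueError:
        return False  # malformed: no signature separator
    expected = hmac.new(SECRET, payload.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return False  # tampered payload or wrong signing key
    claims = json.loads(base64.urlsafe_b64decode(payload))
    return claims.get("exp", 0) > (now if now is not None else time.time())

token = sign({"sub": "customer-1", "exp": time.time() + 3600})
print(is_valid(token))        # prints True: fresh, untampered token
print(is_valid(token + "x"))  # prints False: signature no longer matches
```

In the actual project a shared framework performed this check in each of the 4 systems, so a token issued by the unified identity provider was accepted everywhere.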
Integrating the Automated Pickup System to an Existing E-commerce Platform | ||
Client: | Canadian e-commerce company | |
Description: | The client ran a marketing study and found that it takes users about 20 minutes to receive their orders at pickup points, which was considered too long. The solution was to introduce a service integrating with a third-party automated pickup system, with the ability to switch to another system at minimal cost. Since the client has more than 500 pickup points in Canada, a proper initial data migration to the pickup system was required. The introduced adapter-based design made it possible to change or add pickup system vendors. Development included changes to multiple systems: the order management subsystem, the customer web and mobile interfaces, and the pickup-point workers' web and mobile interfaces. The project successfully integrated automated pickup devices into the existing e-commerce solution. The solution was documented according to the client's standards and transitioned to the support team. | |
Platform: | SAP Hybris, Oracle, MySQL | |
Languages: | Java, Python | |
Environments: | Java Spring Framework, JBehave, Robot, Gatling, Apache Solr | |
Position: | Solution Architect, Team Leader | |
Activities: | Overall system architecture design; Development of the integration service with the automated pickup system from scratch; Team management. |
Pregnancy Monitoring System | ||
Client: | Israeli healthcare company | |
Description: | The project was the software part of a medical complex comprising hardware devices, mobile applications, a cloud part, and a computational appliance. The solution was intended to monitor the expectant mother's body and detect the fetal pulse. The main difference from existing solutions is that the monitoring is passive. The main challenge was storing, validating, and analyzing a huge amount of data (about 2 GB per patient per day). Another significant requirement was integration with MATLAB-based computational systems: performing calculations, storing results, and handling different types of faults and failures. | |
Platform: | Kaa IoT platform, AWS, MATLAB, Elasticsearch | |
Languages: | Java | |
Environments: | Hazelcast, MySQL, Apache Cassandra, Apache Hadoop (HDFS, Oozie) | |
Position: | Solution Architect, Lead Engineer | |
Activities: | System architecture design and development; Documentation using UML; Working with hardware engineers and mathematicians to agree on communication interfaces and protocols; Team management; Development and testing |
Document Management and Reporting System | ||
Client: | US-based bank | |
Description: | The project maintains and enhances a system for storing, validating, crafting, and searching different types of banking documents. The solution has to provide web and desktop interfaces and handle multiple processing pipelines for different document types, with the ability to configure those pipelines without additional development. | |
Platform: | Spring Framework, JUnit, MS SQL | |
Languages: | Java | |
Position: | Software Engineer | |
Activities: | Development, documenting, testing. |
Risk Management System | ||
Client: | Germany-based investment bank | |
Description: | The main goal of the system was to calculate the amount of money the bank should hold at any moment to cover possible losses if creditors fail to return investments. The main emphasis was on software quality, so many different testing types were used during delivery: unit, integration, endurance and peak testing. | |
Platform: | Spring Framework, GWT, Flex, JUnit, MS SQL | |
Languages: | Java | |
Position: | Software Engineer | |
Activities: | Development, documenting, testing. |
Playtika | System Architect | 2021 - present |
N-iX | Solutions Architect | 2019 - 2021 |
EPAM | Lead Software Engineer | 2017 - 2019 |
Cybervision | Team Leader | 2016 - 2017 |
Luxoft | Software Developer | 2014 - 2016 |
Nemiroff | Software Developer | 2010 - 2014 |
Aricent | Software Developer | 2008 - 2010 |