Data Archival (using Containers)
In large enterprise Insights implementations (>35 million nodes), Neo4j query performance degrades significantly, leading to bottlenecks in dashboard loading. This is due to a limitation of Neo4j (Community Edition), where a single server stores all the data and cannot be scaled out; key features such as horizontal scaling, high availability (HA), and replication are available only in the paid edition.
To overcome this limitation, Insights needs the capability to scale its data storage horizontally. We use a data-split approach, archiving older data out of Neo4j into Elasticsearch and serving it from dedicated containers. This enables Insights to ingest much more data at scale and resolves the performance bottleneck.
The image below shows the flow of data in the solution.
The Data Archival module consists of two agents:
Neo4jArchival agent: This agent backs up data from the Neo4j data source to Elasticsearch (see the first sketch after this list).
ElasticTransfer agent: This agent creates containers from the data that has been backed up in Elasticsearch (see the second sketch below).
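The following is a minimal sketch of the backup step performed by the Neo4jArchival agent, assuming the official "neo4j" and "elasticsearch" Python clients. The connection details, the node label, the property names, and the index name are illustrative assumptions, not the actual Insights agent configuration.

```python
# Sketch: page archivable nodes out of Neo4j and bulk-index them into Elasticsearch.
# Endpoints, credentials, label "DATA", property "inSightsTime", and the index name
# "insights_archive" are assumptions for illustration only.
from neo4j import GraphDatabase
from elasticsearch import Elasticsearch, helpers

NEO4J_URI = "bolt://localhost:7687"      # assumed Neo4j endpoint
ES_HOSTS = ["http://localhost:9200"]     # assumed Elasticsearch endpoint
ARCHIVE_INDEX = "insights_archive"       # hypothetical target index


def archive_nodes(cutoff_epoch: int, batch_size: int = 1000) -> None:
    driver = GraphDatabase.driver(NEO4J_URI, auth=("neo4j", "password"))
    es = Elasticsearch(ES_HOSTS)

    with driver.session() as session:
        # Select nodes older than the cutoff time for archival.
        result = session.run(
            "MATCH (n:DATA) WHERE n.inSightsTime < $cutoff "
            "RETURN id(n) AS id, properties(n) AS props",
            cutoff=cutoff_epoch,
        )
        # Stream the node properties into Elasticsearch as bulk index actions.
        actions = (
            {"_index": ARCHIVE_INDEX, "_id": record["id"], "_source": record["props"]}
            for record in result
        )
        helpers.bulk(es, actions, chunk_size=batch_size)

    driver.close()


if __name__ == "__main__":
    archive_nodes(cutoff_epoch=1609459200)  # example cutoff: 2021-01-01
```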
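The second sketch outlines the container-creation step of the ElasticTransfer agent, assuming the "docker" Python SDK. The idea, per the flow above, is to spin up a dedicated Neo4j container that can be loaded with a slice of archived data pulled back from Elasticsearch; the image, ports, naming scheme, and credentials below are assumptions, not the agent's actual implementation.

```python
# Sketch: create a dedicated container for an archived data slice.
# Image, container name, port mapping, and credentials are hypothetical.
import docker


def create_archive_container(archive_name: str, bolt_port: int) -> str:
    client = docker.from_env()
    container = client.containers.run(
        "neo4j:4.4-community",                    # assumed base image
        name=f"insights-archive-{archive_name}",  # hypothetical naming scheme
        detach=True,
        ports={"7687/tcp": bolt_port},            # expose Bolt on a free host port
        environment={"NEO4J_AUTH": "neo4j/archive-password"},
    )
    # The agent would then replay the archived documents from Elasticsearch into
    # this container so dashboards can query the old data on demand.
    return container.id


# Example: create a container for the "2021_Q1" archive on host port 7701.
# container_id = create_archive_container("2021_Q1", 7701)
```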