Which technology is best suited to store and query a huge read-only graph?

Posted by asmaier on Stack Overflow
Published on 2009-12-22T11:02:18Z


I have a huge directed graph: it consists of 1.6 million nodes and 30 million edges. I want users to be able to find all the shortest connections (including incoming and outgoing edges) between two nodes of the graph (via a web interface). At the moment I store the graph in a PostgreSQL database, but that solution is neither efficient nor elegant: I basically need to store every edge of the graph twice (see my question PostgreSQL: How to optimize my database for storing and querying a huge graph).
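To illustrate what "storing every edge twice" means in practice, here is a minimal sketch using SQLite as a stand-in for PostgreSQL (the schema idea carries over directly; table and column names are illustrative, not from my actual database):

```python
import sqlite3

# In-memory SQLite as a stand-in for PostgreSQL; the schema idea is the same.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE edges (src INTEGER, dst INTEGER)")

# Store each directed edge in both directions, so that incoming and
# outgoing neighbours can both be fetched with a single indexed lookup.
directed_edges = [(1, 2), (2, 3), (1, 3)]
rows = directed_edges + [(d, s) for s, d in directed_edges]
conn.executemany("INSERT INTO edges VALUES (?, ?)", rows)
conn.execute("CREATE INDEX idx_src ON edges (src)")

# All neighbours of node 1, in either direction:
neighbours = {d for (d,) in
              conn.execute("SELECT dst FROM edges WHERE src = ?", (1,))}
print(neighbours)  # → {2, 3}
```

The duplication doubles the storage, which is exactly the inelegance I would like to avoid.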

It was suggested to me to use a graph database like Neo4j or AllegroGraph. However, the free version of AllegroGraph is limited to 50 million nodes, and it also has a very high-level API (RDF), which seems too powerful and complex for my problem. Neo4j, on the other hand, has only a very low-level API (and its Python interface is not mature yet). Both of them seem better suited to problems where nodes and edges are frequently added to or removed from a graph. For a simple search on a graph, these graph databases seem too complex.

One idea I had was to "misuse" a search engine like Lucene for the job, since I'm basically only searching for connections in a graph.

Another idea would be to have a server process that stores the whole graph (500 MB to 1 GB) in memory. Clients could then query the server process and traverse the graph very quickly, since it is held in memory. Is there an easy way to write such a server (preferably in Python) using some existing framework?
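As a rough sketch of what I have in mind (class and method names are hypothetical): the graph is loaded once into an adjacency dict, and shortest connections are answered with a breadth-first search. An instance of such a class could then be exposed to web clients via, e.g., the standard library's `xmlrpc.server` or a small WSGI app.

```python
from collections import deque

class GraphService:
    """Holds the whole graph in memory; an instance could be exposed
    to clients via e.g. xmlrpc.server or a WSGI framework."""

    def __init__(self, directed_edges):
        # node -> set of neighbours, ignoring edge direction, so that
        # connections over incoming and outgoing edges are both found.
        self.adj = {}
        for src, dst in directed_edges:
            self.adj.setdefault(src, set()).add(dst)
            self.adj.setdefault(dst, set()).add(src)

    def shortest_path(self, start, goal):
        """Breadth-first search; returns one shortest node sequence, or
        None if the two nodes are not connected."""
        if start == goal:
            return [start]
        prev = {start: None}          # node -> predecessor on BFS tree
        queue = deque([start])
        while queue:
            node = queue.popleft()
            for nxt in self.adj.get(node, ()):
                if nxt not in prev:
                    prev[nxt] = node
                    if nxt == goal:   # reconstruct path back to start
                        path = [goal]
                        while prev[path[-1]] is not None:
                            path.append(prev[path[-1]])
                        return path[::-1]
                    queue.append(nxt)
        return None

g = GraphService([(1, 2), (2, 3), (3, 4), (1, 4)])
print(g.shortest_path(2, 4))  # a shortest 3-node path, e.g. [2, 1, 4]
```

With roughly 1.6 million nodes and 30 million edges this structure should fit in the 500 MB to 1 GB range mentioned above, and each BFS touches only in-memory sets, so queries should be fast.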

Which technology would you use to store and query such a huge read-only graph?

© Stack Overflow or respective owner
