This blog post is a brief introduction to AGE Viewer, a tool for visualizing graphs created with Apache AGE. Specifically, I will show how to visualize the citation graph built in the previous post of the "How an RDBMS works" series.
To use AGE Viewer, you first need to download and install it. If you do not have it on your machine yet, I recommend following this step-by-step guide: Installing apache age-viewer. It is very simple. I am using Ubuntu 22.04 LTS with AGE Viewer 1.0.
Through the PostgreSQL CLI (psql), you can display all vertices of the graph with the query below, but the output it generates is not very easy to read:
SELECT * FROM cypher('citation_graph', $$
MATCH (v)
RETURN v
$$) as (v agtype);
Output:
v
-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------
{"id": 844424930131970, "label": "Article", "properties": {"id": "williams2013towards", "year": 2013, "title": "Towards affective algorithmic composition", "author": ["Duncan Williams", "Alexis Kirke", "Eduardo R Miranda", "Etienne Roesch", "Slawomir Nasuto"], "publisher": "University of Jyväskylä, Department of Music"}}::vertex
{"id": 844424930131971, "label": "Article", "properties": {"id": "schubert1999measurement", "year": 1999, "title": "Measurement and Time Series Analysis of Emotion in Music", "author": ["Emery Schubert"], "publisher": "University of New South Wales. Music & Music Education"}}::vertex
{"id": 844424930131973, "label": "Article", "properties": {"id": "scherer2004which", "year": 2004, "title": "Which Emotions Can be Induced by Music? What Are the Underlying Mechanisms? And How Can We Measure Them?", "author": ["Klaus R. Scherer"], "publisher": "Journal of New Music Research"}}::vertex
{"id": 844424930131974, "label": "Article", "properties": {"id": "gabrielsson2001emotion", "year": 2001, "title": "Emotion perceived and emotion felt: Same or different?", "author": ["Alf Gabrielsson"], "publisher": "Musicae Scientiae"}}::vertex
{"id": 844424930131975, "label": "Article", "properties": {"id": "scherer2001emotional", "year": 2001, "title": "Emotional effects of music: Production rules", "author": ["Klaus Scherer", "Marcel Zentner"], "publisher": "Oxford University Press"}}::vertex
{"id": 844424930131976, "label": "Article", "properties": {"id": "juslin2001communicating", "year": 2001, "title": "Communicating emotion in music performance: A review and a theoretical framework", "author": ["Patrik Juslin"], "publisher": "Oxford University Press"}}::vertex
{"id": 844424930131977, "label": "Article", "properties": {"id": "williams2015dynamic", "year": 2015, "title": "Dynamic Game Soundtrack Generation in Response to a Continuously Varying Emotional Trajectory", "author": ["Duncan Williams", "Alexis Kirke", "Joel Eaton", "Eduardo Miranda", "Ian Daly", "James Hallowell", "Etienne Roesch", "Faustina Hwang", "Slawomir Nasuto"], "publisher": "Audio Engineering Society Conference: 56th International Conference: Audio for Games"}}::vertex
{"id": 844424930131978, "label": "Article", "properties": {"id": "williams2015investigating", "year": 2015, "title": "Investigating Perceived Emotional Correlates of Rhythmic Density in Algorithmic Music Composition", "author": ["Duncan Williams", "Alexis Kirke", "Joel Eaton", "Eduardo Miranda", "Ian Daly", "James Hallowell", "James Weaver", "Asad Malik", "Etienne Roesch", "Faustina Hwang", "Slawomir Nasuto"], "publisher": "Association for Computing Machinery"}}::vertex
{"id": 844424930131979, "label": "Article", "properties": {"id": "daly2015towards", "year": 2015, "title": "Towards human-computer music interaction: Evaluation of an affectively-driven music generator via galvanic skin response measures", "author": ["Duncan Williams", "Alexis Kirke", "Eduardo Miranda", "Ian Daly", "Faustina Hwang", "Slawomir Nasuto", "Asad Malik", "James Weaver"], "publisher": "2015 7th Computer Science and Electronic Engineering Conference (CEEC)"}}::vertex
{"id": 844424930131980, "label": "Article", "properties": {"id": "kirke2013artificial", "year": 2013, "title": "Artificial affective listening towards a machine learning tool for sound-based emotion therapy and control", "author": ["Alexis Kirke", "Eduardo Miranda", "Slawomir Nasuto"], "publisher": "Proceedings of the Sound and Music Computing Conference"}}::vertex
{"id": 844424930131981, "label": "Article", "properties": {"id": "kirke2012learningto", "year": 2012, "title": "Learning to Make Feelings: Expressive Performance as a part of a machine learning tool for sound-based emotion therapy and control", "author": ["Alexis Kirke", "Eduardo Miranda", "Slawomir Nasuto"], "publisher": "the 9th Intl Symp on Computer Music Modeling and Retrieval"}}::vertex
{"id": 844424930131982, "label": "Article", "properties": {"id": "lopez2010real", "year": 2010, "title": "Real-Time Emotion-Driven Music Engine", "author": ["Alex Lopez", "Antonio Oliveira", "Amilcar Cardoso"], "publisher": "the 9th Intl Symp on Computer Music Modeling and Retrieval"}}::vertex
{"id": 844424930131983, "label": "Article", "properties": {"id": "oliveira2008affective", "year": 2008, "title": "Affective-driven music production: selection and transformation of music", "author": ["Antonio Oliveira", "Amilcar Cardoso"], "publisher": "ARTECH"}}::vertex
{"id": 844424930131984, "label": "Article", "properties": {"id": "oliveira2008modeling", "year": 2008, "title": "Modeling affective content of music: A knowledge base approach", "author": ["Antonio Oliveira", "Amilcar Cardoso"], "publisher": "Sound and Music Computing Conference"}}::vertex
{"id": 844424930131985, "label": "Article", "properties": {"id": "livingstone2007controlling", "year": 2007, "title": "Controlling musical emotionality: an affective computational architecture for influencing musical emotions", "author": ["Steven R. Livingstone", "Ralf Mühlberger", "Andrew Brown", "Andrew Loch"], "publisher": "Digital Creativity"}}::vertex
{"id": 844424930131986, "label": "Article", "properties": {"id": "livingstone2005dynamic", "year": 2005, "title": "Dynamic Response: Real-Time Adaptation for Music Emotion", "author": ["teven R. Livingstone", "Andrew Brown"], "publisher": "Creativity & Cognition Studios Press"}}::vertex
{"id": 844424930131987, "label": "Article", "properties": {"id": "oliveira2009automatic", "year": 2009, "title": "Automatic manipulation of music to express desired emotions", "author": ["Antonio Oliveira", "Amilcar Cardoso"], "publisher": "Proceedings of the 6th Sound and Music Computing Conference"}}::vertex
{"id": 844424930131969, "label": "Article", "properties": {"id": "williams2015affect", "year": 2015, "title": "Investigating affect in algorithmic composition systems", "author": ["Duncan Williams", "Alexis Kirke", "Eduardo R Miranda", "Etienne Roesch", "Ian Daly", "Slawomir Nasuto"], "publisher": "Psychology of Music"}}::vertex
{"id": 844424930131988, "label": "Article", "properties": {"id": "russell1980circumplex", "year": 1980, "title": "A circumplex model of affect", "author": ["James Russel"], "publisher": "American Psychological Association"}}::vertex
To visualize and analyze the graph we built, we need to configure the connection to the database. We use npm to start AGE Viewer at localhost:3000, and these commands must be executed inside the age-viewer directory:
cd age-viewer
npm run setup
npm run start
When the page opens, provide the host where the database is running (Connect URL), the database port, the database name, and the username and password for the database. Below is an example of the configuration (a quick way to double-check these values is shown right after the list). Then click on the "Connect" button.
Connect URL: localhost
Connect Port: 5432
Database Name: papersdb
User Name: postgres
Password: 12345
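Before clicking "Connect", you can verify that these parameters point to a database that actually contains the graph. Connecting with psql using the same host, port, database, and user, AGE's graph catalog should list citation_graph (this is just a sanity check, not a required step):
-- e.g. connect with: psql -h localhost -p 5432 -U postgres -d papersdb
-- then list the graphs known to Apache AGE; 'citation_graph' should appear
SELECT name FROM ag_catalog.ag_graph;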
At the top of the page, there is a field for writing queries. We will retrieve the vertices and edges so we can visualize the graph we built. After entering the query, click the "Run Query" play button next to the editing field.
SELECT * FROM cypher('citation_graph', $$
MATCH (a)
OPTIONAL MATCH (a)-[e]->(b)
RETURN a, e, b
$$) as (a agtype, e agtype, b agtype);
Next, the system returns the graph with all vertices and edges. We can see in Figure 1 that every paper has connections, indicating that each one is involved in at least one citation relationship with another paper in the dataset.
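If you prefer to cross-check these connections in text form, the same edges can be listed as pairs of titles with a query along these lines (the column aliases are arbitrary names chosen for this example):
SELECT * FROM cypher('citation_graph', $$
MATCH (a)-[e]->(b)
RETURN a.title, type(e), b.title
$$) as (citing agtype, relation agtype, cited agtype);
Each row shows a citing article, the edge label, and the cited article, which makes it easy to confirm that no paper is isolated.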
In addition to displaying the graph, we can explore other queries to perform different types of analysis. For instance, we can check for self-citations, filter articles by author name, and detect circular citations, among other things. However, for today's purposes, we will stop here and save the more complex analyses for a future post!
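Just to give a sense of what such a query might look like, here is a minimal sketch of the author filter mentioned above. It assumes the IN list operator behaves as in standard Cypher, and the author name is only an example taken from the data shown earlier:
SELECT * FROM cypher('citation_graph', $$
MATCH (a:Article)
WHERE 'Alexis Kirke' IN a.author
RETURN a.id, a.title
$$) as (id agtype, title agtype);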
Errata
My intention is to provide access to technology information through reliable sources. If you find any incorrect information, please leave your contribution in the comments below 😊.
Related Articles
- Installing apache age-viewer
- How an RDBMS works #4: Creating a citation graph with PostgreSQL + Apache AGE
- How an RDBMS works #1: Lessons from “The Internals of PostgreSQL”
Contribute to Apache AGE
Apache AGE website: https://age.apache.org/
Apache AGE Github: https://github.com/apache/age
Apache AGE Viewer GitHub: https://github.com/apache/age-viewer