danielegrattarola/spektral

Graph Neural Networks with Keras and Tensorflow 2.

Top Related Projects

  • Graph Neural Network Library for PyTorch
  • Python package built to ease deep learning on graph, on top of existing DL frameworks.
  • Benchmark datasets, data loaders, and evaluators for graph machine learning
  • Build Graph Nets in Tensorflow

Quick Overview

Spektral is a Python library for graph neural networks (GNNs) built on top of Keras and TensorFlow 2. It provides a flexible and efficient framework for developing and deploying GNN models, supporting various graph-based machine learning tasks such as node classification, graph classification, and link prediction.

Pros

  • Seamless integration with Keras and TensorFlow 2 ecosystem
  • Supports a wide range of GNN layers and models
  • Efficient implementation for handling large-scale graphs
  • Comprehensive documentation and examples

Cons

  • Steeper learning curve for users unfamiliar with graph neural networks
  • Limited support for dynamic graphs
  • Dependency on TensorFlow may be a drawback for PyTorch users
  • Some advanced GNN architectures may not be directly implemented

Code Examples

  1. Creating a simple GCN model for node classification:

from spektral.layers import GCNConv
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input

# Problem dimensions: N nodes, F features per node, n_classes output classes
# (these values match the Cora citation graph loaded in the next example)
N, F, n_classes = 2708, 1433, 7

# Define the model: node features and the adjacency matrix are separate inputs
x_in = Input(shape=(F,))
a_in = Input(shape=(N,))

gc1 = GCNConv(32, activation='relu')([x_in, a_in])
gc2 = GCNConv(16, activation='relu')([gc1, a_in])
output = GCNConv(n_classes, activation='softmax')([gc2, a_in])

model = Model(inputs=[x_in, a_in], outputs=output)

  2. Loading a dataset and preparing it for training:

from spektral.datasets import Citation

# Load the Cora citation dataset (it contains a single graph)
dataset = Citation('cora')
graph = dataset[0]

# Prepare the data: node features (N x F), adjacency (N x N), one-hot labels (N x n_classes)
x = graph.x
a = graph.a.toarray()  # the adjacency is stored as a sparse matrix; densify it for this minimal example
y = graph.y

  3. Training the model:

from tensorflow.keras.optimizers import Adam

# Compile the model
model.compile(optimizer=Adam(learning_rate=0.01), loss='categorical_crossentropy', metrics=['accuracy'])

# Train on the full graph in a single batch; shuffling would misalign nodes and the adjacency matrix
model.fit([x, a], y, epochs=200, batch_size=graph.n_nodes, shuffle=False)
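
For small graphs the arrays above can be fed to model.fit directly, but Spektral 1.x normally handles batching through its Loader classes. The following is a hedged sketch of that workflow, assuming SingleLoader, GCNFilter, and the subclassed-model pattern behave as in the library's documentation:

from spektral.data import SingleLoader
from spektral.datasets import Citation
from spektral.layers import GCNConv
from spektral.transforms import GCNFilter
from tensorflow.keras.models import Model

# Load Cora and normalize the adjacency matrix the way GCN expects
dataset = Citation('cora')
dataset.apply(GCNFilter())

class GCN(Model):
    def __init__(self, n_classes):
        super().__init__()
        self.conv1 = GCNConv(32, activation='relu')
        self.conv2 = GCNConv(n_classes, activation='softmax')

    def call(self, inputs):
        x, a = inputs
        return self.conv2([self.conv1([x, a]), a])

model = GCN(dataset.n_labels)
model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])

# SingleLoader yields the whole graph as a single batch in the format fit() expects
loader = SingleLoader(dataset)
model.fit(loader.load(), steps_per_epoch=loader.steps_per_epoch, epochs=200)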

Getting Started

To get started with Spektral, follow these steps:

  1. Install Spektral using pip:

    pip install spektral
    
  2. Import the necessary modules:

    import spektral
    from spektral.layers import GCNConv
    from spektral.datasets import Citation
    from tensorflow.keras.models import Model
    from tensorflow.keras.layers import Input
    
  3. Load a dataset and create a simple GNN model:

    # Load dataset (Cora contains a single graph)
    dataset = Citation('cora')
    graph = dataset[0]
    
    # Create model: a single GCN layer mapping node features to class probabilities
    x_in = Input(shape=(graph.n_node_features,))
    a_in = Input(shape=(graph.n_nodes,))
    output = GCNConv(dataset.n_labels, activation='softmax')([x_in, a_in])
    model = Model(inputs=[x_in, a_in], outputs=output)
    
  4. Compile and train the model:

    model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
    # Full-batch training on the single graph; densify the sparse adjacency for this minimal example
    model.fit([graph.x, graph.a.toarray()], graph.y, epochs=200, batch_size=graph.n_nodes, shuffle=False)
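    
    # Once trained, predictions for all nodes can be obtained with the standard Keras
    # API. A minimal sketch: full-batch inference, then np.argmax to pick the most
    # likely class for each node.
    import numpy as np
    probs = model.predict([graph.x, graph.a.toarray()], batch_size=graph.n_nodes)
    predicted_classes = np.argmax(probs, axis=-1)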
    

Competitor Comparisons

Graph Neural Network Library for PyTorch

Pros of PyTorch Geometric

  • Larger community and more frequent updates
  • Wider range of implemented graph neural network models
  • Better integration with PyTorch ecosystem

Cons of PyTorch Geometric

  • Steeper learning curve for beginners
  • Less focus on Keras-style simplicity
  • Potentially more complex API for basic tasks

Code Comparison

Spektral example:

from spektral.layers import GCNConv
from tensorflow.keras.models import Model

class GCN(Model):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(16, activation='relu')
        self.conv2 = GCNConv(10, activation='softmax')

    def call(self, inputs):
        # Spektral layers take [node features, adjacency] as input
        x, a = inputs
        return self.conv2([self.conv1([x, a]), a])

PyTorch Geometric example:

import torch
from torch_geometric.nn import GCNConv

class GCN(torch.nn.Module):
    def __init__(self, in_channels, out_channels):
        super().__init__()
        self.conv1 = GCNConv(in_channels, 16)
        self.conv2 = GCNConv(16, out_channels)

    def forward(self, x, edge_index):
        # PyG layers take node features and an edge index tensor
        x = self.conv1(x, edge_index).relu()
        return self.conv2(x, edge_index)

Both libraries offer similar functionality for creating graph neural networks, but PyTorch Geometric provides a more PyTorch-native approach, while Spektral follows a Keras-inspired API design.

Python package built to ease deep learning on graph, on top of existing DL frameworks.

Pros of DGL

  • Broader framework support (PyTorch, MXNet, TensorFlow)
  • More extensive documentation and tutorials
  • Larger community and more frequent updates

Cons of DGL

  • Steeper learning curve for beginners
  • More complex API compared to Spektral's simplicity
  • Potentially slower for smaller graph datasets

Code Comparison

Spektral example:

from spektral.layers import GCNConv
from tensorflow.keras.models import Model

class GCN(Model):
    def __init__(self):
        super().__init__()
        self.conv1 = GCNConv(16, activation='relu')
        self.conv2 = GCNConv(10, activation='softmax')

    def call(self, inputs):
        # Spektral layers take [node features, adjacency] as input
        x, a = inputs
        return self.conv2([self.conv1([x, a]), a])

DGL example:

import dgl
import torch.nn as nn
import dgl.nn as dglnn

class GCN(nn.Module):
    def __init__(self, in_feats):
        super().__init__()
        self.conv1 = dglnn.GraphConv(in_feats, 16, activation=nn.ReLU())
        self.conv2 = dglnn.GraphConv(16, 10)

    def forward(self, g, x):
        # DGL layers take the DGLGraph and node features
        return self.conv2(g, self.conv1(g, x))

Benchmark datasets, data loaders, and evaluators for graph machine learning

Pros of OGB

  • Focuses on benchmarking graph machine learning tasks with standardized datasets and evaluation protocols
  • Provides a wide range of graph datasets across various domains (e.g., social networks, molecular graphs)
  • Offers easy-to-use data loaders and evaluators for consistent experimentation

Cons of OGB

  • Limited to specific graph learning tasks and datasets, less flexible for custom applications
  • Requires additional libraries for model implementation and training
  • May have a steeper learning curve for users new to graph machine learning

Code Comparison

OGB:

from ogb.nodeproppred import NodePropPredDataset
dataset = NodePropPredDataset(name="ogbn-arxiv")
graph, label = dataset[0]

Spektral:

from spektral.datasets import Citation
dataset = Citation("cora")
graph = dataset[0]

Summary

OGB is tailored for benchmarking and standardized evaluation in graph machine learning, offering a wide range of datasets and evaluation protocols. Spektral, on the other hand, provides a more flexible framework for building graph neural networks with various layers and models. OGB is ideal for researchers focusing on comparative studies, while Spektral offers more versatility for custom graph-based applications.
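
The two libraries can also be combined: Spektral 1.0 ships a wrapper for OGB datasets (mentioned in the release notes below). A minimal sketch, assuming the wrapper is exposed as spektral.datasets.ogb.OGB as in Spektral's own examples:

from ogb.graphproppred import GraphPropPredDataset
from spektral.datasets.ogb import OGB

# Wrap an OGB graph-property-prediction dataset as a Spektral Dataset
ogb_dataset = GraphPropPredDataset(name="ogbg-molhiv")
dataset = OGB(ogb_dataset)

print(dataset[0])  # each element is a spektral.data.Graph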

Build Graph Nets in Tensorflow

Pros of graph_nets

  • Developed and maintained by Google DeepMind, ensuring high-quality and cutting-edge implementations
  • Offers a more comprehensive set of graph neural network models and operations
  • Provides better integration with TensorFlow and other Google AI tools

Cons of graph_nets

  • Less frequent updates and maintenance compared to Spektral
  • Steeper learning curve, especially for those not familiar with TensorFlow
  • Limited documentation and examples compared to Spektral's extensive tutorials

Code Comparison

Spektral:

from spektral.models import GCN

# Pre-built two-layer GCN; n_labels is the number of output classes (e.g., 7 for Cora)
model = GCN(n_labels=7, channels=16)

graph_nets:

import graph_nets as gn
import sonnet as snt

model = gn.modules.GraphNetwork(
    edge_model_fn=lambda: snt.nets.MLP([32, 32]),
    node_model_fn=lambda: snt.nets.MLP([32, 32]),
    global_model_fn=lambda: snt.nets.MLP([32, 32])
)

Both libraries offer powerful tools for working with graph neural networks, but they cater to different user needs. Spektral provides a more user-friendly experience with its Keras-like API and extensive documentation, making it ideal for beginners and rapid prototyping. On the other hand, graph_nets offers more advanced features and better integration with the Google AI ecosystem, making it suitable for complex research projects and production environments.

README

Welcome to Spektral

Spektral is a Python library for graph deep learning, based on the Keras API and TensorFlow 2. The main goal of this project is to provide a simple but flexible framework for creating graph neural networks (GNNs).

You can use Spektral for classifying the users of a social network, predicting molecular properties, generating new graphs with GANs, clustering nodes, predicting links, and any other task where data is described by graphs.

Spektral implements many of the most popular layers for graph deep learning, such as GCN, Chebyshev, GraphSAGE, ARMA, GAT, APPNP, and GIN convolutions, among many others (see the convolutional layers documentation for the full list).

You can also find pooling layers, including MinCut, DiffPool, Top-K, Self-Attention Graph (SAG), and global pooling.

Spektral also includes lots of utilities for representing, manipulating, and transforming graphs in your graph deep learning projects.
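
As a small example of representing a graph with Spektral's data containers, here is a hedged sketch of wrapping a toy graph in spektral.data.Graph (the x/a/y keyword names follow the Spektral 1.x data API):

import numpy as np
from scipy.sparse import csr_matrix
from spektral.data import Graph

# A toy graph: 3 nodes, 2 features per node, a sparse adjacency matrix, and a graph-level label
x = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
a = csr_matrix(np.array([[0, 1, 0], [1, 0, 1], [0, 1, 0]]))
y = np.array([1.0])

graph = Graph(x=x, a=a, y=y)
print(graph)  # prints a summary of the graph's sizes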

See how to get started with Spektral and have a look at the examples for some templates.

The source code of the project is available on GitHub.
Read the documentation here.

If you want to cite Spektral in your work, refer to our paper:

Graph Neural Networks in TensorFlow and Keras with Spektral
Daniele Grattarola and Cesare Alippi

Installation

Spektral is compatible with Python 3.6 and above, and is tested on the latest versions of Ubuntu, macOS, and Windows. Other Linux distros should work as well.

The simplest way to install Spektral is from PyPI:

pip install spektral

To install Spektral from source, run this in a terminal:

git clone https://github.com/danielegrattarola/spektral.git
cd spektral
python setup.py install  # Or 'pip install .'

To install Spektral on Google Colab:

! pip install spektral

New in Spektral 1.0

The 1.0 release of Spektral is an important milestone for the library and brings many new features and improvements.

If you have already used Spektral in your projects, the only major change that you need to be aware of is the new datasets API.

This is a summary of the new features and changes:

  • The new Graph and Dataset containers standardize how Spektral handles data. This does not impact your models, but makes it easier to use your data in Spektral.
  • The new Loader class hides away all the complexity of creating graph batches. Whether you want to write a custom training loop or use Keras' famous model-dot-fit approach, you only need to worry about the training logic and not the data.
  • The new transforms module implements a wide variety of common operations on graphs, that you can now apply() to your datasets.
  • The new GeneralConv and GeneralGNN classes let you build models that are, well... general. Using state-of-the-art results from recent literature means that you don't need to worry about which layers or architecture to choose. The defaults will work well everywhere (see the sketch after this list).
  • New datasets: QM7 and ModelNet10/40, and a new wrapper for OGB datasets.
  • Major clean-up of the library's structure and dependencies.
  • New examples and tutorials.
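
As a small illustration of the transforms and GeneralGNN additions, here is a hedged sketch; NormalizeAdj, the apply() method, and the GeneralGNN constructor are assumed to behave as described in the Spektral 1.x documentation:

from spektral.datasets import Citation
from spektral.models import GeneralGNN
from spektral.transforms import NormalizeAdj

# Apply a transform to every graph in the dataset, in place
dataset = Citation('cora')
dataset.apply(NormalizeAdj())

# A general-purpose GNN with literature-backed defaults; only the output size is required
model = GeneralGNN(dataset.n_labels, activation='softmax')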

Contributing

Spektral is an open-source project available on GitHub, and contributions of all types are welcome. Feel free to open a pull request if you have something interesting that you want to add to the framework.

The contribution guidelines are available here and a list of feature requests is available here.