Apache MXNet: Ultra-scalable Deep Learning Framework

1.9.1 · abandoned · verified Mon Apr 13

Apache MXNet was an ultra-scalable deep learning framework known for its flexibility, efficiency, and multi-language support. It allowed users to mix and match imperative and symbolic programming for deep learning tasks. The project's last stable release was 1.9.1 in May 2022, and it was moved to the Apache Attic in September 2023, signifying that it is no longer actively developed or maintained.

Warnings

MXNet was moved to the Apache Attic in September 2023 and receives no further bug fixes, security patches, or releases. Its pre-built wheels target older Python versions, so installation may fail on current interpreters; treat this framework as legacy-only.

Install
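
Because the project is retired, the packages below are frozen at the last stable release, 1.9.1. A minimal install sketch (pinning the version is an assumption for reproducibility; wheels may not exist for recent Python versions):

```shell
# CPU-only build (last stable release)
pip install mxnet==1.9.1

# GPU build: pick the variant matching your CUDA toolkit,
# e.g. mxnet-cu112 for CUDA 11.2
pip install mxnet-cu112==1.9.1
```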

Imports

Quickstart

This quickstart demonstrates defining a simple Multi-Layer Perceptron (MLP) using MXNet's Gluon API, initializing its parameters, and performing a forward pass with dummy data. It also includes a basic NDArray operation. Ensure you have the correct CPU or GPU package installed for optimal performance, and select the appropriate context (CPU/GPU).

import mxnet as mx
from mxnet import gluon, nd
from mxnet.gluon import nn

# Define a simple neural network
class MLP(nn.Block):
    def __init__(self, **kwargs):
        super(MLP, self).__init__(**kwargs)
        self.dense0 = nn.Dense(128, activation='relu')
        self.dense1 = nn.Dense(64, activation='relu')
        self.dense2 = nn.Dense(10)

    def forward(self, x):
        x = self.dense0(x)
        x = self.dense1(x)
        x = self.dense2(x)
        return x

# Create an instance of the network
net = MLP()

# Initialize parameters
ctx = mx.cpu(0) # Or mx.gpu(0) if GPU is available and MXNet-GPU is installed
net.initialize(mx.init.Xavier(), ctx=ctx)

# Create a dummy input (e.g., for a batch of 1 with 784 features)
dummy_input = nd.random.uniform(shape=(1, 784), ctx=ctx)

# Perform a forward pass
output = net(dummy_input)
print(f"Network output shape: {output.shape}")

# Simple tensor operation
a = nd.ones((2, 3), ctx=ctx)
b = a * 2
print(f"Simple NDArray operation result: {b.asnumpy()}")
