
deep-learning

Install

1. Install the plugin:

$ npx claudepluginhub aznatkoiny/zai-skills --plugin AI-Toolkit

Want just this skill? Then install: npx claudepluginhub u/[userId]/[slug]

Description

Comprehensive guide for Deep Learning with Keras 3 (Multi-Backend: JAX, TensorFlow, PyTorch). Use when building neural networks, CNNs for computer vision, RNNs/Transformers for NLP, time series forecasting, or generative models (VAEs, GANs). Covers model building (Sequential/Functional/Subclassing APIs), custom training loops, data augmentation, transfer learning, and production best practices.

Tool Access

This skill uses the workspace's default tool permissions.

Supporting Assets
View in Repository
references/advanced_cv.md
references/basics.md
references/best_practices.md
references/computer_vision.md
references/generative_dl.md
references/keras3_changes.md
references/keras_working.md
references/nlp_transformers.md
references/timeseries.md
scripts/quick_train.py
scripts/visualize_filters.py
Skill Content

Deep Learning with Keras 3

Patterns and best practices based on Deep Learning with Python, 2nd Edition by François Chollet, updated for Keras 3 (Multi-Backend).

Core Workflow

  1. Prepare Data: Normalize, split train/val/test, create tf.data.Dataset
  2. Build Model: Sequential, Functional, or Subclassing API
  3. Compile: model.compile(optimizer, loss, metrics)
  4. Train: model.fit(data, epochs, validation_data, callbacks)
  5. Evaluate: model.evaluate(test_data)
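The five steps above can be sketched end-to-end; this is a minimal illustration using synthetic data (shapes, sizes, and epoch counts are arbitrary placeholders, not recommendations):

```python
import numpy as np
import keras
from keras import layers

# 1. Prepare data: synthetic features in [0, 1], integer labels, train/val split
x = np.random.rand(200, 64).astype("float32")
y = np.random.randint(0, 10, size=(200,))
x_train, x_val = x[:160], x[160:]
y_train, y_val = y[:160], y[160:]

# 2. Build model (Sequential API)
model = keras.Sequential([
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax"),
])

# 3. Compile: integer labels -> sparse_categorical_crossentropy
model.compile(optimizer="rmsprop",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# 4. Train
history = model.fit(x_train, y_train, epochs=2,
                    validation_data=(x_val, y_val), verbose=0)

# 5. Evaluate
loss, acc = model.evaluate(x_val, y_val, verbose=0)
```

With real data you would replace the random arrays with your normalized dataset (or a tf.data.Dataset) and train for more epochs with callbacks.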

Model Building APIs

Sequential - Simple stack of layers:

import keras
from keras import layers

model = keras.Sequential([
    layers.Dense(64, activation="relu"),
    layers.Dense(10, activation="softmax")
])

Functional - Multi-input/output, shared layers, non-linear topologies:

inputs = keras.Input(shape=(64,))
x = layers.Dense(64, activation="relu")(inputs)
outputs = layers.Dense(10, activation="softmax")(x)
model = keras.Model(inputs=inputs, outputs=outputs)

Subclassing - Full flexibility with call() method:

class MyModel(keras.Model):
    def __init__(self):
        super().__init__()
        self.dense1 = layers.Dense(64, activation="relu")
        self.dense2 = layers.Dense(10, activation="softmax")

    def call(self, inputs):
        x = self.dense1(inputs)
        return self.dense2(x)
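A subclassed model creates its weights lazily, on the first call, once input shapes are known. A quick sanity check (the batch size and feature dimension here are just illustrative):

```python
import numpy as np
import keras
from keras import layers

class MyModel(keras.Model):
    def __init__(self):
        super().__init__()
        self.dense1 = layers.Dense(64, activation="relu")
        self.dense2 = layers.Dense(10, activation="softmax")

    def call(self, inputs):
        x = self.dense1(inputs)
        return self.dense2(x)

model = MyModel()
# Weights do not exist yet; the first call builds them.
out = model(np.zeros((2, 64), dtype="float32"))
```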

Quick Reference: Loss & Optimizer Selection

| Task                  | Loss                            | Final Activation |
|-----------------------|---------------------------------|------------------|
| Binary classification | binary_crossentropy             | sigmoid          |
| Multiclass (one-hot)  | categorical_crossentropy        | softmax          |
| Multiclass (integers) | sparse_categorical_crossentropy | softmax          |
| Regression            | mse or mae                      | None             |

Optimizers: rmsprop (default), adam (popular), sgd (with momentum for fine-tuning)
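As an illustration of the table, a multiclass classifier compiled for integer labels versus one-hot labels differs only in the loss string (the helper function here is just for the example):

```python
import keras
from keras import layers

def make_classifier():
    # Multiclass output -> softmax final activation
    return keras.Sequential([
        layers.Dense(64, activation="relu"),
        layers.Dense(10, activation="softmax"),
    ])

# Integer labels (0..9): sparse_categorical_crossentropy
clf_int = make_classifier()
clf_int.compile(optimizer="adam",
                loss="sparse_categorical_crossentropy",
                metrics=["accuracy"])

# One-hot labels: categorical_crossentropy
clf_onehot = make_classifier()
clf_onehot.compile(optimizer="adam",
                   loss="categorical_crossentropy",
                   metrics=["accuracy"])
```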

Domain-Specific Guides

| Topic              | Reference           | When to Use                                              |
|--------------------|---------------------|----------------------------------------------------------|
| Keras 3 Migration  | keras3_changes.md   | START HERE: Multi-backend setup, keras.ops, import keras |
| Fundamentals       | basics.md           | Overfitting, regularization, data prep, K-fold validation |
| Keras Deep Dive    | keras_working.md    | Custom metrics, callbacks, training loops, tf.function   |
| Computer Vision    | computer_vision.md  | Convnets, data augmentation, transfer learning           |
| Advanced CV        | advanced_cv.md      | Segmentation, ResNets, Xception, Grad-CAM                |
| Time Series        | timeseries.md       | RNNs (LSTM/GRU), 1D convnets, forecasting                |
| NLP & Transformers | nlp_transformers.md | Text processing, embeddings, Transformer encoder/decoder |
| Generative DL      | generative_dl.md    | Text generation, VAEs, GANs, style transfer              |
| Best Practices     | best_practices.md   | KerasTuner, mixed precision, multi-GPU, TPU              |

Essential Callbacks

callbacks = [
    keras.callbacks.EarlyStopping(monitor="val_loss", patience=3),
    keras.callbacks.ModelCheckpoint("best.keras", save_best_only=True),
    keras.callbacks.TensorBoard(log_dir="./logs")
]
model.fit(..., callbacks=callbacks)
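With save_best_only=True, ModelCheckpoint writes only the epoch with the lowest monitored val_loss, and the saved model can be restored afterwards with keras.models.load_model. A self-contained sketch (the tiny model and synthetic data are placeholders):

```python
import numpy as np
import keras
from keras import layers

x = np.random.rand(64, 8).astype("float32")
y = np.random.randint(0, 2, size=(64,))

model = keras.Sequential([
    layers.Dense(4, activation="relu"),
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy")

callbacks = [
    keras.callbacks.EarlyStopping(monitor="val_loss", patience=3),
    # Keeps only the checkpoint with the best (lowest) val_loss
    keras.callbacks.ModelCheckpoint("best.keras", save_best_only=True),
]
model.fit(x, y, epochs=2, validation_split=0.25,
          callbacks=callbacks, verbose=0)

# Restore the best model from the .keras checkpoint
best = keras.models.load_model("best.keras")
```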

Utility Scripts

| Script               | Description                                                              |
|----------------------|--------------------------------------------------------------------------|
| quick_train.py       | Reusable training template with standard callbacks and history plotting |
| visualize_filters.py | Visualize convnet filter patterns via gradient ascent                   |
Stats

Stars: 0 · Forks: 0 · Last commit: Feb 13, 2026
