Top Related Projects
A set of examples around pytorch in Vision, Text, Reinforcement Learning, etc.
12 weeks, 26 lessons, 52 quizzes, classic Machine Learning for all
Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
scikit-learn: machine learning in Python
Quick Overview
The tensorflow/examples repository on GitHub contains a collection of example code and tutorials for the TensorFlow machine learning library. It serves as a resource for developers and researchers to learn and explore various applications of TensorFlow, from basic usage to more advanced techniques.
Pros
- Comprehensive Examples: The repository covers a wide range of TensorFlow use cases, from image classification to natural language processing, providing a diverse set of examples for users to learn from.
- Active Maintenance: The repository is actively maintained by the TensorFlow team, ensuring that the examples stay up-to-date with the latest TensorFlow releases and best practices.
- Community Contributions: The repository welcomes contributions from the TensorFlow community, allowing for the expansion and improvement of the available examples.
- Detailed Documentation: Each example comes with detailed documentation, including explanations of the code, usage instructions, and references to relevant TensorFlow concepts.
Cons
- Potential Complexity: Some of the more advanced examples may require a significant understanding of TensorFlow and machine learning concepts, which could be a barrier for beginners.
- Dependency on TensorFlow: The examples are inherently tied to the TensorFlow library, which means that users must have TensorFlow installed and configured correctly to run the examples.
- Potential Outdated Content: While the repository is actively maintained, there is a possibility that some examples may become outdated as TensorFlow evolves, requiring users to adapt the code to the latest version.
- Limited Scope: The repository focuses solely on TensorFlow examples, and does not provide examples for other machine learning frameworks or libraries.
Code Examples
Here are a few short code examples from the tensorflow/examples repository:
- Image Classification with TensorFlow Lite:
import numpy as np
import tensorflow as tf
# Load the TFLite model
interpreter = tf.lite.Interpreter(model_path="path/to/model.tflite")
interpreter.allocate_tensors()
# Get input and output tensors
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
# Prepare the input data
input_data = np.array([...]) # Your input data
# Run the inference
interpreter.set_tensor(input_details[0]['index'], input_data)
interpreter.invoke()
output_data = interpreter.get_tensor(output_details[0]['index'])
This code demonstrates how to load a TensorFlow Lite model, prepare the input data, and run inference using the model.
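As a follow-up, and assuming the model is an image classifier whose output is a vector of class scores (the original snippet does not say), the result can be reduced to a predicted class index:
# For a classifier, the highest-scoring output index is the predicted class
predicted_class = np.argmax(output_data[0])
print("Predicted class index:", predicted_class)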
- Text Generation with TensorFlow.js:
import * as tf from '@tensorflow/tfjs';
async function generateText(model, seed, length) {
  let text = seed;
  let input = tf.tensor([text.charCodeAt(text.length - 1)]);
  for (let i = 0; i < length; i++) {
    const output = model.predict(input);
    const nextIndex = tf.argMax(output, 1).dataSync()[0];
    // Append the predicted character and feed it back in as the next input
    text += String.fromCharCode(nextIndex);
    input = tf.tensor([nextIndex]);
  }
  return text;
}
// Load the pre-trained model
const model = await tf.loadLayersModel('path/to/model.json');
const generatedText = await generateText(model, 'The ', 100);
console.log(generatedText);
This code demonstrates how to use a pre-trained TensorFlow.js model to generate text based on a given seed.
- Object Detection with TensorFlow Lite:
import tensorflow as tf
import cv2
import numpy as np
# Load the TFLite model
interpreter = tf.lite.Interpreter(model_path="path/to/model.tflite")
interpreter.allocate_tensors()
# Get input and output tensors
input_details = interpreter.get_input_details()
output_details = interpreter.get_output_details()
# Prepare the input image
image = cv2.imread("path/to/image.jpg")
input_data = cv2.resize(image, (300, 300))
input_data = np.expand_dims(input_data, axis=0)
# Run the inference
# Depending on the model, the input may first need to be cast to float32 and normalized
interpreter.set_tensor(input_details[0]['index'], input_data)
interpreter.invoke()
This code demonstrates how to preprocess an image with OpenCV and run inference with a TensorFlow Lite object detection model.
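Reading the detections back out depends on the model's output signature. A minimal sketch, assuming the common SSD-style TensorFlow Lite layout of bounding boxes, class indices, and confidence scores (the exact output ordering varies between models, so check output_details for yours):
# Assumed SSD-style output ordering: boxes, classes, scores (verify against output_details)
boxes = interpreter.get_tensor(output_details[0]['index'])[0]
classes = interpreter.get_tensor(output_details[1]['index'])[0]
scores = interpreter.get_tensor(output_details[2]['index'])[0]
# Keep only detections above a confidence threshold
for box, cls, score in zip(boxes, classes, scores):
    if score > 0.5:
        print(f"Class {int(cls)} with confidence {score:.2f} at box {box}")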
Competitor Comparisons
pytorch/examples: A set of examples around pytorch in Vision, Text, Reinforcement Learning, etc.
Pros of PyTorch/examples
- Diverse Examples: PyTorch/examples provides a wider range of example projects, covering areas like computer vision, natural language processing, and reinforcement learning.
- Detailed Documentation: The examples in PyTorch/examples often include detailed README files, providing clear instructions and explanations for each project.
- Active Maintenance: The PyTorch/examples repository appears to be more actively maintained, with recent updates and contributions from the community.
Cons of PyTorch/examples
- Organization: The repository's layout can be less intuitive than that of tensorflow/examples, making it slightly harder to navigate.
- Fewer Tutorials: While PyTorch/examples has a broader range of examples, tensorflow/examples may provide more in-depth tutorial-style projects, especially for beginners.
- Dependency on PyTorch: As the name suggests, the examples in PyTorch/examples are specific to the PyTorch framework, limiting their usefulness for those working with TensorFlow.
Code Comparison
Here's a brief comparison of the code structure for a simple image classification example in both repositories:
tensorflow/examples/image_classification/simple_image_classification.py
import tensorflow as tf
from tensorflow.keras.datasets import cifar10
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Flatten
from tensorflow.keras.optimizers import Adam
(x_train, y_train), (x_test, y_test) = cifar10.load_data()
pytorch/examples/vision/classification/imagenet/main.py
import torch
import torch.nn as nn
import torch.optim as optim
from torchvision import datasets, transforms
from torch.utils.data import DataLoader
transform = transforms.Compose([
transforms.Resize(224),
transforms.ToTensor(),
transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))
])
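To show where the transform pipeline and the DataLoader import above come into play, here is a brief, illustrative continuation (the choice of CIFAR-10 as the dataset is an assumption, not part of the original snippet):
# Hypothetical continuation: build a dataset and loader using the transform above
train_dataset = datasets.CIFAR10(root='./data', train=True, download=True, transform=transform)
train_loader = DataLoader(train_dataset, batch_size=64, shuffle=True)
for images, labels in train_loader:
    # images has shape (64, 3, 224, 224) after Resize(224) and ToTensor()
    break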
ML-For-Beginners: 12 weeks, 26 lessons, 52 quizzes, classic Machine Learning for all
Pros of ML-For-Beginners
- Provides a comprehensive set of tutorials and examples for beginners to learn machine learning, covering a wide range of topics and techniques.
- Includes interactive Jupyter Notebooks that allow users to experiment with the code and concepts.
- Offers a structured learning path with clear learning objectives and step-by-step instructions.
Cons of ML-For-Beginners
- May not be as in-depth or advanced as the examples provided in tensorflow/examples, which are more focused on specific TensorFlow features and use cases.
- The curriculum is produced by Microsoft and occasionally points to Microsoft tooling and services, which may be less relevant to users working on other platforms or purely local environments.
- The examples may not be as frequently updated as the tensorflow/examples repository, which is actively maintained by the TensorFlow team.
Code Comparison
Here's a brief comparison of the code structure between the two repositories:
tensorflow/examples:
import tensorflow as tf
from tensorflow.keras.datasets import mnist
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Flatten
from tensorflow.keras.optimizers import Adam
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0
model = Sequential([
Flatten(input_shape=(28, 28)),
Dense(128, activation='relu'),
Dense(10, activation='softmax')
])
model.compile(optimizer=Adam(learning_rate=0.001),
loss='sparse_categorical_crossentropy',
metrics=['accuracy'])
model.fit(x_train, y_train, epochs=5, validation_data=(x_test, y_test))
ML-For-Beginners:
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
iris = load_iris()
X, y = iris.data, iris.target
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model = LogisticRegression()
model.fit(X_train, y_train)
y_pred = model.predict(X_test)
accuracy = accuracy_score(y_test, y_pred)
print(f'Accuracy: {accuracy:.2f}')
jax: Composable transformations of Python+NumPy programs: differentiate, vectorize, JIT to GPU/TPU, and more
Pros of jax
- More flexible and customizable for advanced machine learning research
- Better support for automatic differentiation and GPU/TPU acceleration
- Simpler API with functional programming paradigms
Cons of jax
- Steeper learning curve for beginners compared to TensorFlow
- Smaller ecosystem and fewer pre-built models/tools
- Less extensive documentation and community support
Code Comparison
jax example:
import jax.numpy as jnp
from jax import grad, jit
def predict(params, x):
return jnp.dot(x, params)
@jit
def loss(params, x, y):
return jnp.mean((predict(params, x) - y) ** 2)
grad_loss = jit(grad(loss))
TensorFlow example:
import tensorflow as tf
model = tf.keras.Sequential([
tf.keras.layers.Dense(1, input_shape=(1,))
])
model.compile(optimizer='adam', loss='mse')
model.fit(x_train, y_train, epochs=100)
The jax example showcases its functional approach and use of JIT compilation, while the TensorFlow example demonstrates its high-level Keras API for quick model creation and training.
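To make the jax snippet concrete, here is a minimal sketch of a gradient-descent loop built on the predict, loss, and grad_loss functions defined above; the toy data and learning rate are assumptions for illustration:
import jax.numpy as jnp
from jax import random
# Toy one-dimensional regression data (illustrative only)
key = random.PRNGKey(0)
x = random.normal(key, (100, 1))
y = 3.0 * x[:, 0]
# Plain gradient descent driven by the jit-compiled gradient
params = jnp.zeros(1)
learning_rate = 0.1
for _ in range(100):
    params = params - learning_rate * grad_loss(params, x, y)
print(params)  # approaches [3.0]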
🤗 Transformers: State-of-the-art Machine Learning for Pytorch, TensorFlow, and JAX.
Pros of huggingface/transformers
- Comprehensive collection of pre-trained Transformer models for various NLP tasks
- Extensive documentation and tutorials for easy integration and usage
- Active community with frequent updates and bug fixes
Cons of huggingface/transformers
- Larger codebase and dependencies compared to TensorFlow/Keras examples
- Potential learning curve for users not familiar with Transformers or the HuggingFace ecosystem
Code Comparison
tensorflow/examples
model = tf.keras.Sequential([
tf.keras.layers.Embedding(input_dim=vocab_size, output_dim=embedding_dim),
tf.keras.layers.LSTM(units=64),
tf.keras.layers.Dense(units=1, activation='sigmoid')
])
huggingface/transformers
from transformers import BertForSequenceClassification, BertTokenizer
model = BertForSequenceClassification.from_pretrained('bert-base-uncased')
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
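To show how the loaded tokenizer and model fit together, here is a minimal sketch of a forward pass; the input sentence is only illustrative, and note that the classification head of bert-base-uncased is untrained, so the prediction itself is meaningless:
import torch
inputs = tokenizer("TensorFlow examples are a great learning resource.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
# The classification head here is randomly initialized, so this only demonstrates the API
predicted_class = outputs.logits.argmax(dim=-1).item()
print(predicted_class)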
scikit-learn: machine learning in Python
Pros of scikit-learn/scikit-learn
- Comprehensive library of machine learning algorithms and tools
- Well-documented and easy to use API
- Large and active community with many contributed modules
Cons of scikit-learn/scikit-learn
- Limited support for deep learning compared to TensorFlow
- May not be as performant as specialized libraries for certain tasks
- Fewer examples and tutorials compared to TensorFlow/examples
Code Comparison
scikit-learn/scikit-learn
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier
iris = load_iris()
X, y = iris.data, iris.target
clf = DecisionTreeClassifier()
clf.fit(X, y)
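As a small illustrative follow-up, the fitted classifier can then make predictions with scikit-learn's usual predict call:
# Predict on the training features (for brevity; a held-out split is preferable)
predictions = clf.predict(X)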
tensorflow/examples
import tensorflow as tf
from tensorflow.keras.datasets import mnist
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Dense, Flatten
(x_train, y_train), (x_test, y_test) = mnist.load_data()
x_train, x_test = x_train / 255.0, x_test / 255.0
model = Sequential([
Flatten(input_shape=(28, 28)),
Dense(128, activation='relu'),
Dense(10, activation='softmax')
])
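For parity with scikit-learn's fit/predict workflow, the Keras model above would still need to be compiled and trained before making predictions; a minimal sketch:
model.compile(optimizer='adam',
              loss='sparse_categorical_crossentropy',
              metrics=['accuracy'])
model.fit(x_train, y_train, epochs=5)
predictions = model.predict(x_test)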
README
TensorFlow Examples
Most important links!
- Community examples
- Course materials for the Deep Learning class on Udacity
If you are looking to learn TensorFlow, don't miss the core TensorFlow documentation which is largely runnable code. Those notebooks can be opened in Colab from tensorflow.org.
What is this repo?
This is the TensorFlow example repo. It has several classes of material:
- Showcase examples and documentation for our fantastic TensorFlow Community
- Provide examples mentioned on TensorFlow.org
- Publish material supporting official TensorFlow courses
- Publish supporting material for the TensorFlow Blog and TensorFlow YouTube Channel
We welcome community contributions; see CONTRIBUTING.md and, for style help, the Writing TensorFlow documentation guide.
To file an issue, use the tracker in the tensorflow/tensorflow repo.
License
Apache License 2.0