DEV Community

Dolly Sharma

Part 03: TensorFlow

πŸ“Š TensorFlow Computational Graph

πŸ”Ή What is a Computational Graph?

A computational graph is a directed graph used to represent mathematical computations.

  • Nodes (vertices) β†’ Operations (like addition, multiplication, activation)
  • Edges β†’ Tensors (data flowing between operations)

πŸ‘‰ Simple idea:
Graph = Operations + Data flow


πŸ”Ή Components of the Graph

1. Nodes (Operations / Ops)

  • Represent computations
  • Examples:

    • Matrix multiplication (matmul)
    • Addition
    • Activation functions (ReLU, Sigmoid)

πŸ‘‰ They take tensors as input and produce tensors as output


2. Edges (Tensors)

  • Represent data flowing between nodes
  • Carry:

    • Inputs
    • Intermediate results
    • Outputs

πŸ‘‰ Think: Edges = Data pipeline


πŸ”Ή How to Build a Computational Graph

Step 1: Define Operations

  • Decide what computations you need (e.g., multiplication, loss calculation)

Step 2: Create Tensors

  • Inputs
  • Model parameters
  • Intermediate values

Step 3: Connect Operations

  • Link outputs of one operation to inputs of another

πŸ‘‰ This creates a graph structure
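The three steps above can be sketched in code. This is a minimal sketch using the TF 1.x-compatible graph API; the names x, w, and b are illustrative:

```python
import tensorflow as tf

# Build the graph (the blueprint); nothing executes yet.
g = tf.Graph()
with g.as_default():
    # Step 2: create tensors (inputs and parameters)
    x = tf.constant(3.0)
    w = tf.constant(2.0)
    b = tf.constant(1.0)
    # Steps 1 & 3: define operations and connect them: y = w*x + b
    y = tf.add(tf.multiply(w, x), b)

# Execute the graph in a session (TF 1.x style)
with tf.compat.v1.Session(graph=g) as sess:
    result = sess.run(y)  # 7.0
```

Notice that defining y performs no arithmetic; the computation only happens inside sess.run().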


πŸ”Ή Example (Conceptual)

import tensorflow as tf

a = tf.constant([[1, 2], [3, 4]])
b = tf.constant([[5, 6], [7, 8]])

c = tf.matmul(a, b)

πŸ‘‰ Here:

  • a, b β†’ tensors (edges/data)
  • matmul β†’ node (operation)

πŸ”Ή Execution of Graph (Sessions)

In TensorFlow 1.x style:

  • Graph is built first
  • Then executed using a session

with tf.compat.v1.Session() as sess:
    result = sess.run(c)
    print(result)

πŸ‘‰ Important:

  • Graph = Blueprint 🧠
  • Session = Execution πŸš€

πŸ”Ή Visualization (TensorBoard)

TensorFlow provides a tool called TensorBoard.

Use:

  • Visualize graph structure
  • Understand data flow
  • Debug models

Example:

writer = tf.compat.v1.summary.FileWriter("logs/", graph=tf.compat.v1.get_default_graph())
writer.close()

Then run TensorBoard to view it.


πŸ”Ή Benefits of Computational Graph

βœ… 1. Optimization

  • TensorFlow optimizes execution
  • Improves speed and memory usage

βœ… 2. Portability

  • Graph can be saved and reused

βœ… 3. Debugging

  • Visualization helps track errors

βœ… 4. Parallelism

  • Independent operations can run simultaneously

πŸ”Ή Important Note (Modern TensorFlow)

In TensorFlow 2.x:

  • No need for explicit sessions
  • Uses eager execution (default)

πŸ‘‰ Code runs immediately, like normal Python

But:

  • Computational graph still exists internally (for optimization)
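This can be seen directly: a function wrapped in @tf.function runs eagerly from the user's point of view, but its traced graph is still there to inspect. A small sketch (double is an illustrative name):

```python
import tensorflow as tf

@tf.function
def double(x):
    return x * 2

# Runs immediately, like normal Python
out = double(tf.constant(4))

# The internal graph built by tracing can be inspected
concrete = double.get_concrete_function(tf.constant(4))
op_types = [op.type for op in concrete.graph.get_operations()]  # includes "Mul"
```

So TensorFlow 2.x feels eager, but still builds graphs behind the scenes for optimization.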

🎯 One-Line Summary (Exam Ready)

A TensorFlow computational graph is a directed graph where nodes represent operations and edges represent tensors, enabling efficient execution, optimization, and visualization of machine learning models.


⚑ What is Eager Execution (TensorFlow 2.x)?

πŸ‘‰ Eager Execution = β€œRun code immediately”

  • As soon as you write a line, it executes instantly
  • Works like normal Python / NumPy

βœ… Example:

import tensorflow as tf

x = tf.constant([1, 2, 3])
print(x)

πŸ‘‰ Output comes immediately


🧠 Simple Understanding

Eager = β€œNo waiting, no graph building, just direct result”


🧩 What is Graph Execution (TensorFlow 1.x)?

πŸ‘‰ Graph Execution = β€œBuild first, run later”

  • First: create a computational graph (blueprint)
  • Then: execute using a session

❗ Key Idea:

Code does NOT run immediately


🧠 Simple Understanding

Graph = β€œPlan everything first, then execute”


βš”οΈ Eager vs Graph Execution (Easy Comparison)

Feature      | Eager Execution ⚑    | Graph Execution πŸ“Š
Execution    | Immediate             | Delayed
Debugging    | Easy                  | Hard
Style        | Like Python           | Like building a model graph
Flexibility  | High                  | Less
Performance  | Slower (sometimes)    | Faster (optimized)
Default      | TF 2.x                | TF 1.x

πŸ”₯ Important Concept: @tf.function

TensorFlow 2.x gives both worlds πŸ‘‡

πŸ‘‰ By default: Eager mode
πŸ‘‰ With @tf.function: Graph mode


Example:

import tensorflow as tf

@tf.function
def matmul(a, b):
    return tf.matmul(a, b)

πŸ‘‰ What happens:

  • Function is converted into a computational graph
  • Runs faster (optimized)
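Reproducing the decorated function with a small usage sketch: the first call traces the Python function into a graph; later calls with the same input shapes reuse that graph.

```python
import tensorflow as tf

@tf.function
def matmul(a, b):
    return tf.matmul(a, b)

a = tf.constant([[1, 2], [3, 4]])
b = tf.constant([[5, 6], [7, 8]])

# First call traces the function into a graph, then runs it
out = matmul(a, b)  # [[19, 22], [43, 50]]
```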

🧠 Simple Understanding

@tf.function = β€œConvert Python code β†’ Graph for speed”


πŸš€ Why Use Graph Mode?

Even though eager is easy, graph mode is powerful:

βœ… Benefits:

  • Faster execution (optimization)
  • Better for large models
  • Can use XLA (Accelerated Linear Algebra) optimization
  • Easy to save & deploy models
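As a sketch of the XLA point: tf.function accepts a jit_compile argument that requests XLA compilation of the traced graph. This assumes your TensorFlow build and device support XLA (most standard builds do on CPU):

```python
import tensorflow as tf

# jit_compile=True asks TensorFlow to compile the traced graph with XLA
@tf.function(jit_compile=True)
def fused(x):
    # Element-wise ops and the reduction can be fused into one compiled kernel
    return tf.reduce_sum(x * x + x)

result = fused(tf.constant([1.0, 2.0, 3.0]))  # 2 + 6 + 12 = 20.0
```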

🎯 Final Exam Answer (Short)

Eager execution in TensorFlow 2.x executes operations immediately like normal Python, making debugging easy. In contrast, graph execution builds a computational graph first and executes it later in a session. TensorFlow 2.x uses eager execution by default but allows graph execution using @tf.function for better performance and optimization.


πŸ‘€ How to Visualize a TensorFlow Graph

The main tool is πŸ‘‰ TensorBoard


πŸ”Ή Method (TensorFlow 2.x)

You need to:

  1. Convert your function into a graph using @tf.function
  2. Log it
  3. Open TensorBoard

βœ… Step-by-Step

1. Create a Graph Function

import tensorflow as tf

@tf.function
def my_func(a, b):
    return tf.matmul(a, b)

2. Enable Logging

log_dir = "logs/graph"

writer = tf.summary.create_file_writer(log_dir)

tf.summary.trace_on(graph=True, profiler=False)

a = tf.constant([[1, 2]])
b = tf.constant([[3], [4]])

my_func(a, b)

with writer.as_default():
    tf.summary.trace_export(
        name="my_graph",
        step=0
    )

3. Run TensorBoard

Open terminal and run:

tensorboard --logdir=logs/graph

Then open browser:
πŸ‘‰ http://localhost:6006


🧠 What You’ll See

  • Nodes = operations (MatMul, etc.)
  • Edges = tensors (data flow)
  • Full computational graph visualization

πŸ”₯ Simple Understanding

TensorBoard = β€œa map of the graph” πŸ—ΊοΈ
It shows how data flows inside your model.


⚠️ Important Notes

  • Without @tf.function, the graph won’t appear properly
  • Eager execution does not create a visible static graph
  • The graph is created only when tracing happens

🎯 One-Line Answer (Exam)

TensorFlow graphs can be visualized using TensorBoard by tracing a @tf.function and exporting it to log files, which are then displayed as a computational graph.


βš™οΈ How to Manage Graphs and Sessions in TensorFlow

πŸ”Ή 1. Managing Graphs

βœ… a) Use Default Graph (simple cases)

  • Just define operations normally
  • TensorFlow automatically handles the graph

πŸ‘‰ Use when:

  • Small programs
  • No need for separation

βœ… b) Create Separate Graphs (important)

g = tf.Graph()

with g.as_default():
    a = tf.constant(2)
    b = tf.constant(3)
    c = tf.add(a, b)

πŸ‘‰ Why manage like this?

  • Avoid mixing operations
  • Run multiple models independently

βœ… c) Switch Between Graphs

  • Use with graph.as_default()
  • Ensures operations go to the correct graph

πŸ‘‰ Think:

β€œControl where your computation is stored”
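A short sketch of this idea: two separate graphs, each run in its own session, never share operations (g1 and g2 are illustrative names):

```python
import tensorflow as tf

# Two independent graphs
g1 = tf.Graph()
with g1.as_default():
    c1 = tf.constant(10)

g2 = tf.Graph()
with g2.as_default():
    c2 = tf.constant(20)

# Each tensor belongs to exactly one graph
assert c1.graph is g1 and c2.graph is g2

# Run each graph in its own session
with tf.compat.v1.Session(graph=g1) as sess:
    r1 = sess.run(c1)
with tf.compat.v1.Session(graph=g2) as sess:
    r2 = sess.run(c2)
```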


πŸ”Ή 2. Managing Sessions

βœ… a) Use Context Manager (BEST PRACTICE)

with tf.compat.v1.Session(graph=g) as sess:
    result = sess.run(c)

πŸ‘‰ Automatically:

  • Opens session
  • Closes session
  • Prevents memory leaks

βœ… b) Manually Control Session

sess = tf.compat.v1.Session(graph=g)
result = sess.run(c)
sess.close()

πŸ‘‰ Use when:

  • You need long-running sessions

βœ… c) Configure Session

config = tf.compat.v1.ConfigProto()
config.gpu_options.allow_growth = True

sess = tf.compat.v1.Session(config=config)

πŸ‘‰ Helps manage:

  • GPU memory
  • CPU threads
  • Device usage
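A fuller configuration sketch covering all three points. The thread counts are illustrative values, and allow_growth only has an effect when a GPU is present:

```python
import tensorflow as tf

config = tf.compat.v1.ConfigProto()
config.gpu_options.allow_growth = True       # claim GPU memory on demand
config.intra_op_parallelism_threads = 2      # threads inside a single op
config.inter_op_parallelism_threads = 2      # threads across independent ops
config.log_device_placement = False          # set True to log CPU/GPU placement

g = tf.Graph()
with g.as_default():
    c = tf.add(tf.constant(2), tf.constant(3))

with tf.compat.v1.Session(graph=g, config=config) as sess:
    result = sess.run(c)  # 5
```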

πŸ”Ή 3. Resource Management

βœ… Always close sessions

  • Frees memory
  • Avoids crashes

βœ… Use with statement

  • Cleaner
  • Safer

πŸ”Ή 4. Key Strategy (Very Important)

πŸ‘‰ Good management means:

  • Keep graphs separate and organized
  • Run them in controlled sessions
  • Release resources properly

🎯 Final Short Answer (Exam Ready)

Managing graphs and sessions in TensorFlow involves controlling where operations are added using graphs and executing them using sessions. Graphs can be managed by creating separate graphs and using as_default() to organize computations. Sessions are managed by executing graphs using run(), configuring resources, and properly closing them using context managers to ensure efficient memory usage.


πŸ”’ How to Close a Session in TensorFlow

πŸ”Ή 1. Manually Close the Session

import tensorflow as tf

sess = tf.compat.v1.Session()

# do some work
sess.close()

πŸ‘‰ This releases memory and resources


πŸ”Ή 2. Best Practice (Recommended) βœ…

Use a context manager (with statement):

import tensorflow as tf

with tf.compat.v1.Session() as sess:
    result = sess.run(...)

πŸ‘‰ Automatically:

  • Opens session
  • Closes session after block ends
  • No need to call close() manually

🧠 Simple Understanding

sess.close() = β€œFinish work and free memory”
with Session() = β€œAuto close, safer method”


⚠️ Important Notes

  • Always close sessions in TensorFlow 1.x
  • Not closing can cause:

    • Memory leaks
    • GPU memory issues

πŸ”₯ TensorFlow 2.x Reality

πŸ‘‰ No need to close sessions
Because:

  • Uses Eager Execution
  • Sessions are mostly removed
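For comparison, the matrix multiplication from earlier needs no graph construction and no session in TF 2.x. A minimal eager sketch:

```python
import tensorflow as tf

# Runs immediately -- no explicit graph, no session
a = tf.constant([[1, 2], [3, 4]])
b = tf.constant([[5, 6], [7, 8]])
c = tf.matmul(a, b)

result = c.numpy()  # [[19, 22], [43, 50]]
```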

🎯 Exam One-Liner

A TensorFlow session can be closed using sess.close() or automatically using a context manager (with tf.Session()), which ensures proper resource management.


πŸ“Š Managing Graphs and Sessions in TensorFlow

Managing graphs and sessions is essential in TensorFlow for organizing computations and executing them efficiently.


πŸ”Ή 1. Managing Graphs

βœ… What is a Graph?

A TensorFlow graph (computational graph) represents:

  • Nodes β†’ operations
  • Edges β†’ tensors (data flow)

πŸ‘‰ Simple idea:
Graph = Plan of computation


βœ… Default Graph

  • TensorFlow automatically creates a default graph
  • All operations are added to it by default

a = tf.constant(2)
b = tf.constant(3)
c = tf.add(a, b)

βœ… Multiple Graphs (Management)

You can create separate graphs for better control:

g = tf.Graph()

with g.as_default():
    a = tf.constant(2)
    b = tf.constant(3)
    c = tf.add(a, b)

πŸ‘‰ Benefits:

  • Avoid mixing operations
  • Handle multiple models
  • Better organization

πŸ”Ή 2. Managing Sessions

βœ… What is a Session?

A session executes the graph.

πŸ‘‰
Graph = Blueprint 🧠
Session = Execution πŸš€


βœ… Running a Session

with tf.compat.v1.Session(graph=g) as sess:
    result = sess.run(c)
    print(result)

πŸ‘‰ run() executes the graph and gives output


πŸ”Ή 3. Closing Sessions & Resource Management

βœ… Closing Session

sess.close()

πŸ‘‰ Meaning:

  • You are manually ending the session
  • Frees:

    • RAM
    • GPU memory

🧠 Why important?

If you don’t close:

  • Memory keeps getting used ❌
  • Program may slow down or crash ❌

βœ… Better Way (Auto-close)

with tf.compat.v1.Session() as sess:
    result = sess.run(c)

πŸ‘‰ Automatically closes β†’ safer βœ”


πŸ”Ή 4. Configuring Sessions

This part is about controlling how TensorFlow runs


βœ… Step 1: Create Config

config = tf.compat.v1.ConfigProto()

πŸ‘‰ This creates a settings object


βœ… Step 2: Set Options

config.log_device_placement = True

πŸ‘‰ Shows:

  • Whether operations run on CPU or GPU

config.gpu_options.allow_growth = True

πŸ‘‰ Very important:

  • GPU memory grows only when needed
  • Prevents full memory usage at start
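A related option, sketched here as an alternative: instead of growing memory on demand, you can hard-cap the fraction of GPU memory the process may claim. (Assumption: a GPU is present; on CPU-only machines the setting is simply ignored.)

```python
import tensorflow as tf

config = tf.compat.v1.ConfigProto()
# Cap TensorFlow at 50% of the GPU's memory for this process
config.gpu_options.per_process_gpu_memory_fraction = 0.5
```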

βœ… Step 3: Use Config in Session

with tf.compat.v1.Session(config=config) as sess:
    result = sess.run(c)
    print("Result:", result)

πŸ‘‰ Now session runs with your custom settings


🧠 Simple Summary

  • sess.close() β†’ manually free memory
  • with Session() β†’ auto close (best)
  • ConfigProto β†’ control CPU/GPU & performance

🎯 One-Line Exam Answer

Sessions can be managed by closing them using sess.close() to release resources, and configured using tf.compat.v1.ConfigProto() to control execution settings like device placement and GPU memory usage.


πŸ”₯ Key Summary

  • Graphs β†’ organize computations
  • Sessions β†’ execute computations
  • sess.close() β†’ free resources
  • with Session() β†’ automatic management
  • ConfigProto β†’ control performance & hardware

⚠️ Important (Exam Point)

πŸ‘‰ In TensorFlow 2.x:

  • Sessions are mostly removed
  • Eager execution is default
  • Graphs are handled internally

🎯 Final Exam Answer

In TensorFlow, graphs are used to represent computations as a series of operations and data flow, while sessions are used to execute these graphs. By default, operations are added to a default graph, but multiple graphs can be created using tf.Graph() for better management. Sessions execute graphs using run() and should be properly closed using sess.close() or managed using context managers to release resources. Additionally, sessions can be configured using tf.compat.v1.ConfigProto() to control device placement and memory usage, ensuring efficient execution.


This is a comprehensive guide to TensorFlow Computational Graphs, Execution Modes, and Resource Management, formatted for maximum readability and exam preparation.


🧠 TensorFlow: Computational Graphs & Session Management

πŸ“Š 1. The Computational Graph

A computational graph is a directed graph used to represent mathematical computations. It serves as the "blueprint" of a model.

  • Nodes (Vertices): Represent Operations (e.g., addition, matrix multiplication, activation functions).
  • Edges: Represent Tensors (the data flowing between operations).

Concept: Graph = Operations + Data Flow

Key Components:

  1. Nodes (Ops): Take tensors as input and produce tensors as output.
  2. Edges (Tensors): Act as the data pipeline carrying inputs and intermediate results.

⚑ 2. Eager vs. Graph Execution

TensorFlow offers two primary ways to execute code:

Feature      | Eager Execution (TF 2.x) ⚑      | Graph Execution (TF 1.x) πŸ“Š
Execution    | Immediate (runs line-by-line)    | Delayed (build first, run later)
Debugging    | Easy (standard Python tools)     | Hard (requires session tracing)
Performance  | Slower (overhead)                | Faster (highly optimized)
Default      | Yes (in modern TensorFlow)       | No (manual setup required)

The Best of Both Worlds: @tf.function

In TensorFlow 2.x, you can use the @tf.function decorator to convert a Python function into a high-performance static graph.

@tf.function
def fast_multiply(a, b):
    return tf.matmul(a, b)

πŸ—οΈ 3. Managing Graphs & Sessions

In older versions (or compatibility mode), managing resources is crucial to prevent memory leaks.

Managing Graphs

  • Default Graph: Created automatically by TF.
  • Custom Graphs: Used to run multiple independent models.

    g = tf.Graph()
    with g.as_default():
        # Ops defined here belong to graph 'g'
        a = tf.constant(5)
    

Managing Sessions

A Session is the environment where the graph is actually executed.

  1. Manual Close: Use sess.close() to free RAM and GPU memory.
  2. Context Manager (Best Practice): Use the with statement to ensure the session closes automatically.

    with tf.compat.v1.Session() as sess:
        result = sess.run(my_op)
    # Session is automatically closed here
    

βš™οΈ 4. Session Configuration (ConfigProto)

You can control how TensorFlow interacts with your hardware using ConfigProto.

  • Device Placement: log_device_placement = True shows if ops are on CPU or GPU.
  • GPU Memory Growth: allow_growth = True prevents TF from hogging all available VRAM immediately.

    config = tf.compat.v1.ConfigProto()
    config.gpu_options.allow_growth = True
    sess = tf.compat.v1.Session(config=config)
    

πŸ” 5. Visualization with TensorBoard

TensorBoard is the "Map" of your model. To visualize a graph:

  1. Use @tf.function to define the logic.
  2. Use tf.summary.trace_on to record the graph.
  3. Export the trace to a log directory.
  4. Launch via terminal: tensorboard --logdir=logs/.

🎯 Exam-Ready Summaries

Q: What is a Computational Graph?

A directed graph where nodes represent operations and edges represent tensors, enabling efficient execution and optimization of ML models.

Q: Why use a Context Manager (with statement) for sessions?

It ensures that resources (RAM/GPU) are automatically released by closing the session as soon as the code block finishes, preventing memory leaks.

Q: How does TF 2.x handle graphs differently than TF 1.x?

TF 2.x uses Eager Execution by default for a better user experience, whereas TF 1.x required building a static graph and running it inside a Session.
