Explore essential concepts of computational graphs, sessions, and operations in TensorFlow. This quiz assesses fundamental understanding for beginners interested in AI and machine learning architectures using TensorFlow.
Which statement best describes a computational graph in TensorFlow?
Explanation: A computational graph illustrates how operations are performed and how data flows between them, using nodes for operations and edges for data. It is not a table or a database; tables do not capture operations, and databases are used for storage, not computation. Visualization dashboards display results but do not represent computational flow directly.
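The idea above can be sketched in code. This is a minimal illustration, assuming a TensorFlow 2.x install (where the graph API is unchanged from 1.x for this purpose); the names `a`, `b`, and `add` are arbitrary labels chosen for the example:

```python
import tensorflow as tf

# Build a small graph explicitly; ops become nodes, tensors flow along edges.
graph = tf.Graph()
with graph.as_default():
    a = tf.constant(2, name="a")   # node: a constant op
    b = tf.constant(3, name="b")   # node: another constant op
    c = tf.add(a, b, name="add")   # node: an add op; edges carry a and b into it

# The graph records every operation as a node.
print([op.name for op in graph.get_operations()])  # ['a', 'b', 'add']
```

Listing `graph.get_operations()` makes the node structure concrete: each op you defined appears as one node, and nothing has been computed yet.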
What is the primary purpose of a session in TensorFlow (as used in earlier versions)?
Explanation: Sessions are used to execute and evaluate the operations and variables defined in a computational graph. Drawing visual graphs and creating constants are separate tasks not handled by sessions. Image conversion could be implemented as an operation, but the session itself performs no conversions; it simply executes whatever the graph defines.
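A small sketch of a session at work, written against the `tf.compat.v1` API (an assumption: in genuine TensorFlow 1.x the same calls live directly under `tf.`). The session both initializes variable state and runs the ops:

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()  # graph mode, as in TF 1.x

counter = tf.Variable(0, name="counter")
increment = tf.compat.v1.assign_add(counter, 1)

with tf.compat.v1.Session() as sess:
    # The session holds variable state and runs the graph's operations.
    sess.run(tf.compat.v1.global_variables_initializer())
    sess.run(increment)
    current = sess.run(counter)  # 1
```

Note that `counter` never changes until the session actually runs the `increment` op; defining the op only adds a node to the graph.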
In TensorFlow, what does an operation (op) represent within a computational graph?
Explanation: Operations, or ops, are the nodes of the graph: each performs a computation and consumes or produces data in the form of tensors. A static dataset or variable is simply data, not an operation. Storing a single float is far too narrow a description of what ops do. Assigned values belong to variables, not to operations.
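You can inspect an op as a node directly. A brief sketch, again assuming the `tf.compat.v1` compatibility API:

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

g = tf.Graph()
with g.as_default():
    x = tf.constant([1.0, 2.0])
    y = tf.square(x)  # a 'Square' op node: one tensor in, one tensor out

square_op = y.op  # every tensor remembers the op that produced it
print(square_op.type)                                  # 'Square'
print(len(square_op.inputs), len(square_op.outputs))   # 1 1
```

The `inputs` and `outputs` of the op are exactly the graph edges: tensors flowing in and out of the node.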
Which code line best defines a constant tensor holding the value 7?
Explanation: The tf.constant function is used for creating tensors with fixed values in TensorFlow. tf.Variable is for mutable tensors, not constants. tf.placeholder is for defining inputs that are fed at runtime, not fixed values. tf.compute does not exist as a standard function.
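The three creation styles side by side, as a sketch using the `tf.compat.v1` API (an assumption; plain TF 1.x exposes `tf.placeholder` and `tf.Session` directly):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

c = tf.constant(7)                      # fixed-value tensor
v = tf.Variable(7)                      # mutable tensor, by contrast
p = tf.compat.v1.placeholder(tf.int32)  # fed at run time, by contrast

with tf.compat.v1.Session() as sess:
    value = sess.run(c)  # 7; constants need no initialization or feeding
```

Unlike the variable (which needs an initializer) and the placeholder (which needs a `feed_dict`), the constant can be evaluated immediately.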
What is the correct sequence to run an operation in TensorFlow 1.x?
Explanation: The correct workflow is to first build the computational graph, then create a session, and finally execute operations within that session. Creating a session before building the graph gives it nothing to run, and operations cannot execute without a session. Variables must be initialized and assigned within a session; skipping the session is not valid in TensorFlow 1.x.
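The three steps in order, sketched with the `tf.compat.v1` API (assumed here as a stand-in for genuine TF 1.x):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

# Step 1: build the computational graph
a = tf.constant(2)
b = tf.constant(5)
total = a + b            # adds an 'add' node; nothing runs yet

# Step 2: create a session
sess = tf.compat.v1.Session()

# Step 3: execute operations inside the session
result = sess.run(total)  # 7
sess.close()
```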
What is a tensor in the context of TensorFlow computations?
Explanation: A tensor is, by definition, a container for multi-dimensional data and is central to operations in the framework. Functions operate on tensors but are not tensors themselves. Graphs and log files are organizational elements, not data containers.
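The "multi-dimensional container" idea maps directly onto tensor rank and shape. A minimal sketch (this part of the API is identical in TF 1.x and 2.x):

```python
import tensorflow as tf

scalar = tf.constant(3)                  # rank 0: a single number
vector = tf.constant([1, 2, 3])          # rank 1, shape (3,)
matrix = tf.constant([[1, 2], [3, 4]])   # rank 2, shape (2, 2)

print(scalar.shape.as_list())  # []
print(vector.shape.as_list())  # [3]
print(matrix.shape.as_list())  # [2, 2]
```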
If you have a TensorFlow operation named 'add_op', how can you retrieve its result?
Explanation: Operations must be executed within a session to produce results in TensorFlow 1.x. Printing the op or assigning it to a variable outside a session merely gives you a reference to the operation, not its computed value. Declaring it as a constant creates a new object rather than evaluating the op.
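The contrast between the symbolic op and its evaluated result, sketched with the `tf.compat.v1` API (an assumption standing in for TF 1.x; `add_op` is the name from the question):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

add_op = tf.add(tf.constant(1), tf.constant(2))

print(add_op)  # prints a symbolic Tensor object, not the number 3

with tf.compat.v1.Session() as sess:
    result = sess.run(add_op)  # 3 -- the actual computed value
```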
Why does TensorFlow use computational graphs for managing computations?
Explanation: Graphs give TensorFlow a structured representation it can optimize and run on various devices and platforms. They are not designed for variable storage or hardware monitoring, and they do not remove the need for programming: you still write code to define the graph.
Why is it important to close a session after running operations in TensorFlow 1.x?
Explanation: Closing a session releases the resources allocated for its execution, helping to avoid memory issues. It does not delete the computational graph, nor does it reset variables or display their values; those require separate steps.
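Both ways of releasing a session's resources, sketched with the `tf.compat.v1` API (assumed here; plain `tf.Session` in genuine TF 1.x):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

x = tf.constant(10)

# Manual management: close() releases the session's resources.
sess = tf.compat.v1.Session()
value = sess.run(x)
sess.close()

# Idiomatic alternative: a `with` block closes the session
# automatically, even if an error occurs inside it.
with tf.compat.v1.Session() as sess:
    value_again = sess.run(x)
```

The `with` form is generally preferred precisely because it guarantees the close step the question asks about.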
In TensorFlow, what does an addition operation node compute if given two input tensors: [2, 4] and [3, 5]?
Explanation: The addition operation performs element-wise addition, so [2+3, 4+5] equals [5, 9]. Multiplication would yield [6, 20], which is not addition. Subtraction would result in [-1, -1]. [2, 4, 3, 5] is simply a concatenation of the inputs, not an element-wise operation.
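The arithmetic from the question, run as a sketch against the `tf.compat.v1` API (an assumption standing in for TF 1.x):

```python
import tensorflow as tf

tf.compat.v1.disable_eager_execution()

a = tf.constant([2, 4])
b = tf.constant([3, 5])

with tf.compat.v1.Session() as sess:
    added = sess.run(a + b)       # element-wise: [2+3, 4+5] -> [5, 9]
    multiplied = sess.run(a * b)  # [6, 20], shown for contrast
```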