Theano - Shared Variables
Often you need to create variables that are shared between different functions and between multiple calls to the same function. For example, while training a neural network you create a weights vector that assigns a weight to each feature under consideration. This vector is modified on every iteration of training, so it must remain accessible across the multiple calls to the same function. Theano provides shared variables for this purpose. Typically, Theano moves such shared variables to the GPU, provided one is available, which speeds up the computation.
Syntax
To create a shared variable, you use the following syntax −
import numpy
import theano

# the second argument, 'W', is the name given to the shared variable
W = theano.shared(numpy.asarray([0.1, 0.25, 0.15, 0.3]), 'W')
Example
Here, a NumPy array containing four floating point numbers is wrapped in a shared variable. To get and set the value of W, you would use the following code snippet −
import numpy
import theano

W = theano.shared(numpy.asarray([0.1, 0.25, 0.15, 0.3]), 'W')
print ("Original: ", W.get_value())
print ("Setting new values (0.5, 0.2, 0.4, 0.2)")
W.set_value([0.5, 0.2, 0.4, 0.2])
print ("After modifications:", W.get_value())
Output
Original:  [0.1  0.25 0.15 0.3 ]
Setting new values (0.5, 0.2, 0.4, 0.2)
After modifications: [0.5 0.2 0.4 0.2]
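The example above reads and writes the shared value from ordinary Python code. The real benefit of a shared variable is that a Theano function can update it automatically on every call, as happens with a weights vector during training. The following is a minimal sketch of that pattern; the cost expression, the learning_rate value and the name train_step are illustrative assumptions and not part of the original example −

import numpy
import theano
import theano.tensor as T

# shared weights vector, whose state persists between function calls
W = theano.shared(numpy.asarray([0.1, 0.25, 0.15, 0.3]), 'W')

x = T.dvector('x')            # input feature vector
cost = T.dot(x, W) ** 2       # a toy cost built from the shared weights
grad = T.grad(cost, W)        # gradient of the cost with respect to W

learning_rate = 0.01          # assumed value, for illustration only
train_step = theano.function(
   [x],
   cost,
   updates = [(W, W - learning_rate * grad)]   # W is modified on every call
)

print ("Cost:", train_step([1.0, 2.0, 3.0, 4.0]))
print ("Weights after one call:", W.get_value())

Each call to train_step evaluates the cost and, through the updates argument, overwrites W with its new value, so the shared variable carries the updated weights into the next call.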