- Use the Python memory_profiler module. It tracks memory usage line by line in your code.
- Use the cProfile module. It profiles the time taken to execute sections of code; it doesn't report memory itself, but it pairs well with a memory profiler to show where the heavy work happens.
- Use the psutil module. It measures the amount of memory used by Python processes.
- Use the objgraph module. It tracks and visualizes the Python objects that occupy memory.
Method 1: Using the Python memory_profiler Module

The Python memory_profiler module is a great way to track memory usage line by line in your code. It measures the memory consumption of individual lines and functions.
To use the module, install it with the pip command (pip install memory-profiler).
Recommended: How to Install a Module in Python
On PyCharm, you can simply go to File > Settings > + > "memory-profiler" > Install Package:

After installation, use the @profile decorator to measure the memory consumption of a function. When the decorated function is called, memory_profiler prints the memory usage for each line in the function.
Here’s an example:
from memory_profiler import profile

@profile
def my_function():
    a = [1] * (10 ** 6)        # allocate a list of one million ints
    b = [2] * (2 * 10 ** 7)    # allocate a list of twenty million ints
    del b                      # release the second list again
    return a

my_function()
Here’s the output on my machine:

Here’s a funny story: First, I couldn’t make this program run because it raised a strange error message:
...
    import code
  File "C:\Users\xcent\Desktop\Python\codeExamples\code.py", line 3, in <module>
    @memory_profiler.profile
AttributeError: partially initialized module 'memory_profiler' has no attribute 'profile' (most likely due to a circular import)
So, the package memory_profiler seems to import a module called code. By pure chance, I had saved the above script in a Python file called code.py, too! So, Python tried to import my own script in a circular way.
I’m sure I’m the only person in the world that stupid bug has ever happened to.
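By the way, if you prefer to sample memory without decorating your function, the same package also exposes a memory_usage() helper. Here's a minimal sketch assuming a standard memory-profiler installation; the exact numbers will differ on your machine:

from memory_profiler import memory_usage

def my_function():
    a = [1] * (10 ** 6)
    b = [2] * (2 * 10 ** 7)
    del b
    return a

# Run my_function and sample the process memory every 0.1 seconds.
# memory_usage returns a list of readings in MiB.
readings = memory_usage((my_function, (), {}), interval=0.1)
print(f"Peak memory while running: {max(readings):.1f} MiB")

This is handy when you only care about the peak rather than a line-by-line breakdown.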
Method 2: Using the cProfile Module

The cProfile module is another great way to measure the time taken to execute certain sections of code. It doesn't report memory directly, but it shows you which functions do the heavy lifting, which is useful alongside a memory profiler.
To use the cProfile module, simply import it from the Python standard library. No need to install it!
Recommended: cProfile – Speed Up Your Python Code
After that, you can use the cProfile.run() function to profile a function call. It prints a report with the number of calls and the time spent in each function.
Here’s a simple example:
import cProfile

def my_function():
    a = [1] * (10 ** 6)
    b = [2] * (2 * 10 ** 7)
    del b
    return a

# Profile the call and print a timing report to stdout.
cProfile.run('my_function()')
The output on my computer? Here it is:
4 function calls in 0.093 seconds
Ordered by: standard name
ncalls tottime percall cumtime percall filename:lineno(function)
1 0.002 0.002 0.092 0.092 <string>:1(<module>)
1 0.091 0.091 0.091 0.091 code.py:3(my_function)
1 0.001 0.001 0.093 0.093 {built-in method builtins.exec}
1 0.000 0.000 0.000 0.000 {method 'disable' of '_lsprof.Profiler' objects}
It looks prettier in my PyCharm IDE:

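If you want more control over the report than cProfile.run() gives you, you can drive the profiler yourself and format the statistics with the standard pstats module. A minimal sketch; the sorting key and the number of printed rows are arbitrary example choices:

import cProfile
import pstats

def my_function():
    a = [1] * (10 ** 6)
    b = [2] * (2 * 10 ** 7)
    del b
    return a

profiler = cProfile.Profile()
profiler.enable()
my_function()
profiler.disable()

# Sort by cumulative time and print only the five most expensive entries.
stats = pstats.Stats(profiler).sort_stats('cumulative')
stats.print_stats(5)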
Method 3: Using the psutil Module

The psutil module is a great way to measure the amount of memory used by Python processes.
To use it, first install it with the pip command, i.e., pip install psutil.
Recommended: Pip Commands — The Ultimate Guide
After installation, use psutil.Process() to get a handle on the current process; its memory_info() method returns the memory consumption of the process.
Here’s a simple example:
import psutil

process = psutil.Process()                                  # the current Python process
memory_usage = process.memory_info()[0] / float(2 ** 20)    # rss in bytes, converted to MiB
print(memory_usage)
Here’s the output of this code snippet in my PyCharm environment:
14.06640625
Note that you may be interested in other memory info. Here's the full output without pulling out the single value and normalizing it:
print(process.memory_info())
# pmem(rss=14761984, vms=7651328, num_page_faults=3768, peak_wset=14761984,
#      wset=14761984, peak_paged_pool=158624, paged_pool=158624,
#      peak_nonpaged_pool=14008, nonpaged_pool=14008, pagefile=7651328,
#      peak_pagefile=7651328, private=7651328)
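psutil can also report system-wide memory, not just the memory of your own process. Here's a minimal sketch using psutil.virtual_memory(); the printed values are whatever your machine reports at that moment:

import psutil

# System-wide memory statistics as a named tuple.
vm = psutil.virtual_memory()
print(f"Total:     {vm.total / 2**30:.2f} GiB")
print(f"Available: {vm.available / 2**30:.2f} GiB")
print(f"Used:      {vm.percent:.1f} %")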
Method 4: Using the objgraph Module

The objgraph module is another great way to track and visualize the memory consumption of Python objects.
To use the module, install it with pip install objgraph. After installation, call the objgraph.show_most_common_types() function to list the most common Python objects in memory. It prints each object type together with the number of live instances.
Here’s a minimal example:
import objgraph

# Print the most common object types and how many instances are alive.
objgraph.show_most_common_types()
Here’s the output in my PyCharm environment:
function 2866
dict 1580
tuple 1423
wrapper_descriptor 1120
weakref 867
builtin_function_or_method 796
method_descriptor 768
getset_descriptor 446
type 409
cell 378
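When hunting for memory leaks, it often helps to see which object types grew between two points in your program. objgraph ships a show_growth() helper for that. A minimal sketch; leaky_cache is a made-up workload just to produce some growth:

import objgraph

objgraph.show_growth()          # first call records a baseline of object counts

# Simulate a structure that keeps accumulating objects.
leaky_cache = []
for i in range(1000):
    leaky_cache.append({'id': i, 'payload': [0] * 100})

# Second call prints only the object types whose counts increased.
objgraph.show_growth()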
Conclusion
The memory_profiler, psutil, and objgraph modules are all great tools for measuring and visualizing memory consumption, and cProfile complements them by showing where your code spends its time. With these modules, you can track the memory usage of individual lines of code, functions, processes, and objects.
Easy, isn’t it?
If you want to keep improving and learning, check out our free email academy — we have cheat sheets and lots of practical coding projects!

While working as a researcher in distributed systems, Dr. Christian Mayer found his love for teaching computer science students.
To help students reach higher levels of Python success, he founded the programming education website Finxter.com that has taught exponential skills to millions of coders worldwide. He’s the author of the best-selling programming books Python One-Liners (NoStarch 2020), The Art of Clean Code (NoStarch 2022), and The Book of Dash (NoStarch 2022). Chris also coauthored the Coffee Break Python series of self-published books. He’s a computer science enthusiast, freelancer, and owner of one of the top 10 largest Python blogs worldwide.
His passions are writing, reading, and coding. But his greatest passion is to serve aspiring coders through Finxter and help them to boost their skills. You can join his free email academy here.